Simulated Data Set

The experts below are selected from a list of 360 experts worldwide, ranked by the ideXlab platform.

Ethan M Berke - One of the best experts on this subject based on the ideXlab platform.

  • Identifying optimal COVID-19 testing strategies for schools and businesses: Balancing testing frequency, individual test technology, and cost
    PLOS ONE, 2021
    Co-Authors: Gregory Lyng, Natalie E Sheils, Caleb J Kennedy, Daniel O Griffin, Ethan M Berke
    Abstract:

    BACKGROUND COVID-19 test sensitivity and specificity have been widely examined and discussed, yet optimal use of these tests will depend on the goals of testing, the population or setting, and the anticipated underlying disease prevalence. We model various combinations of key variables to identify and compare a range of effective and practical surveillance strategies for schools and businesses. METHODS We coupled a simulated data set incorporating actual community prevalence and test performance characteristics to a susceptible, infectious, removed (SIR) compartmental model, modeling the impact of base and tunable variables including test sensitivity, testing frequency, results lag, sample pooling, disease prevalence, externally acquired infections, symptom checking, and test cost on outcomes including case reduction and false positives. FINDINGS Increasing testing frequency was associated with a non-linear positive effect on cases averted over 100 days. While precise reductions in the cumulative number of infections depended on community disease prevalence, testing every 3 days versus every 14 days (even with a lower-sensitivity test) reduced the disease burden substantially. Pooling provided cost savings and made a high-frequency approach practical; one high-performing strategy, testing every 3 days, yielded per-person per-day costs as low as $1.32. INTERPRETATION A range of practically viable testing strategies emerged for schools and businesses. Key characteristics of these strategies include high-frequency testing with a moderate- or high-sensitivity test and minimal results delay. Sample pooling allowed for operational efficiency and cost savings with minimal loss of model performance.

  • Identifying optimal COVID-19 testing strategies for schools and businesses: Balancing testing frequency, individual test technology, and cost
    medRxiv, 2020
    Co-Authors: Gregory Lyng, Natalie E Sheils, Caleb J Kennedy, Daniel O Griffin, Ethan M Berke
    Abstract:

    Background: COVID-19 test sensitivity and specificity have been widely examined and discussed, yet optimal use of these tests will depend on the goals of testing, the population or setting, and the anticipated underlying disease prevalence. We model various combinations of key variables to identify and compare a range of effective and practical surveillance strategies for schools and businesses. Methods: We coupled a simulated data set incorporating actual community prevalence and test performance characteristics to a susceptible, infectious, removed (SIR) compartmental model, modeling the impact of base and tunable variables including test sensitivity, testing frequency, results lag, sample pooling, disease prevalence, externally acquired infections, and test cost on case reduction outcomes. Results: Increasing testing frequency was associated with a non-linear positive effect on cases averted over 100 days. While precise reductions in the cumulative number of infections depended on community disease prevalence, testing every 3 days versus every 14 days (even with a lower-sensitivity test) reduced the disease burden substantially. Pooling provided cost savings and made a high-frequency approach practical; one high-performing strategy, testing every 3 days, yielded per-person per-day costs as low as $1.32. Conclusions: A range of practically viable testing strategies emerged for schools and businesses. Key characteristics of these strategies include high-frequency testing with a moderate- or high-sensitivity test and minimal results delay. Sample pooling allowed for operational efficiency and cost savings with minimal loss of model performance.
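
The two abstracts above describe coupling a simulated data set to an SIR compartmental model with tunable testing parameters. As a rough illustration of how such a screening model can be set up, here is a minimal discrete-time SIR sketch in Python. It is not the authors' published model, and every parameter value (population size, transmission and recovery rates, test sensitivity, results lag) is an illustrative assumption.

# Minimal discrete-time SIR model with periodic screening (sketch only;
# not the authors' model -- all parameter values are assumptions).

def simulate(days=100, n=1000, i0=5, beta=0.25, gamma=0.1,
             test_every=3, sensitivity=0.85, results_lag_days=1):
    """Return cumulative infections over `days` under periodic testing.

    A fraction `sensitivity` of the infectious pool tests positive on
    each test day and is isolated (moved to R) after `results_lag_days`.
    """
    s, i, r = float(n - i0), float(i0), 0.0
    cumulative = float(i0)
    pending = {}  # results day -> number of positives awaiting isolation

    for day in range(days):
        if day % test_every == 0:
            pending[day + results_lag_days] = i * sensitivity

        detected = min(pending.pop(day, 0.0), i)  # isolate on results day
        i -= detected
        r += detected

        new_infections = beta * s * i / n  # discrete Euler SIR step
        recoveries = gamma * i
        s -= new_infections
        i += new_infections - recoveries
        r += recoveries
        cumulative += new_infections

    return cumulative

if __name__ == "__main__":
    for freq in (3, 7, 14):
        print(f"testing every {freq:2d} days: "
              f"{simulate(test_every=freq):7.1f} cumulative infections")

Running the sketch with 3-, 7-, and 14-day testing intervals illustrates the qualitative pattern the abstracts report: more frequent testing averts disproportionately more cumulative infections, even when test sensitivity is held fixed.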

Gregory Lyng - One of the best experts on this subject based on the ideXlab platform.

  • Identifying optimal COVID-19 testing strategies for schools and businesses: Balancing testing frequency, individual test technology, and cost
    PLOS ONE, 2021
    Co-Authors: Gregory Lyng, Natalie E Sheils, Caleb J Kennedy, Daniel O Griffin, Ethan M Berke
    Abstract: identical to the PLOS ONE entry under Ethan M Berke above.

  • Identifying optimal COVID-19 testing strategies for schools and businesses: Balancing testing frequency, individual test technology, and cost
    medRxiv, 2020
    Co-Authors: Gregory Lyng, Natalie E Sheils, Caleb J Kennedy, Daniel O Griffin, Ethan M Berke
    Abstract: identical to the medRxiv entry under Ethan M Berke above.

F V N Rangarajan - One of the best experts on this subject based on the ideXlab platform.

  • ASMOOTH: a simple and efficient algorithm for adaptive kernel smoothing of two-dimensional imaging data
    Monthly Notices of the Royal Astronomical Society, 2006
    Co-Authors: H Ebeling, D A White, F V N Rangarajan
    Abstract:

    An efficient algorithm for adaptive kernel smoothing (AKS) of two-dimensional imaging data has been developed and implemented using the Interactive Data Language (IDL). The functional form of the kernel can be varied (top-hat, Gaussian, etc.) to allow different weighting of the event counts registered within the smoothing region. For each individual pixel, the algorithm increases the smoothing scale until the signal-to-noise ratio (S/N) within the kernel reaches a pre-set value. Thus, noise is suppressed very efficiently, while at the same time real structure, that is, signal that is locally significant at the selected S/N level, is preserved on all scales. In particular, extended features in noise-dominated regions are visually enhanced. The asmooth algorithm differs from other AKS routines in that it allows a quantitative assessment of the goodness of the local signal estimation by producing adaptively smoothed images in which all pixel values share the same S/N above the background. We apply asmooth to both real observational data (an X-ray image of clusters of galaxies obtained with the Chandra X-ray Observatory) and to a simulated data set. We find the asmoothed images to be fair representations of the input data in the sense that the residuals are consistent with pure noise, that is, they possess Poissonian variance and a near-Gaussian distribution around a mean of zero, and are spatially uncorrelated.

  • ASMOOTH: a simple and efficient algorithm for adaptive kernel smoothing of two-dimensional imaging data
    arXiv: Astrophysics, 2006
    Co-Authors: H Ebeling, D A White, F V N Rangarajan
    Abstract:

    An efficient algorithm for adaptive kernel smoothing (AKS) of two-dimensional imaging data has been developed and implemented using the Interactive Data Language (IDL). The functional form of the kernel can be varied (top-hat, Gaussian, etc.) to allow different weighting of the event counts registered within the smoothing region. For each individual pixel, the algorithm increases the smoothing scale until the signal-to-noise ratio (s.n.r.) within the kernel reaches a preset value. Thus, noise is suppressed very efficiently, while at the same time real structure, i.e. signal that is locally significant at the selected s.n.r. level, is preserved on all scales. In particular, extended features in noise-dominated regions are visually enhanced. The ASMOOTH algorithm differs from other AKS routines in that it allows a quantitative assessment of the goodness of the local signal estimation by producing adaptively smoothed images in which all pixel values share the same signal-to-noise ratio above the background. We apply ASMOOTH to both real observational data (an X-ray image of clusters of galaxies obtained with the Chandra X-ray Observatory) and to a simulated data set. We find the ASMOOTHed images to be fair representations of the input data in the sense that the residuals are consistent with pure noise, i.e. they possess Poissonian variance and a near-Gaussian distribution around a mean of zero, and are spatially uncorrelated.
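
Both versions of the abstract describe the same core loop: grow the smoothing kernel around each pixel until the signal-to-noise ratio inside it reaches a preset threshold. A brute-force Python/NumPy sketch of that idea follows; it is not the published asmooth implementation (which was written in IDL), and the circular top-hat kernel, Poisson S/N estimate, and absence of background subtraction are simplifying assumptions.

# Brute-force adaptive kernel smoothing in the spirit of asmooth
# (sketch only; the published algorithm differs in kernel options,
# S/N definition, and background treatment).
import numpy as np

def adaptive_smooth(image, snr_min=3.0, max_radius=20):
    """Smooth `image` so each output pixel reaches roughly `snr_min`.

    For every pixel, grow a circular top-hat kernel until the Poisson
    S/N of the enclosed counts (sqrt of the summed counts) reaches
    `snr_min`, then assign the mean count within that kernel.
    """
    ny, nx = image.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    out = np.zeros((ny, nx), dtype=float)
    for y in range(ny):
        for x in range(nx):
            for radius in range(1, max_radius + 1):
                mask = (yy - y) ** 2 + (xx - x) ** 2 <= radius ** 2
                if np.sqrt(image[mask].sum()) >= snr_min:
                    break  # smallest scale that reaches the S/N floor
            out[y, x] = image[mask].mean()
    return out

# Example: a sparse Poisson image of a faint extended source.
rng = np.random.default_rng(0)
yy0, xx0 = np.mgrid[0:32, 0:32]
truth = 0.2 + 2.0 * np.exp(-((yy0 - 16) ** 2 + (xx0 - 16) ** 2) / 30.0)
smoothed = adaptive_smooth(rng.poisson(truth).astype(float))

Because each output pixel is computed at the smallest scale that reaches the S/N floor, compact bright features keep small kernels while faint extended emission is averaged over large ones, which is the property the abstracts highlight.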

Daniel O Griffin - One of the best experts on this subject based on the ideXlab platform.

  • Identifying optimal COVID-19 testing strategies for schools and businesses: Balancing testing frequency, individual test technology, and cost
    PLOS ONE, 2021
    Co-Authors: Gregory Lyng, Natalie E Sheils, Caleb J Kennedy, Daniel O Griffin, Ethan M Berke
    Abstract: identical to the PLOS ONE entry under Ethan M Berke above.

  • Identifying optimal COVID-19 testing strategies for schools and businesses: Balancing testing frequency, individual test technology, and cost
    medRxiv, 2020
    Co-Authors: Gregory Lyng, Natalie E Sheils, Caleb J Kennedy, Daniel O Griffin, Ethan M Berke
    Abstract: identical to the medRxiv entry under Ethan M Berke above.

Caleb J Kennedy - One of the best experts on this subject based on the ideXlab platform.

  • Identifying optimal COVID-19 testing strategies for schools and businesses: Balancing testing frequency, individual test technology, and cost
    PLOS ONE, 2021
    Co-Authors: Gregory Lyng, Natalie E Sheils, Caleb J Kennedy, Daniel O Griffin, Ethan M Berke
    Abstract: identical to the PLOS ONE entry under Ethan M Berke above.

  • Identifying optimal COVID-19 testing strategies for schools and businesses: Balancing testing frequency, individual test technology, and cost
    medRxiv, 2020
    Co-Authors: Gregory Lyng, Natalie E Sheils, Caleb J Kennedy, Daniel O Griffin, Ethan M Berke
    Abstract: identical to the medRxiv entry under Ethan M Berke above.