Measurement Precision

The experts below are selected from a list of 142,725 experts worldwide, ranked by the ideXlab platform.

Jacob Dunningham - One of the best experts on this subject based on the ideXlab platform.

  • Using quantum theory to improve measurement precision
    Contemporary Physics, 2006
    Co-Authors: Jacob Dunningham
    Abstract:

    Progress in science is inextricably linked with how well we can observe the world around us. It is by making increasingly better measurements that scientific theories are tested and refined. In this article we address the question of what the ultimate limit to measurement precision is and how it could be achieved in the laboratory. We focus on how, by making use of quantum theory, it is possible to make better measurements than anything that can be achieved with classical techniques. This opens the door to an array of new technologies and could help answer some of science's most engaging questions.
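
    For context (this is standard quantum-metrology background, not a result stated in the article): the two precision scalings usually contrasted in this literature are the classical shot-noise limit and the entanglement-enhanced Heisenberg limit.

    ```latex
    % Phase estimation with N probes: standard quantum limit (uncorrelated
    % probes) versus Heisenberg limit (entangled probes).
    \[
      \Delta\phi_{\mathrm{SQL}} \sim \frac{1}{\sqrt{N}},
      \qquad
      \Delta\phi_{\mathrm{HL}} \sim \frac{1}{N}.
    \]
    % Both follow from the quantum Cramer-Rao bound
    % \( \Delta\phi \ge 1/\sqrt{\nu F_Q} \), with \(\nu\) repetitions and
    % quantum Fisher information \(F_Q\).
    ```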

Tiegen Liu - One of the best experts on this subject based on the ideXlab platform.

  • Impact of intensity integration time distribution on the measurement precision of Mueller polarimetry
    Journal of Quantitative Spectroscopy and Radiative Transfer, 2019
    Co-Authors: Tiegen Liu
    Abstract:

    Mueller polarimetry is an important technology for characterizing the physical properties of materials such as thin films and scattering particles, so high measurement precision is essential for many applications. In this paper, a modified model that accounts for the integration time of each intensity measurement is proposed to quantify the impact of the intensity integration time distribution on the measurement precision of Mueller polarimetry. In the presence of additive Gaussian noise, the optimal integration time distribution, which is determined by the instrument matrix and minimizes the noise propagation (and thus the estimation variance), is derived for both complete Mueller matrix polarimetry and some common partial (or incomplete) Mueller matrix polarimetries. Furthermore, we find that the estimation variance can be further reduced by optimizing the integration time distribution for some given instrument matrices. However, we also show, for the first time to our knowledge, that the existing minimal estimation variance, and thus the upper bound on measurement precision, cannot be surpassed by optimizing the intensity integration time distribution.
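
    A rough numerical illustration of the idea, not the paper's model: assume a linear measurement model I = W m + n with instrument matrix W, where averaging channel k over integration time t_k scales its noise variance by 1/t_k. The least-squares covariance is then sigma^2 (W^T diag(t) W)^-1, and the time distribution can be optimized numerically under a fixed total-time budget. The instrument matrix and all parameters below are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Illustrative (invertible) instrument matrix; rows are analyzer states.
    W = 0.5 * np.array([[1.0,  1.0, 0.0, 0.0],
                        [1.0, -1.0, 0.0, 0.0],
                        [1.0,  0.0, 1.0, 0.0],
                        [1.0,  0.0, 0.0, 1.0]])

    def total_estimation_variance(t, W, sigma2=1.0):
        """Trace of the least-squares covariance sigma2 * (W^T diag(t) W)^-1.

        Averaging channel k over time t_k reduces its noise variance by 1/t_k,
        so channels with longer integration times weigh more in the estimate.
        """
        cov = sigma2 * np.linalg.inv(W.T @ np.diag(t) @ W)
        return float(np.trace(cov))

    n = W.shape[0]
    budget = 1.0                              # fixed total integration time
    uniform = np.full(n, budget / n)
    print("uniform times:  ", total_estimation_variance(uniform, W))

    # Optimize the integration-time distribution under the same time budget.
    res = minimize(total_estimation_variance, uniform, args=(W,),
                   bounds=[(1e-6, budget)] * n,
                   constraints={"type": "eq", "fun": lambda t: t.sum() - budget})
    print("optimized times:", total_estimation_variance(res.x, W))
    ```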

Guan Xiaoyu - One of the best experts on this subject based on the ideXlab platform.

  • Influence of tidal discharge measurement time distribution on tidal discharge measurement precision in tidal river reaches
    Water Resources Protection, 2013
    Co-Authors: Guan Xiaoyu
    Abstract:

    Based on the hydrological properties of the Sutong Bridge reach of the lower Yangtze River, three types of measurement time distribution were used to analyze the influence of measurement time distribution on tidal discharge measurement precision for flood and ebb tides. Flood and ebb tidal discharges calculated from continuously observed velocity data were taken as the true values and compared with the discharges calculated from the different measurement time distributions. The results show that tidal discharge measurement precision improves as the number of measurement times increases. The commonly used scheme, in which discharge is measured once an hour during ebb tide, is reasonable, whereas measuring discharge every half hour during flood tide may introduce systematic error: the measured flood tidal volume is smaller, and the total tidal volume larger, than the true values, which degrades the tidal discharge measurement precision and the control of water resource quantities. This systematic error therefore needs to be corrected.
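
    A toy version of the paper's comparison method, using a synthetic discharge curve rather than the Sutong Bridge data: integrate a densely sampled curve as the "true" flood and ebb tidal volumes, then recompute them from hourly and half-hourly samples and inspect the errors.

    ```python
    import numpy as np

    PERIOD_H = 12.4   # approximate semidiurnal tidal period in hours

    def discharge(t_h):
        """Synthetic, slightly asymmetric tidal discharge in m^3/s (illustrative)."""
        return (8000.0 * np.sin(2.0 * np.pi * t_h / PERIOD_H)
                + 1500.0 * np.sin(4.0 * np.pi * t_h / PERIOD_H + 0.6))

    def trapezoid(y, x):
        """Trapezoidal integration, kept explicit for NumPy-version portability."""
        return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

    def flood_ebb_volumes(t_h, q):
        """Ebb (q > 0) and flood (q < 0) volumes in m^3, integrated separately."""
        ebb = trapezoid(np.clip(q, 0.0, None), t_h) * 3600.0
        flood = trapezoid(np.clip(q, None, 0.0), t_h) * 3600.0
        return ebb, flood

    t_dense = np.linspace(0.0, PERIOD_H, 20_000)       # ~continuous "true value"
    ebb_true, flood_true = flood_ebb_volumes(t_dense, discharge(t_dense))

    for dt in (1.0, 0.5):                              # hourly vs half-hourly
        # Sample every dt hours, always including the end of the tidal cycle.
        t_s = np.unique(np.append(np.arange(0.0, PERIOD_H, dt), PERIOD_H))
        ebb, flood = flood_ebb_volumes(t_s, discharge(t_s))
        print(f"dt = {dt:3.1f} h: ebb error {ebb - ebb_true:+12.0f} m^3, "
              f"flood error {flood - flood_true:+12.0f} m^3")
    ```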

Heinz Holling - One of the best experts on this subject based on the ideXlab platform.

  • Scrutinizing the basis of originality in divergent thinking tests: On the measurement precision of response propensity estimates.
    The British Journal of Educational Psychology, 2019
    Co-Authors: Boris Forthmann, Sue Hyeon Paek, Denis Dumas, Baptiste Barbot, Heinz Holling
    Abstract:

    BACKGROUND: The originality of divergent thinking (DT) production is one of the most critical indicators of creative potential. It is commonly scored using the statistical infrequency of responses relative to all responses provided in a given sample.
    AIMS: Response frequency estimates vary in their measurement precision. This issue has been widely overlooked and is addressed in the current study.
    SAMPLE AND METHOD: A secondary data analysis of 202 participants was performed. A total of 900 uniquely identified responses were generated on three DT tasks and subjected to a 1-parameter logistic model, with the response as the unit of measurement, which allowed the calculation of response-level conditional reliability (and marginal reliability as an overall summary of measurement precision).
    RESULTS: The marginal reliability of response propensity estimates ranged from .62 to .67 across the DT tasks. Responses that were unique in the sample (the basis for classic uniqueness scoring) displayed the lowest conditional reliability (approximately .50 across tasks). Reliability increased nonlinearly as a function of both the frequency of occurrence predicted by the model (conditional reliability) and the sample size (conditional and marginal reliability).
    CONCLUSIONS: This study indicates that the common practice of frequency-based originality scoring with typical sample sizes (e.g., N = 100 to N = 200) yields unacceptably low measurement precision, particularly for highly original responses. We offer recommendations to mitigate the imprecision of frequency-based originality scores in DT research.
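
    The paper's 1-parameter logistic analysis is not reproduced here, but a much simpler binomial stand-in already shows why rare responses are estimated imprecisely: for a response observed k times in a sample of N, the frequency estimate p_hat = k/N has standard error sqrt(p_hat(1 - p_hat)/N), which is largest relative to p_hat for unique responses.

    ```python
    import numpy as np

    # Binomial stand-in for response-frequency precision (not the paper's 1PL model).
    for N in (100, 200, 1000):          # sample sizes typical for DT studies
        for k in (1, 5, 20):            # times a response occurs in the sample
            p_hat = k / N
            se = np.sqrt(p_hat * (1.0 - p_hat) / N)
            print(f"N={N:5d}, k={k:3d}: p_hat={p_hat:.3f}, SE={se:.4f}, "
                  f"relative SE={se / p_hat:.2f}")
    # Unique responses (k = 1) carry by far the largest relative uncertainty,
    # mirroring the low conditional reliability reported for them in the paper.
    ```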

Wolfgang Dür - One of the best experts on this subject based on the ideXlab platform.

  • Detecting Large Quantum Fisher Information with Finite Measurement Precision.
    Physical Review Letters, 2016
    Co-Authors: Florian Fröwis, Pavel Sekatski, Wolfgang Dür
    Abstract:

    We propose an experimentally accessible scheme to determine lower bounds on the quantum Fisher information (QFI), which certifies multipartite entanglement or usefulness for quantum metrology. The scheme is based on comparing the measurement statistics of a state before and after a small unitary rotation. We argue that, in general, the limited resolution of collective observables prevents the detection of large QFI. This can be overcome by performing an additional operation prior to the measurement. We illustrate the power of this protocol for present-day spin-squeezing experiments, where the same operation used for the preparation of the initial spin-squeezed state also improves the measurement precision, and hence the lower bound on the QFI, by two orders of magnitude. We also establish a connection to the Leggett-Garg inequalities: we show how to simulate a variant of the inequalities with our protocol and demonstrate that a large QFI is necessary for their violation with coarse-grained detectors.
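
    A minimal sketch of the kind of fidelity-based bound used in this line of work; the paper's exact formulation and constants may differ. For a small rotation angle theta, the classical (Bhattacharyya) fidelity F_cl between the outcome distributions before and after the rotation gives a lower bound of roughly 8(1 - F_cl)/theta^2 on the QFI.

    ```python
    import numpy as np

    def qfi_lower_bound(p0, p_theta, theta):
        """Approximate QFI lower bound from coarse-grained outcome statistics.

        p0, p_theta: outcome probability distributions before/after a small
        unitary rotation by angle theta (assumed formulation, see lead-in).
        """
        f_cl = np.sum(np.sqrt(p0 * p_theta))   # classical (Bhattacharyya) fidelity
        return 8.0 * (1.0 - f_cl) / theta**2

    # Toy example with three coarse-grained detector bins.
    p0 = np.array([0.50, 0.30, 0.20])
    p_theta = np.array([0.40, 0.35, 0.25])
    print(qfi_lower_bound(p0, p_theta, theta=0.05))
    ```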