14,000,000 Leading Edge Experts on the ideXlab platform

Covariance

The experts below are selected from a list of 392,418 experts worldwide, ranked by the ideXlab platform.

Ivor J. A. Simpson - One of the best experts on this subject based on the ideXlab platform.

  • CVPR - Structured Uncertainty Prediction Networks
    2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2018
    Co-Authors: Garoe Dorta, Sara Vicente, Lourdes Agapito, Neill D. F. Campbell, Ivor J. A. Simpson
    Abstract:

    This paper is the first work to propose a network to predict a structured uncertainty distribution for a synthesized image. Previous approaches have been mostly limited to predicting diagonal Covariance matrices [15]. Our novel model learns to predict a full Gaussian Covariance matrix for each reconstruction, which permits efficient sampling and likelihood evaluation. We demonstrate that our model can accurately reconstruct ground-truth correlated residual distributions for synthetic datasets and generate plausible high-frequency samples for real face images. We also illustrate the use of these predicted Covariances for structure-preserving image denoising.
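The sampling-and-likelihood efficiency can be made concrete. If the network's output is parameterised as a lower-triangular Cholesky factor L with Sigma = L Lᵀ (a common choice for predicting full Covariances; the paper's exact parameterisation is not reproduced here), sampling costs one matrix-vector product and the Gaussian log-likelihood needs only a triangular solve plus the diagonal of L. A minimal NumPy sketch with toy values:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # toy "image" with 4 pixels

# Stand-ins for network outputs: a mean image and a lower-triangular factor L
# with positive diagonal, so that Sigma = L @ L.T is a valid full covariance.
mu = rng.normal(size=d)
A = rng.normal(size=(d, d))
L = np.tril(A)
np.fill_diagonal(L, np.abs(np.diag(A)) + 0.1)

# Sampling: x = mu + L z with z ~ N(0, I) -- one matrix-vector product.
z = rng.normal(size=d)
x = mu + L @ z

# Log-likelihood: (x - mu)^T Sigma^{-1} (x - mu) = ||L^{-1}(x - mu)||^2, and
# log|Sigma| = 2 * sum(log diag(L)); no explicit inversion of Sigma is needed.
r = np.linalg.solve(L, x - mu)  # a dedicated triangular solve in practice
logdet = 2.0 * np.log(np.diag(L)).sum()
loglik = -0.5 * (r @ r + logdet + d * np.log(2.0 * np.pi))
```

Since x was generated from z, the solve recovers r = z up to round-off, and loglik agrees with the dense multivariate-normal formula evaluated with an explicit inverse of Sigma.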

  • Structured Uncertainty Prediction Networks
    arXiv: Machine Learning, 2018
    Co-Authors: Garoe Dorta, Sara Vicente, Lourdes Agapito, Neill D. F. Campbell, Ivor J. A. Simpson
    Abstract:

    This paper is the first work to propose a network to predict a structured uncertainty distribution for a synthesized image. Previous approaches have been mostly limited to predicting diagonal Covariance matrices. Our novel model learns to predict a full Gaussian Covariance matrix for each reconstruction, which permits efficient sampling and likelihood evaluation. We demonstrate that our model can accurately reconstruct ground-truth correlated residual distributions for synthetic datasets and generate plausible high-frequency samples for real face images. We also illustrate the use of these predicted Covariances for structure-preserving image denoising.

Tim Eifler - One of the best experts on this subject based on the ideXlab platform.

  • Precision matrix expansion – efficient use of numerical simulations in estimating errors on cosmological parameters
    Monthly Notices of the Royal Astronomical Society, 2018
    Co-Authors: Oliver Friedrich, Tim Eifler
    Abstract:

    Computing the inverse Covariance matrix (or precision matrix) of large data vectors is crucial in weak lensing (and multiprobe) analyses of the large-scale structure of the Universe. Analytically computed Covariances are noise-free and hence straightforward to invert; however, the model approximations might be insufficient for the statistical precision of future cosmological data. Estimating Covariances from numerical simulations improves on these approximations, but the sample Covariance estimator is inherently noisy, which introduces uncertainties in the error bars on cosmological parameters and also additional scatter in their best-fitting values. For future surveys, reducing both effects to an acceptable level requires an unfeasibly large number of simulations. In this paper we describe a way to expand the precision matrix around a Covariance model and show how to estimate the leading order terms of this expansion from simulations. This is especially powerful if the Covariance matrix is the sum of two contributions, C = A + B, where A is well understood analytically and can be turned off in simulations (e.g. shape noise for cosmic shear) to yield a direct estimate of B. We test our method in mock experiments resembling tomographic weak lensing data vectors from the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope (LSST). For DES we find that 400 N-body simulations are sufficient to achieve negligible statistical uncertainties on parameter constraints. For LSST this is achieved with 2400 simulations. The standard Covariance estimator would require >10^5 simulations to reach a similar precision. We extend our analysis to a DES multiprobe case finding a similar performance.
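The expansion can be illustrated numerically. If the correction B is small relative to the analytic part A, the precision matrix admits the series (A + B)^{-1} = A^{-1} - A^{-1} B A^{-1} + A^{-1} B A^{-1} B A^{-1} - ..., so only the B-dependent terms need input from simulations. A toy NumPy check (my own matrices, not the paper's data vectors):

```python
import numpy as np

rng = np.random.default_rng(1)
d = 5

# A: well-understood analytic part (e.g. shape noise); B: small correction
# that would be estimated from simulations with A "turned off".
A = np.diag(rng.uniform(1.0, 2.0, size=d))
M = 0.05 * rng.normal(size=(d, d))
B = M @ M.T  # symmetric positive semi-definite, small compared to A

Ainv = np.linalg.inv(A)
order1 = Ainv - Ainv @ B @ Ainv                 # first-order expansion
order2 = order1 + Ainv @ B @ Ainv @ B @ Ainv    # second-order expansion

# Compare against the exact precision matrix of the summed covariance.
truth = np.linalg.inv(A + B)
err1 = np.abs(order1 - truth).max()
err2 = np.abs(order2 - truth).max()
```

Each added order shrinks the residual geometrically (roughly by the spectral norm of A^{-1} B), which is why estimating only the low-order B terms from simulations can be far cheaper than estimating and inverting the full sample Covariance.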

  • Precision matrix expansion – efficient use of numerical simulations in estimating errors on cosmological parameters
    Monthly Notices of the Royal Astronomical Society, 2017
    Co-Authors: Oliver Friedrich, Tim Eifler
    Abstract:

    Computing the inverse Covariance matrix (or precision matrix) of large data vectors is crucial in weak lensing (and multiprobe) analyses of the large-scale structure of the Universe. Analytically computed Covariances are noise-free and hence straightforward to invert; however, the model approximations might be insufficient for the statistical precision of future cosmological data. Estimating Covariances from numerical simulations improves on these approximations, but the sample Covariance estimator is inherently noisy, which introduces uncertainties in the error bars on cosmological parameters and also additional scatter in their best-fitting values. For future surveys, reducing both effects to an acceptable level requires an unfeasibly large number of simulations. In this paper we describe a way to expand the precision matrix around a Covariance model and show how to estimate the leading order terms of this expansion from simulations. This is especially powerful if the Covariance matrix is the sum of two contributions, C = A + B, where A is well understood analytically and can be turned off in simulations (e.g. shape noise for cosmic shear) to yield a direct estimate of B. We test our method in mock experiments resembling tomographic weak lensing data vectors from the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope (LSST). For DES we find that 400 N-body simulations are sufficient to achieve negligible statistical uncertainties on parameter constraints. For LSST this is achieved with 2400 simulations. The standard Covariance estimator would require >10^5 simulations to reach a similar precision. We extend our analysis to a DES multiprobe case finding a similar performance.

Garoe Dorta - One of the best experts on this subject based on the ideXlab platform.

  • CVPR - Structured Uncertainty Prediction Networks
    2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2018
    Co-Authors: Garoe Dorta, Sara Vicente, Lourdes Agapito, Neill D. F. Campbell, Ivor J. A. Simpson
    Abstract:

    This paper is the first work to propose a network to predict a structured uncertainty distribution for a synthesized image. Previous approaches have been mostly limited to predicting diagonal Covariance matrices [15]. Our novel model learns to predict a full Gaussian Covariance matrix for each reconstruction, which permits efficient sampling and likelihood evaluation. We demonstrate that our model can accurately reconstruct ground-truth correlated residual distributions for synthetic datasets and generate plausible high-frequency samples for real face images. We also illustrate the use of these predicted Covariances for structure-preserving image denoising.

  • Structured Uncertainty Prediction Networks
    arXiv: Machine Learning, 2018
    Co-Authors: Garoe Dorta, Sara Vicente, Lourdes Agapito, Neill D. F. Campbell, Ivor J. A. Simpson
    Abstract:

    This paper is the first work to propose a network to predict a structured uncertainty distribution for a synthesized image. Previous approaches have been mostly limited to predicting diagonal Covariance matrices. Our novel model learns to predict a full Gaussian Covariance matrix for each reconstruction, which permits efficient sampling and likelihood evaluation. We demonstrate that our model can accurately reconstruct ground-truth correlated residual distributions for synthetic datasets and generate plausible high-frequency samples for real face images. We also illustrate the use of these predicted Covariances for structure-preserving image denoising.

Oliver Friedrich - One of the best experts on this subject based on the ideXlab platform.

  • Precision matrix expansion – efficient use of numerical simulations in estimating errors on cosmological parameters
    Monthly Notices of the Royal Astronomical Society, 2018
    Co-Authors: Oliver Friedrich, Tim Eifler
    Abstract:

    Computing the inverse Covariance matrix (or precision matrix) of large data vectors is crucial in weak lensing (and multiprobe) analyses of the large-scale structure of the Universe. Analytically computed Covariances are noise-free and hence straightforward to invert; however, the model approximations might be insufficient for the statistical precision of future cosmological data. Estimating Covariances from numerical simulations improves on these approximations, but the sample Covariance estimator is inherently noisy, which introduces uncertainties in the error bars on cosmological parameters and also additional scatter in their best-fitting values. For future surveys, reducing both effects to an acceptable level requires an unfeasibly large number of simulations. In this paper we describe a way to expand the precision matrix around a Covariance model and show how to estimate the leading order terms of this expansion from simulations. This is especially powerful if the Covariance matrix is the sum of two contributions, C = A + B, where A is well understood analytically and can be turned off in simulations (e.g. shape noise for cosmic shear) to yield a direct estimate of B. We test our method in mock experiments resembling tomographic weak lensing data vectors from the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope (LSST). For DES we find that 400 N-body simulations are sufficient to achieve negligible statistical uncertainties on parameter constraints. For LSST this is achieved with 2400 simulations. The standard Covariance estimator would require >10^5 simulations to reach a similar precision. We extend our analysis to a DES multiprobe case finding a similar performance.

  • Precision matrix expansion – efficient use of numerical simulations in estimating errors on cosmological parameters
    Monthly Notices of the Royal Astronomical Society, 2017
    Co-Authors: Oliver Friedrich, Tim Eifler
    Abstract:

    Computing the inverse Covariance matrix (or precision matrix) of large data vectors is crucial in weak lensing (and multiprobe) analyses of the large-scale structure of the Universe. Analytically computed Covariances are noise-free and hence straightforward to invert; however, the model approximations might be insufficient for the statistical precision of future cosmological data. Estimating Covariances from numerical simulations improves on these approximations, but the sample Covariance estimator is inherently noisy, which introduces uncertainties in the error bars on cosmological parameters and also additional scatter in their best-fitting values. For future surveys, reducing both effects to an acceptable level requires an unfeasibly large number of simulations. In this paper we describe a way to expand the precision matrix around a Covariance model and show how to estimate the leading order terms of this expansion from simulations. This is especially powerful if the Covariance matrix is the sum of two contributions, C = A + B, where A is well understood analytically and can be turned off in simulations (e.g. shape noise for cosmic shear) to yield a direct estimate of B. We test our method in mock experiments resembling tomographic weak lensing data vectors from the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope (LSST). For DES we find that 400 N-body simulations are sufficient to achieve negligible statistical uncertainties on parameter constraints. For LSST this is achieved with 2400 simulations. The standard Covariance estimator would require >10^5 simulations to reach a similar precision. We extend our analysis to a DES multiprobe case finding a similar performance.

Fabien Lacasa - One of the best experts on this subject based on the ideXlab platform.

  • The impact of braiding Covariance and in-survey Covariance on next-generation galaxy surveys
    Astronomy & Astrophysics, 2020
    Co-Authors: Fabien Lacasa
    Abstract:

    As galaxy surveys become more precise and push to smaller scales, the need for accurate Covariances beyond the classical Gaussian formula becomes more acute. Here, I investigate the analytical implementation and impact of non-Gaussian Covariance terms that I previously derived for galaxy clustering. Braiding Covariance is one such class of terms; it gets contributions from both in-survey and super-survey modes. I present an approximation for braiding Covariance which speeds up the numerical computation. I show that including braiding Covariance is a necessary condition for including other non-Gaussian terms: the in-survey 2-, 3- and 4-halo Covariance, which yield Covariance matrices with negative eigenvalues if considered on their own. I then quantify the impact on parameter constraints, with forecasts for a Euclid-like survey. Compared to the Gaussian case, braiding and in-survey Covariances significantly increase the error bars on cosmological parameters, in particular by 50% for w. The Halo Occupation Distribution (HOD) error bars are also affected, by between 12% and 39%. Accounting for super-sample Covariance (SSC) also increases parameter errors, by 90% for w and between 7% and 64% for HOD. In total, non-Gaussianity increases the error bar on w by 120% (between 15% and 80% for other cosmological parameters), and the error bars on HOD parameters by between 17% and 85%. Accounting for the 1-halo trispectrum term on top of SSC is not sufficient for capturing the full non-Gaussian impact: braiding and the rest of in-survey Covariance have to be accounted for. Finally, I discuss why the inclusion of non-Gaussianity generally eases parameter degeneracies, making cosmological constraints more robust to astrophysical uncertainties. The data and a Python notebook reproducing the results and plots of the article are available at \url{this https URL}. [Abridged]
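The positivity point can be demonstrated with toy matrices (illustrative only; these are not the paper's actual Covariance terms): an individual contribution to a Covariance matrix may be symmetric yet indefinite, and only the sum of all contributions is guaranteed safe to treat, and invert, as a Covariance.

```python
import numpy as np

rng = np.random.default_rng(2)
d = 4

# One "partial" covariance contribution: symmetric, but shifted so that its
# minimum eigenvalue is exactly -1, mimicking a term that is invalid as a
# covariance when considered on its own.
S = rng.normal(size=(d, d))
term = 0.5 * (S + S.T)
term -= (np.linalg.eigvalsh(term).min() + 1.0) * np.eye(d)

# Adding the remaining contributions (here a simple positive shift) restores
# positive-definiteness, so the total is a usable covariance matrix.
total = term + 2.0 * np.eye(d)

min_eig_term = np.linalg.eigvalsh(term).min()    # negative: indefinite alone
min_eig_total = np.linalg.eigvalsh(total).min()  # positive: valid once summed
```

A sampling- or likelihood-based analysis should therefore always check the eigenvalue spectrum of the assembled Covariance, not of its individual terms.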

  • The impact of braiding Covariance and in-survey Covariance on next-generation galaxy surveys
    Astron.Astrophys., 2020
    Co-Authors: Fabien Lacasa
    Abstract:

    As galaxy surveys improve their precision thanks to lower levels of noise and the push toward small, non-linear scales, the need for accurate Covariances beyond the classical Gaussian formula becomes more acute. Here I investigate the analytical implementation and impact of non-Gaussian Covariance terms that I had previously derived for the galaxy angular power spectrum. Braiding Covariance is one such interesting class of terms; it gets contributions from both in-survey and super-survey modes, the latter proving difficult to calibrate through simulations. I present an approximation for braiding Covariance which speeds up its numerical computation. I show that including braiding Covariance is a necessary condition for including other non-Gaussian terms, namely the in-survey 2-, 3-, and 4-halo Covariance. Indeed, these terms yield incorrect Covariance matrices with negative eigenvalues if considered on their own. I then move on to quantify the impact on parameter constraints, with forecasts for a survey with Euclid-like galaxy density and angular scales. Compared with the Gaussian case, braiding and in-survey Covariances significantly increase the error bars on cosmological parameters, in particular by 50% for the dark energy equation of state w. The error bars on the halo occupation distribution (HOD) parameters are also affected, by between 12% and 39%. Accounting for super-sample Covariance (SSC) also increases parameter errors, by 90% for w and between 7% and 64% for HOD. In total, non-Gaussianity increases the error bar on w by 120% (between 15% and 80% for other cosmological parameters) and the error bars on HOD parameters by between 17% and 85%. Accounting for the 1-halo trispectrum term on top of SSC, as has been done in some current analyses, is not sufficient for capturing the full non-Gaussian impact: braiding and the rest of in-survey Covariance have to be accounted for. Finally, I discuss why the inclusion of non-Gaussianity generally eases parameter degeneracies, making cosmological constraints more robust to astrophysical uncertainties. I have publicly released the data and a Python notebook reproducing the results and plots of the article.

    Key words: large-scale structure of Universe / methods: analytical / galaxies: statistics

    The data and the Python notebook are available at https://github.com/fabienlacasa/BraidingArticle

  • Braiding Covariance, in-survey Covariance, and their impact on next-gen galaxy surveys
    arXiv: Cosmology and Nongalactic Astrophysics, 2019
    Co-Authors: Fabien Lacasa
    Abstract:

    As galaxy surveys become more precise and push to smaller scales, the need for accurate Covariances beyond the vanilla Gaussian formula becomes more acute. Here, I investigate the analytical implementation and impact of non-Gaussian Covariance terms that I uncovered for galaxy clustering (Lacasa 2018). Braiding Covariance is one such new class, which gets contributions from both in-survey and super-survey modes. I present an approximation to Braiding Covariance that makes it fast to compute numerically. I show that accounting for Braiding Covariance is necessary when including other non-Gaussian terms (the in-survey 2-, 3- and 4-halo Covariance, which quantify coupling between large and small scales) in order to ensure the positivity of the Covariance matrix. I then quantify the impact on parameter constraints, with forecasts for a Euclid-like survey. Compared to the Gaussian case, Braiding and in-survey Covariances significantly increase the error bars on all cosmological parameters of the wCDM model, in particular by 50% for w. The error bars on Halo Occupation Distribution (HOD) parameters are also affected, by between 12% and 39%. Accounting for super-sample Covariance (SSC) also increases parameter errors, by 90% for w and between 7% and 64% for HOD. In total, non-Gaussianity increases the error bar on w by 120% (between 15% and 80% for other cosmological parameters), and the HOD error bars by between 17% and 85%. Accounting for the 1-halo trispectrum term on top of SSC is not sufficient to capture the full non-Gaussian impact: Braiding and the rest of in-survey Covariance have to be accounted for. I finally discuss why the inclusion of non-Gaussianity generally eases parameter degeneracies, making cosmological constraints more robust to astrophysical uncertainties. Data and a notebook reproducing all plots and results are available at \url{this https URL}. [Abridged]