Asymptotic Result


The experts below are selected from a list of 79,503 experts worldwide, ranked by the ideXlab platform.

Ryo Hayakawa - One of the best experts on this subject based on the ideXlab platform.

  • Error analysis of Douglas-Rachford algorithm for linear inverse problems: Asymptotics of proximity operator for squared loss
    arXiv: Signal Processing, 2021
    Co-Authors: Ryo Hayakawa
    Abstract:

    Proximal splitting-based convex optimization is a promising approach to linear inverse problems because it can exploit prior knowledge of the unknown variables explicitly. In this paper, we first analyze the asymptotic property of the proximity operator for the squared loss function, which appears in the update equations of several proximal splitting methods for linear inverse problems. The analysis shows that, in the large-system limit, the output of the proximity operator can be characterized by a scalar random variable. Moreover, we investigate the asymptotic behavior of the Douglas-Rachford algorithm, one of the best-known proximal splitting methods. From the asymptotic result, we can predict the evolution of the mean-square error (MSE) of the algorithm for large-scale linear inverse problems. Simulation results demonstrate that the MSE performance of the Douglas-Rachford algorithm in compressed sensing with $\ell_{1}$ optimization is well predicted by the analytical result.
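
    As a rough illustration of the kind of iteration being analyzed (a minimal sketch, not the paper's exact setup), the snippet below runs Douglas-Rachford splitting on $\ell_{1}$-regularized least squares, alternating the proximity operators of the squared loss and the $\ell_{1}$ norm; the problem sizes, step size gamma, and weight lam are illustrative assumptions.

```python
# Minimal Douglas-Rachford sketch for min (1/2)||y - Ax||^2 + lam*||x||_1.
# All problem parameters below are illustrative, not taken from the paper.
import numpy as np

def prox_squared_loss(v, A, y, gamma):
    """Prox of f(x) = (1/2)||y - Ax||^2: solves (I + gamma*A^T A) x = v + gamma*A^T y."""
    n = A.shape[1]
    return np.linalg.solve(np.eye(n) + gamma * (A.T @ A), v + gamma * (A.T @ y))

def prox_l1(v, t):
    """Prox of t*||.||_1: elementwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def douglas_rachford(A, y, lam, gamma=1.0, iters=200):
    z = np.zeros(A.shape[1])
    for _ in range(iters):
        x = prox_squared_loss(z, A, y, gamma)        # data-fidelity step
        z = z + prox_l1(2 * x - z, gamma * lam) - x  # reflected l1 step
    return prox_squared_loss(z, A, y, gamma)

# Toy compressed-sensing instance with a sparse ground truth.
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200)) / np.sqrt(80)
x0 = np.zeros(200); x0[:10] = rng.standard_normal(10)
y = A @ x0 + 0.01 * rng.standard_normal(80)
x_hat = douglas_rachford(A, y, lam=0.05)
print("MSE:", np.mean((x_hat - x0) ** 2))
```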

  • Noise variance estimation using asymptotic residual in compressed sensing
    arXiv: Signal Processing, 2020
    Co-Authors: Ryo Hayakawa
    Abstract:

    In compressed sensing, the measurement is usually contaminated by additive noise, and hence information about the noise variance is often required to design algorithms. In this paper, we propose an estimation method for the unknown noise variance in compressed sensing problems. The proposed method, called asymptotic residual matching (ARM), estimates the noise variance from a single measurement vector on the basis of an asymptotic result for the $\ell_{1}$ optimization problem. Specifically, we derive the asymptotic residual corresponding to the $\ell_{1}$ optimization and show that it depends on the noise variance. The proposed ARM approach obtains the estimate by comparing the asymptotic residual with the actual one, which can be obtained by empirical reconstruction without knowledge of the noise variance. Simulation results show that the proposed noise variance estimation outperforms a conventional method based on the analysis of ridge-regularized least squares. We also show that, with the proposed method, we can achieve good reconstruction performance in compressed sensing even when the noise variance is unknown.
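
    The matching step can be caricatured as one-dimensional curve matching. The sketch below is only loosely related to ARM: where the paper uses a closed-form asymptotic residual, we substitute a crude Monte Carlo stand-in that assumes a known signal model (which the paper's analysis does not need); all sizes and the Lasso weight are illustrative.

```python
# Hedged caricature of residual matching: compare the observed l1 residual with a
# Monte Carlo stand-in for the asymptotic residual curve over candidate variances.
# NOTE: the stand-in uses the true signal x0, which ARM itself does not require.
import numpy as np
from sklearn.linear_model import Lasso

def empirical_residual(A, y, lam):
    x_hat = Lasso(alpha=lam, fit_intercept=False, max_iter=10000).fit(A, y).coef_
    return np.mean((y - A @ x_hat) ** 2)

rng = np.random.default_rng(1)
m, n, lam = 100, 300, 0.05
A = rng.standard_normal((m, n)) / np.sqrt(m)
x0 = np.zeros(n); x0[:15] = rng.standard_normal(15)
sigma2_true = 0.1
y = A @ x0 + np.sqrt(sigma2_true) * rng.standard_normal(m)
r_obs = empirical_residual(A, y, lam)

# Stand-in residual curve: average residuals of synthetic problems at each
# candidate noise variance, then pick the candidate whose residual matches.
candidates = np.linspace(0.01, 0.5, 25)
curve = [np.mean([empirical_residual(A, A @ x0 + np.sqrt(s2) * rng.standard_normal(m), lam)
                  for _ in range(5)]) for s2 in candidates]
sigma2_hat = candidates[np.argmin(np.abs(np.array(curve) - r_obs))]
print(f"true sigma^2 = {sigma2_true}, estimated = {sigma2_hat:.3f}")
```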

Igor Prunster - One of the best experts on this subject based on the ideXlab platform.

  • Bayesian non-parametric inference for species variety with a two-parameter Poisson-Dirichlet process prior
    Journal of The Royal Statistical Society Series B-statistical Methodology, 2009
    Co-Authors: Stefano Favaro, Antonio Lijoi, Ramses H Mena, Igor Prunster
    Abstract:

    A Bayesian non-parametric methodology has recently been proposed to deal with the issue of prediction within species sampling problems. Such problems concern the evaluation, conditional on a sample of size n, of the species variety featured by an additional sample of size m. Genomic applications pose the additional challenge of having to deal with large values of both n and m, in which case the computation of the Bayesian non-parametric estimators is cumbersome and prevents their implementation. We focus on the two-parameter Poisson-Dirichlet model and provide completely explicit expressions for the corresponding estimators, which can be easily evaluated for any sizes of n and m. We also study the asymptotic behaviour of the number of new species conditional on the observed sample: such an asymptotic result, combined with a suitable simulation scheme, allows us to derive asymptotic highest posterior density intervals for the estimates of interest. Finally, we illustrate the implementation of the proposed methodology by the analysis of five expressed sequence tags data sets.
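
    A minimal sketch of the kind of explicit estimator the paper derives, assuming the two-parameter Poisson-Dirichlet PD(sigma, theta) prior: given n observations containing j distinct species, the expected number of new species in m further draws is (j + theta/sigma)((theta+n+sigma)_m / (theta+n)_m - 1), with (a)_m the rising factorial; the parameter values below are illustrative, not fitted.

```python
# Expected number of new species under a two-parameter Poisson-Dirichlet prior,
# computed in log-space so that large n and m pose no numerical difficulty.
import math
from scipy.special import gammaln

def rising_log(a, m):
    """log of the rising factorial (a)_m = Gamma(a + m) / Gamma(a)."""
    return gammaln(a + m) - gammaln(a)

def expected_new_species(n, j, m, sigma, theta):
    ratio = math.exp(rising_log(theta + n + sigma, m) - rising_log(theta + n, m))
    return (j + theta / sigma) * (ratio - 1.0)

# EST-like scale: 10,000 reads with 2,000 distinct genes observed, predicting
# new genes among 20,000 further reads (sigma and theta are illustrative).
print(expected_new_species(n=10_000, j=2_000, m=20_000, sigma=0.6, theta=500.0))
```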

Ryan J Tibshirani - One of the best experts on this subject based on the ideXlab platform.

  • Uniform asymptotic inference and the bootstrap after model selection
    Annals of Statistics, 2018
    Co-Authors: Ryan J Tibshirani, Alessandro Rinaldo, Larry Wasserman
    Abstract:

    Recently, Tibshirani et al. (2014) developed a method for making inferences on parameters after model selection, in a regression setting with normally distributed errors. In this work, we study the large-sample properties of this method without assuming normality. We prove that the test statistic of Tibshirani et al. (2014) is asymptotically pivotal as the number of samples n grows and the dimension d of the regression problem stays fixed; our asymptotic result is uniformly valid over a wide class of nonnormal error distributions. We also propose an efficient bootstrap version of this test that is provably (asymptotically) conservative and, in practice, often delivers shorter confidence intervals than the original normality-based approach. Finally, we prove that the test statistic of Tibshirani et al. (2014) does not converge uniformly in a high-dimensional setting, when the dimension d is allowed to grow.
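
    To convey the flavor of bootstrapping after selection (a deliberately naive sketch, not the paper's test statistic or its guarantees), the snippet below runs one step of forward selection and then a residual bootstrap conditioned on the same variable being selected; the data-generating model is an assumption for illustration.

```python
# Naive "conditional" residual bootstrap after one step of forward selection.
# This only illustrates the idea of inference after selection; the paper's
# statistic and its uniform-validity guarantees are considerably more refined.
import numpy as np

rng = np.random.default_rng(2)
n, d = 200, 5
X = rng.standard_normal((n, d))
y = 0.5 * X[:, 0] + rng.standard_normal(n)  # errors need not be normal here

def select_and_fit(X, y):
    j = int(np.argmax(np.abs(X.T @ y)))              # one-step forward selection
    beta = float(X[:, j] @ y / (X[:, j] @ X[:, j]))  # simple regression on X_j
    return j, beta

j_hat, beta_hat = select_and_fit(X, y)
resid = y - X[:, j_hat] * beta_hat

boot = []
for _ in range(2000):
    y_star = X[:, j_hat] * beta_hat + rng.choice(resid, size=n, replace=True)
    j_star, b_star = select_and_fit(X, y_star)
    if j_star == j_hat:  # keep replicates on which the same variable is selected
        boot.append(b_star)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"selected x_{j_hat}, 95% conditional bootstrap interval: [{lo:.3f}, {hi:.3f}]")
```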

  • Uniform asymptotic inference and the bootstrap after model selection
    arXiv: Statistics Theory, 2015
    Co-Authors: Ryan J Tibshirani, Alessandro Rinaldo, Larry Wasserman
    Abstract:

    Recently, Tibshirani et al. (2016) proposed a method for making inferences about parameters defined by model selection, in a typical regression setting with normally distributed errors. Here, we study the large-sample properties of this method without assuming normality. We prove that the test statistic of Tibshirani et al. (2016) is asymptotically valid as the number of samples n grows and the dimension d of the regression problem stays fixed. Our asymptotic result holds uniformly over a wide class of nonnormal error distributions. We also propose an efficient bootstrap version of this test that is provably (asymptotically) conservative and, in practice, often delivers shorter intervals than those from the original normality-based approach. Finally, we prove that the test statistic of Tibshirani et al. (2016) does not enjoy uniform validity in a high-dimensional setting, when the dimension d is allowed to grow.

Alexandru Zaharescu - One of the best experts on this subject based on the ideXlab platform.

  • Twisted second moments of the Riemann zeta-function and applications
    Journal of Mathematical Analysis and Applications, 2016
    Co-Authors: Nicolas Robles, Arindam Roy, Alexandru Zaharescu
    Abstract:

    In order to compute a twisted second moment of the Riemann zeta-function, two different mollifiers, each a combination of two different Dirichlet polynomials, were introduced separately by Bui, Conrey, and Young, and by Feng. In this article we introduce a mollifier which is a combination of four Dirichlet polynomials of different shapes. We provide an asymptotic result for the twisted second moment of $\zeta(s)$ for this choice of mollifier. As an application of our results, we obtain a small increase in the proportion of zeros of the Riemann zeta-function on the critical line.
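
    For orientation, a standard-shape sketch of the objects involved (notation assumed here, not the paper's exact four-piece mollifier):

```latex
% Twisted (mollified) second moment: pair zeta with a Dirichlet polynomial
% psi chosen so that zeta * psi is close to 1 on average on the critical line.
\[
  I(T) = \int_{0}^{T} \bigl|\zeta(\tfrac12 + it)\bigr|^{2}
                      \bigl|\psi(\tfrac12 + it)\bigr|^{2}\,dt,
  \qquad
  \psi(s) = \sum_{n \le y} \frac{\mu(n)}{n^{s}}
            P\!\Bigl(\frac{\log(y/n)}{\log y}\Bigr),
\]
% where \mu is the Moebius function and P is a polynomial with P(0) = 0.
% An asymptotic for I(T) feeds into Levinson's method to lower-bound the
% proportion of zeros of \zeta(s) lying on the critical line.
```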

Larry Wasserman - One of the best experts on this subject based on the ideXlab platform.

  • Uniform asymptotic inference and the bootstrap after model selection
    Annals of Statistics, 2018
    Co-Authors: Ryan J Tibshirani, Alessandro Rinaldo, Larry Wasserman
    Abstract:

    Recently, Tibshirani et al. (2014) developed a method for making inferences on parameters after model selection, in a regression setting with normally distributed errors. In this work, we study the large-sample properties of this method without assuming normality. We prove that the test statistic of Tibshirani et al. (2014) is asymptotically pivotal as the number of samples n grows and the dimension d of the regression problem stays fixed; our asymptotic result is uniformly valid over a wide class of nonnormal error distributions. We also propose an efficient bootstrap version of this test that is provably (asymptotically) conservative and, in practice, often delivers shorter confidence intervals than the original normality-based approach. Finally, we prove that the test statistic of Tibshirani et al. (2014) does not converge uniformly in a high-dimensional setting, when the dimension d is allowed to grow.

  • Uniform asymptotic inference and the bootstrap after model selection
    arXiv: Statistics Theory, 2015
    Co-Authors: Ryan J Tibshirani, Alessandro Rinaldo, Larry Wasserman
    Abstract:

    Recently, Tibshirani et al. (2016) proposed a method for making inferences about parameters defined by model selection, in a typical regression setting with normally distributed errors. Here, we study the large-sample properties of this method without assuming normality. We prove that the test statistic of Tibshirani et al. (2016) is asymptotically valid as the number of samples n grows and the dimension d of the regression problem stays fixed. Our asymptotic result holds uniformly over a wide class of nonnormal error distributions. We also propose an efficient bootstrap version of this test that is provably (asymptotically) conservative and, in practice, often delivers shorter intervals than those from the original normality-based approach. Finally, we prove that the test statistic of Tibshirani et al. (2016) does not enjoy uniform validity in a high-dimensional setting, when the dimension d is allowed to grow.