Nonparametric Regression

The Experts below are selected from a list of 37,050 Experts worldwide, ranked by the ideXlab platform.

Harrison H. Zhou - One of the best experts on this subject based on the ideXlab platform.

  • Nonparametric Regression in natural exponential families
    Institute of Mathematical Statistics Collections, 2010
    Co-Authors: T. Tony Cai, Harrison H. Zhou
    Abstract:

    Theory and methodology for Nonparametric Regression have been particularly well developed in the case of additive homoscedastic Gaussian noise. Inspired by asymptotic equivalence theory, there have been ongoing efforts in recent years to construct explicit procedures that turn other function estimation problems into a standard Nonparametric Regression with Gaussian noise. Then in principle any good Gaussian Nonparametric Regression method can be used to solve those more complicated Nonparametric models. In particular, Brown, Cai and Zhou [3] considered Nonparametric Regression in natural exponential families with a quadratic variance function. In this paper we extend the scope of Brown, Cai and Zhou [3] to general natural exponential families by introducing a new explicit procedure that is based on the variance stabilizing transformation. The new approach significantly reduces the bias of the inverse transformation and as a consequence it enables the method to be applicable to a wider class of exponential families. Combining this procedure with a wavelet block thresholding estimator for Gaussian Nonparametric Regression, we show that the resulting estimator enjoys a high degree of adaptivity and spatial adaptivity with near-optimal asymptotic performance over a broad range of Besov spaces.
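The general recipe can be illustrated in the Poisson case, where the classical Anscombe root transform plays the role of the variance stabilizing transformation. The sketch below is an illustrative simulation only (the paper's actual procedure also involves binning the data and a bias-corrected inverse transform); it checks that the transformed counts have variance close to 1 across a range of means:

```python
import numpy as np

rng = np.random.default_rng(0)

def anscombe(x):
    """Anscombe variance stabilizing transform for Poisson counts.

    For X ~ Poisson(mu), 2*sqrt(X + 3/8) has variance close to 1
    for moderate-to-large mu; the constant 3/8 is tuned so the
    variance is stabilized to higher order than with 2*sqrt(X)."""
    return 2.0 * np.sqrt(x + 3.0 / 8.0)

# Check variance stabilization across a range of means.
for mu in (5.0, 10.0, 20.0):
    x = rng.poisson(mu, size=200_000)
    v = anscombe(x).var()
    print(f"mu = {mu:5.1f}   var of transformed counts = {v:.3f}")  # ~ 1
```

After the transform, the data behave approximately like homoscedastic Gaussian observations, so any standard Gaussian Nonparametric Regression procedure can be applied to them.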

  • Nonparametric Regression in Exponential Families
    The Annals of Statistics, 2010
    Co-Authors: Lawrence D. Brown, T. Tony Cai, Harrison H. Zhou
    Abstract:

    Most results in Nonparametric Regression theory are developed only for the case of additive noise. In such a setting many smoothing techniques including wavelet thresholding methods have been developed and shown to be highly adaptive. In this paper we consider Nonparametric Regression in exponential families with the main focus on the natural exponential families with a quadratic variance function, which include, for example, Poisson Regression, binomial Regression and gamma Regression. We propose a unified approach of using a mean-matching variance stabilizing transformation to turn the relatively complicated problem of Nonparametric Regression in exponential families into a standard homoscedastic Gaussian Regression problem. Then in principle any good Nonparametric Gaussian Regression procedure can be applied to the transformed data. To illustrate our general methodology, in this paper we use wavelet block thresholding to construct the final estimators of the Regression function. The procedures are easily implementable. Both theoretical and numerical properties of the estimators are investigated. The estimators are shown to enjoy a high degree of adaptivity and spatial adaptivity with near-optimal asymptotic performance over a wide range of Besov spaces. The estimators also perform well numerically.
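The distinguishing feature of the mean-matching VST can be seen already in the Poisson case: among root transforms 2*sqrt(X + c), the choice c = 1/4 makes the mean of the transformed variable match 2*sqrt(mu) to higher order, while the classical Anscombe choice c = 3/8 optimizes variance stabilization instead. A small Monte Carlo sketch (illustrative only, not the paper's full procedure):

```python
import numpy as np

rng = np.random.default_rng(1)

def root_transform(x, c):
    """Root transform 2*sqrt(x + c) for Poisson counts."""
    return 2.0 * np.sqrt(x + c)

mu = 5.0
x = rng.poisson(mu, size=1_000_000)
target = 2.0 * np.sqrt(mu)  # the mean the transform should match

# c = 1/4 (mean-matching) vs c = 0 (plain root) vs c = 3/8 (Anscombe):
for c in (0.0, 0.25, 0.375):
    bias = root_transform(x, c).mean() - target
    print(f"c = {c:5.3f}   mean bias = {bias:+.4f}")
```

The bias for c = 1/4 should come out markedly smaller than for the other constants, which is what keeps the subsequent Gaussian Regression step accurate.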

Lawrence D. Brown - One of the best experts on this subject based on the ideXlab platform.

  • Nonparametric Regression in Exponential Families
    The Annals of Statistics, 2010
    Co-Authors: Lawrence D. Brown, T. Tony Cai, Harrison H. Zhou
    Abstract:

    Most results in Nonparametric Regression theory are developed only for the case of additive noise. In such a setting many smoothing techniques including wavelet thresholding methods have been developed and shown to be highly adaptive. In this paper we consider Nonparametric Regression in exponential families with the main focus on the natural exponential families with a quadratic variance function, which include, for example, Poisson Regression, binomial Regression and gamma Regression. We propose a unified approach of using a mean-matching variance stabilizing transformation to turn the relatively complicated problem of Nonparametric Regression in exponential families into a standard homoscedastic Gaussian Regression problem. Then in principle any good Nonparametric Gaussian Regression procedure can be applied to the transformed data. To illustrate our general methodology, in this paper we use wavelet block thresholding to construct the final estimators of the Regression function. The procedures are easily implementable. Both theoretical and numerical properties of the estimators are investigated. The estimators are shown to enjoy a high degree of adaptivity and spatial adaptivity with near-optimal asymptotic performance over a wide range of Besov spaces. The estimators also perform well numerically.

  • Asymptotic equivalence theory for Nonparametric Regression with random design
    The Annals of Statistics, 2002
    Co-Authors: Lawrence D. Brown, T. Tony Cai, Mark G. Low, Cun-hui Zhang
    Abstract:

    This paper establishes the global asymptotic equivalence between Nonparametric Regression with random design and the white noise model under sharp smoothness conditions on an unknown Regression or drift function. The asymptotic equivalence is established by constructing explicit equivalence mappings between the Nonparametric Regression and the white-noise experiments, which provide synthetic observations and synthetic asymptotic solutions from either of the two experiments with asymptotic properties identical to the true observations and given asymptotic solutions from the other. The impact of such asymptotic equivalence results is that an investigation in one Nonparametric problem automatically yields asymptotically analogous results in all other asymptotically equivalent Nonparametric problems.
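The regression-to-white-noise direction of the equivalence can be caricatured numerically: the normalized partial-sum process of the responses, ordered by the design points, behaves like an observation from the white-noise-with-drift experiment. This is a hypothetical illustration, not the paper's construction, which is considerably more delicate:

```python
import numpy as np

rng = np.random.default_rng(2)

# Random-design regression: Y_i = f(X_i) + eps_i, X_i ~ Uniform(0, 1).
n = 100_000
def f(x):
    return np.sin(2 * np.pi * x)
x = rng.uniform(size=n)
y = f(x) + rng.normal(scale=0.5, size=n)

# "Synthetic" white-noise observation: sorting by the design points and
# taking normalized partial sums, Z(t) = (1/n) * sum_{X_i <= t} Y_i
# tracks the drift integral F(t) = int_0^t f(u) du up to Brownian-scale
# noise of order n^(-1/2), mimicking the white-noise experiment
#   dZ(t) = f(t) dt + n^(-1/2) * sigma * dW(t).
order = np.argsort(x)
t = x[order]
Z = np.cumsum(y[order]) / n
F = (1 - np.cos(2 * np.pi * t)) / (2 * np.pi)  # exact integral of f

print("max |Z(t) - F(t)| =", np.abs(Z - F).max())
```

The maximal deviation shrinks at the n^(-1/2) rate, which is the heuristic reason a white-noise procedure applied to such synthetic observations inherits the asymptotic behavior it has in the white-noise model.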

Marten Wegkamp - One of the best experts on this subject based on the ideXlab platform.

  • Quantization for Nonparametric Regression
    IEEE Transactions on Information Theory, 2008
    Co-Authors: László Györfi, Marten Wegkamp
    Abstract:

    The authors discuss quantization or clustering of Nonparametric Regression estimates. The main tools developed are oracle inequalities for the rate of convergence of constrained least squares estimates. These inequalities yield fast rates both for Nonparametric (unconstrained) least squares Regression and for clustering of partition Regression estimates and plug-in empirical quantizers. The bounds on the rate of convergence generalize known results for bounded errors to sub-Gaussian errors as well.
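A minimal sketch of a plug-in empirical quantizer, assuming a simple regressogram (partition estimate) on [0, 1] and a hand-rolled 1-D Lloyd (k-means) quantizer for its fitted values; the names and tuning choices are illustrative, and the paper's constrained least squares framework is more general:

```python
import numpy as np

rng = np.random.default_rng(3)

def regressogram(x, y, n_bins):
    """Partition (histogram) regression estimate on [0, 1]."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    means = np.array([y[idx == j].mean() if np.any(idx == j) else 0.0
                      for j in range(n_bins)])
    return edges, means

def lloyd_1d(values, k, n_iter=50):
    """Plain 1-D Lloyd (k-means) quantizer for the fitted values."""
    centers = np.quantile(values, np.linspace(0, 1, k))
    for _ in range(n_iter):
        assign = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(assign == j):
                centers[j] = values[assign == j].mean()
    return centers[assign]

# Simulated data; quantize the partition estimate down to 4 levels.
n = 5_000
x = rng.uniform(size=n)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=n)
edges, fitted = regressogram(x, y, n_bins=50)
quantized = lloyd_1d(fitted, k=4)  # estimate now takes only 4 values

print("distinct fitted values:   ", len(set(np.round(fitted, 8))))
print("distinct quantized values:", len(set(np.round(quantized, 8))))
```

The quantized estimate takes at most k distinct values, which is the "clustering of partition Regression estimates" setting the oracle inequalities cover.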

  • Model selection in Nonparametric Regression
    Annals of Statistics, 2003
    Co-Authors: Marten Wegkamp
    Abstract:

    Model selection using a penalized data-splitting device is studied in the context of Nonparametric Regression. Finite sample bounds under mild conditions are obtained. The resulting estimates are adaptive for large classes of functions.
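The data-splitting device can be sketched as follows: fit each candidate model on one half of the sample and select by validation error plus a complexity penalty computed on the other half. The candidate family (polynomial degrees) and the penalty constant below are purely illustrative; the paper derives a theoretically calibrated penalty.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated regression data.
n = 2_000
x = rng.uniform(-1, 1, size=n)
y = np.sin(3 * x) + rng.normal(scale=0.2, size=n)

# Split the sample: fit on the first half, select on the second half.
xf, yf = x[: n // 2], y[: n // 2]
xs, ys = x[n // 2 :], y[n // 2 :]

best_deg, best_score = 0, np.inf
for deg in range(12):
    coef = np.polyfit(xf, yf, deg)
    val_mse = np.mean((ys - np.polyval(coef, xs)) ** 2)
    # Illustrative penalty proportional to model dimension.
    score = val_mse + 0.1 * (deg + 1) / len(ys)
    if score < best_score:
        best_deg, best_score = deg, score

print("selected degree:", best_deg)
```

Because the validation half is independent of the fitted models, the selected estimate satisfies finite-sample oracle-type bounds of the kind the paper establishes.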

K. B. Kulasekera - One of the best experts on this subject based on the ideXlab platform.

  • Variable selection by stepwise slicing in Nonparametric Regression
    Statistics & Probability Letters, 2001
    Co-Authors: K. B. Kulasekera
    Abstract:

    We consider the variable selection problem in a Nonparametric Regression setting. Two stepwise procedures based on variance estimators are proposed for selecting the significant variables in a general Nonparametric Regression model. These procedures do not require multidimensional smoothing at intermediate steps, and they are based on formal tests of hypotheses, as opposed to existing methods in the literature. Asymptotic properties are examined and empirical results are given.
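The role of variance estimators can be illustrated with a difference-based (Rice-type) estimator: sorting the responses by a covariate and taking first differences cancels the signal when that covariate drives the regression function, so relevant variables yield markedly smaller variance estimates than irrelevant ones. This is a hypothetical sketch of the idea, not the authors' exact stepwise procedure:

```python
import numpy as np

rng = np.random.default_rng(5)

def rice_variance(x, y):
    """Difference-based (Rice) variance estimate after sorting on x.

    If y depends smoothly on x, first differences of the sorted
    responses cancel the signal, and mean(d^2)/2 estimates sigma^2;
    if x is irrelevant, the estimate is inflated by the signal's
    variance."""
    order = np.argsort(x)
    d = np.diff(y[order])
    return np.mean(d ** 2) / 2.0

n = 5_000
x1 = rng.uniform(size=n)  # relevant covariate
x2 = rng.uniform(size=n)  # irrelevant covariate
y = np.sin(2 * np.pi * x1) + rng.normal(scale=0.3, size=n)

v1 = rice_variance(x1, y)  # near sigma^2 = 0.09
v2 = rice_variance(x2, y)  # inflated by the signal variance
print(f"variance estimate sorting on x1: {v1:.3f}")
print(f"variance estimate sorting on x2: {v2:.3f}")
```

Comparing such one-dimensional variance estimates across candidate variables is what lets a stepwise procedure flag significant variables without any multidimensional smoothing.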

Wenxin Jiang - One of the best experts on this subject based on the ideXlab platform.

  • Model selection in spline Nonparametric Regression
    Journal of the Royal Statistical Society: Series B (Statistical Methodology), 2002
    Co-Authors: Sally Wood, Robert Kohn, Thomas S. Shively, Wenxin Jiang
    Abstract:

    A Bayesian approach is presented for model selection in Nonparametric Regression with Gaussian errors and in binary Nonparametric Regression. A smoothness prior is assumed for each component of the model and the posterior probabilities of the candidate models are approximated using the Bayesian information criterion. We study the model selection method by simulation and show that it has excellent frequentist properties and gives improved estimates of the Regression surface. All the computations are carried out efficiently using the Gibbs sampler.
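The BIC approximation to posterior model probabilities is easy to demonstrate in a simplified linear-model analogue of the spline setting (the paper itself works with smoothness priors and the Gibbs sampler): each candidate model M receives weight proportional to exp(-BIC_M / 2). The data-generating setup below is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)

n = 500
x1 = rng.uniform(size=n)
x2 = rng.uniform(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(scale=0.5, size=n)  # x2 is irrelevant

def bic(X, y):
    """BIC for a Gaussian linear model with design matrix X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return len(y) * np.log(rss / len(y)) + X.shape[1] * np.log(len(y))

ones = np.ones_like(y)
models = {
    "intercept only": np.column_stack([ones]),
    "x1": np.column_stack([ones, x1]),
    "x2": np.column_stack([ones, x2]),
    "x1 + x2": np.column_stack([ones, x1, x2]),
}
scores = {name: bic(X, y) for name, X in models.items()}

# Approximate posterior model probabilities via exp(-BIC/2),
# shifting by the minimum BIC for numerical stability.
low = min(scores.values())
w = {name: np.exp(-(s - low) / 2) for name, s in scores.items()}
total = sum(w.values())
for name in models:
    print(f"{name:15s} P(M | data) ~ {w[name] / total:.3f}")
```

The extra log(n) penalty per parameter is what lets the posterior concentrate on the smallest adequate model, mirroring the model selection behavior reported in the paper's simulations.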