The Experts below are selected from a list of 312 Experts worldwide, ranked by the ideXlab platform.
Gilles Durrieu - One of the best experts on this subject based on the ideXlab platform.
-
nonparametric recursive estimation of the derivative of the Regression Function with application to sea shores water quality
Statistical Inference for Stochastic Processes, 2019
Co-Authors: Bernard Bercu, Sami Capderou, Gilles Durrieu
Abstract: This paper is devoted to the nonparametric estimation of the derivative of the Regression Function in a nonparametric Regression model. We implement a very efficient and easy to handle statistical procedure based on the derivative of the recursive Nadaraya–Watson estimator. We establish the almost sure convergence as well as the asymptotic normality for our estimates. We also illustrate our nonparametric estimation procedure on simulated data and real life data associated with sea shores water quality and valvometry.
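The recursive flavor of the Nadaraya–Watson estimator described above can be sketched as follows. This is an illustrative toy version, not the authors' exact procedure: I assume a Gaussian kernel, a shrinking bandwidth sequence h_n = c·n^(-alpha), and hypothetical function names of my own; the derivative estimate comes from the quotient rule applied to the running numerator and denominator sums.

```python
import numpy as np

def gaussian_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def gaussian_kernel_deriv(u):
    # K'(u) for the Gaussian kernel
    return -u * gaussian_kernel(u)

def recursive_nw_with_derivative(x_grid, xs, ys, c=0.5, alpha=0.4):
    """Recursive Nadaraya-Watson estimate and its derivative on x_grid.

    Observations (xs[n], ys[n]) are processed one at a time with a
    shrinking bandwidth h_n = c * n**(-alpha), so the running sums can
    be updated online as new data arrive.
    """
    num = np.zeros_like(x_grid)   # running kernel-weighted sum of y
    den = np.zeros_like(x_grid)   # running kernel sum
    dnum = np.zeros_like(x_grid)  # derivative of num with respect to x
    dden = np.zeros_like(x_grid)  # derivative of den with respect to x
    for n, (xn, yn) in enumerate(zip(xs, ys), start=1):
        h = c * n ** (-alpha)
        u = (x_grid - xn) / h
        k = gaussian_kernel(u) / h
        kp = gaussian_kernel_deriv(u) / h**2
        num += k * yn
        den += k
        dnum += kp * yn
        dden += kp
    m_hat = num / den
    # quotient rule for the derivative of num/den
    m_hat_deriv = (dnum * den - num * dden) / den**2
    return m_hat, m_hat_deriv
```

On noise-free data from m(x) = x², the derivative estimate at x = 1 lands near m'(1) = 2, which is the kind of pointwise behavior the almost-sure convergence result above formalizes.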
-
Nonparametric estimation of the derivative of the Regression Function: application to sea shores water quality
arXiv: Statistics Theory, 2016
Co-Authors: Bernard Bercu, Sami Capderou, Gilles Durrieu
Abstract: This paper is devoted to the nonparametric estimation of the derivative of the Regression Function in a nonparametric Regression model. We implement a very efficient and easy to handle statistical procedure based on the derivative of the recursive Nadaraya-Watson estimator. We establish the almost sure convergence as well as the asymptotic normality for our estimates. We also illustrate our nonparametric estimation procedure on simulated and real life data associated with sea shores water quality and valvometry.
Holger Dette - One of the best experts on this subject based on the ideXlab platform.
-
A simple nonparametric estimator of a monotone Regression Function
Technical reports, 2020
Co-Authors: Holger Dette, Natalie Neumeyer, Kay F Pilz
Abstract: In this paper a new method for monotone estimation of a Regression Function is proposed. The estimator is obtained by combining a density estimate and a Regression estimate, and is appealing to users of conventional smoothing methods such as kernel estimators, local polynomials, series estimators or smoothing splines. The main idea of the new approach is to construct a density estimate from the estimated values m̂(i/N) (i = 1, ..., N) of the Regression Function and to use these "data" for the calculation of an estimate of the inverse of the Regression Function. The final estimate is then obtained by a numerical inversion. Compared to the conventionally used techniques for monotone estimation, the new method is computationally more efficient, because it does not require constrained optimization techniques for the calculation of the estimate. We prove asymptotic normality of the new estimate and compare its asymptotic properties with those of the unconstrained estimate. In particular, it is shown that for kernel estimates or local polynomials the monotone estimate is first order asymptotically equivalent to the unconstrained estimate. We also illustrate the performance of the new procedure by means of a simulation study.
-
estimating a convex Function in nonparametric Regression
Scandinavian Journal of Statistics, 2007
Co-Authors: Melanie Birke, Holger Dette
Abstract: A new nonparametric estimate of a convex Regression Function is proposed and its stochastic properties are studied. The method starts with an unconstrained estimate of the derivative of the Regression Function, which is first isotonized and then integrated. We prove asymptotic normality of the new estimate and show that it is first order asymptotically equivalent to the initial unconstrained estimate if the Regression Function is in fact convex. If convexity is not present, the method estimates a convex Function whose derivative has the same Lp-norm as the derivative of the (non-convex) underlying Regression Function. The finite-sample properties of the new estimate are investigated by means of a simulation study and the application of the new method is demonstrated in two data examples.
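The isotonize-then-integrate idea can be illustrated with a short sketch. This is a toy version under assumptions of my own, not the paper's exact estimator: derivative estimates are given on a uniform grid, isotonization is the classical pool-adjacent-violators algorithm (PAVA), and integration is by trapezoidal cumulative sums; all function names are hypothetical.

```python
import numpy as np

def pava(y):
    """Pool-adjacent-violators: least-squares nondecreasing fit to y."""
    vals, weights = [], []
    for v in map(float, y):
        vals.append(v)
        weights.append(1.0)
        # merge adjacent blocks while monotonicity is violated
        while len(vals) > 1 and vals[-2] > vals[-1]:
            w = weights[-2] + weights[-1]
            vals[-2] = (weights[-2] * vals[-2] + weights[-1] * vals[-1]) / w
            weights[-2] = w
            vals.pop()
            weights.pop()
    out = []
    for v, w in zip(vals, weights):
        out.extend([v] * int(round(w)))
    return np.array(out)

def convexify(x_grid, deriv_hat, m0=0.0):
    """Isotonize a derivative estimate, then integrate it back up.

    The integrated function has a nondecreasing derivative and is
    therefore convex by construction.
    """
    iso = pava(deriv_hat)
    dx = x_grid[1] - x_grid[0]
    # trapezoidal cumulative integral of the isotonized derivative
    m = m0 + np.concatenate([[0.0], np.cumsum(0.5 * (iso[1:] + iso[:-1]) * dx)])
    return m, iso
```

Feeding in a noisy derivative estimate of m(x) = x² yields an output whose second differences are nonnegative, i.e. a convex fit, without any constrained optimization.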
-
a simple nonparametric estimator of a strictly monotone Regression Function
Bernoulli, 2006
Co-Authors: Holger Dette, Natalie Neumeyer, Kay F Pilz
Abstract: A new method for monotone estimation of a Regression Function is proposed, which is potentially attractive to users of conventional smoothing methods. The main idea of the new approach is to construct a density estimate from the estimated values m(i/N) (i = 1, ..., N) of the Regression Function and to use these 'data' for the calculation of an estimate of the inverse of the Regression Function. The final estimate is then obtained by a numerical inversion. Compared to the currently available techniques for monotone estimation, the new method does not require constrained optimization. We prove asymptotic normality of the new estimate and compare the asymptotic properties with the unconstrained estimate. In particular, it is shown that for kernel estimates or local polynomials the bandwidths in the procedure can be chosen such that the monotone estimate is first order asymptotically equivalent to the unconstrained estimate. We also illustrate the performance of the new procedure by means of a simulation study.
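The inversion idea above lends itself to a short sketch. This is a simplified illustration, not the paper's exact estimator: I assume a uniform design grid on [0, 1], a Gaussian kernel for the smoothing, and a fixed bandwidth hd; the smoothed proportion of fitted values below a level t approximates the inverse Regression Function, and interpolation inverts it back. All names are my own.

```python
import numpy as np
from math import erf, sqrt

def gauss_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def inverse_estimate(t, m_hat, hd):
    """Smoothed proportion of the fitted values m_hat(i/N) lying below t.

    For a uniform design grid on [0, 1] this approximates m^{-1}(t).
    """
    return float(np.mean([gauss_cdf((t - v) / hd) for v in m_hat]))

def monotone_estimate(x, m_hat, hd=0.05):
    """Monotonize by numerically inverting the smoothed inverse."""
    ts = np.linspace(m_hat.min() - 3 * hd, m_hat.max() + 3 * hd, 400)
    inv = np.array([inverse_estimate(t, m_hat, hd) for t in ts])
    # inv is increasing in t, so interpolation performs the inversion
    return float(np.interp(x, inv, ts))
```

Applied to a wiggly, locally non-monotone set of fitted values, the result is a nondecreasing curve close to the underlying monotone trend, with no constrained optimization involved.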
-
strictly monotone and smooth nonparametric Regression for two or more variables
Technical reports, 2005
Co-Authors: Regine Scheder, Holger Dette
Abstract: In this article a new monotone nonparametric estimate for a Regression Function of two or more variables is proposed. The method starts with an unconstrained nonparametric Regression estimate and successively applies one-dimensional isotonization procedures. In the case of a strictly monotone Regression Function, it is shown that the new estimate is first order asymptotically equivalent to the unconstrained estimate, and asymptotic normality of an appropriate standardization of the estimate is established. Moreover, if the Regression Function is not monotone in one of its arguments, the constructed estimate has approximately the same Lp-norm as the initial unconstrained estimate. The methodology is also illustrated by means of a simulation study, and two data examples are analyzed.
Bernard Bercu - One of the best experts on this subject based on the ideXlab platform.
Marie Sauve - One of the best experts on this subject based on the ideXlab platform.
-
Piecewise Polynomial Estimation of a Regression Function
IEEE Transactions on Information Theory, 2010
Co-Authors: Marie Sauve
Abstract: We deal with the problem of choosing a piecewise polynomial estimator of a Regression Function s mapping [0,1]^p into R. In the first part of this paper, we consider a collection of piecewise polynomial models. Each model is defined by a partition M of [0,1]^p and a set of degrees d = (d_J)_{J ∈ M} ∈ N^M. We propose a penalized least squares criterion which selects a model whose associated piecewise polynomial estimator performs approximately as well as the best one, in the sense that its quadratic risk is close to the infimum of the risks. The risk bound we provide is nonasymptotic. In the second part, we apply this result to tree-structured collections of partitions, which resemble the one constructed in the first step of the CART algorithm, and we propose an extension of the CART algorithm to build a piecewise polynomial estimator of a Regression Function.
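A toy version of penalized piecewise-polynomial selection can make the criterion concrete. This sketch simplifies the paper's setting considerably: one dimension instead of [0,1]^p, uniform partitions instead of tree-structured ones, a single degree per model instead of per-cell degrees d_J, a BIC-style penalty of my own choosing, and an assumed known noise level sigma2. Function names are hypothetical.

```python
import numpy as np

def piecewise_poly_rss(x, y, n_pieces, degree):
    """Least-squares polynomial fit on each cell of a uniform partition of [0,1].

    Returns the residual sum of squares and the model dimension.
    """
    rss, dim = 0.0, 0
    edges = np.linspace(0.0, 1.0, n_pieces + 1)
    for j, (a, b) in enumerate(zip(edges[:-1], edges[1:])):
        # last cell is closed on the right so every point belongs somewhere
        mask = (x >= a) & (x <= b) if j == n_pieces - 1 else (x >= a) & (x < b)
        coeffs = np.polyfit(x[mask], y[mask], degree)
        rss += float(np.sum((y[mask] - np.polyval(coeffs, x[mask])) ** 2))
        dim += degree + 1
    return rss, dim

def select_model(x, y, sigma2, pieces=(1, 2, 4, 8), degrees=(0, 1)):
    """Pick the (partition size, degree) minimizing a penalized criterion."""
    n = len(x)
    best, best_crit = None, np.inf
    for n_pieces in pieces:
        for degree in degrees:
            rss, dim = piecewise_poly_rss(x, y, n_pieces, degree)
            crit = rss + sigma2 * dim * np.log(n)  # BIC-style penalty
            if crit < best_crit:
                best, best_crit = (n_pieces, degree), crit
    return best
```

On noisy data from the piecewise linear function |x - 0.5|, the criterion recovers the smallest adequate model, two linear pieces, rather than the largest partition that merely minimizes the residual sum of squares.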
Runze Li - One of the best experts on this subject based on the ideXlab platform.
-
new local estimation procedure for a non-parametric Regression Function for longitudinal data
Journal of the Royal Statistical Society, Series B (Statistical Methodology), 2013
Co-Authors: Runze Li
Abstract: This paper develops a new estimation procedure for nonparametric Regression Functions for clustered or longitudinal data. We propose to use Cholesky decomposition and profile least squares techniques to estimate the correlation structure and Regression Function simultaneously. We further prove that the proposed estimator is as asymptotically efficient as if the covariance matrix were known. A Monte Carlo simulation study is conducted to examine the finite sample performance of the proposed procedure, and to compare the proposed procedure with the existing ones. Based on our empirical studies, the newly proposed procedure works better than the naive local linear Regression with working independence error structure, and the efficiency gain can be achieved in moderate-sized samples. Our numerical comparison also shows that the newly proposed procedure outperforms some existing ones. A real data set application is also provided to illustrate the proposed estimation procedure.
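The "naive local linear Regression with working independence" baseline mentioned above is a standard estimator and can be sketched directly. This is the baseline, not the paper's Cholesky/profile-least-squares procedure; I assume a Gaussian kernel, and the function names are my own.

```python
import numpy as np

def gaussian_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def local_linear(x0, x, y, h):
    """Working-independence local linear estimate of m(x0).

    Solves a kernel-weighted least-squares line fit centered at x0 and
    returns its intercept, ignoring any within-subject correlation.
    """
    u = x - x0
    w = gaussian_kernel(u / h)
    X = np.column_stack([np.ones_like(u), u])
    A = X.T @ (X * w[:, None])  # X^T W X
    b = X.T @ (w * y)           # X^T W y
    return float(np.linalg.solve(A, b)[0])
```

The paper's point is that reweighting this fit by an estimated within-cluster covariance, rather than the identity used here, recovers the efficiency lost by ignoring the correlation.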