Coverage Error

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies


The experts below are selected from a list of 171 experts worldwide, ranked by the ideXlab platform.

Max H Farrell - One of the best experts on this subject based on the ideXlab platform.

  • nprobust: Nonparametric Kernel-Based Estimation and Robust Bias-Corrected Inference
    Journal of Statistical Software, 2019
    Co-Authors: Sebastian Calonico, Matias D. Cattaneo, Max H Farrell
    Abstract:

    Nonparametric kernel density and local polynomial regression estimators are very popular in Statistics, Economics, and many other disciplines. They are routinely employed in applied work, either as part of the main empirical analysis or as a preliminary ingredient entering some other estimation or inference procedure. This article describes the main methodological and numerical features of the software package nprobust, which offers an array of estimation and inference procedures for nonparametric kernel-based density and local polynomial regression methods, implemented in both the R and Stata statistical platforms. The package includes not only classical bandwidth selection, estimation, and inference methods (Wand and Jones, 1995; Fan and Gijbels, 1996), but also other recent developments in the statistics and econometrics literatures, such as robust bias-corrected inference and coverage error optimal bandwidth selection (Calonico, Cattaneo and Farrell, 2018, 2019). Furthermore, this article also proposes a simple way of estimating optimal bandwidths in practice that always delivers the optimal mean square error convergence rate regardless of the specific evaluation point, that is, no matter whether it is implemented at a boundary or interior point. Numerical performance is illustrated using an empirical application and simulated data, where a detailed numerical comparison with other R packages is given.

  • Coverage Error Optimal Confidence Intervals
    arXiv: Econometrics, 2018
    Co-Authors: Sebastian Calonico, Matias D. Cattaneo, Max H Farrell
    Abstract:

    We propose a framework for ranking confidence interval estimators in terms of their uniform coverage accuracy. The key ingredient is the (existence and) quantification of the error in coverage of competing confidence intervals, uniformly over some empirically relevant class of data generating processes. The framework employs the "check" function to quantify coverage error loss, which allows researchers to incorporate their preferences over over- and under-coverage; confidence intervals attaining the best-possible uniform coverage error are minimax optimal. We demonstrate the usefulness of our framework with three distinct applications. First, we establish novel uniformly valid Edgeworth expansions for nonparametric local polynomial regression, offering some technical results that may be of independent interest, and use them to characterize the coverage error of, and rank, confidence interval estimators for the regression function and its derivatives. Second, we consider inference in least squares linear regression under potential misspecification, ranking interval estimators utilizing uniformly valid expansions already established in the literature. Third, we study heteroskedasticity-autocorrelation robust inference to showcase how our framework can unify existing conclusions. Several other potential applications are mentioned.

  • On the Effect of Bias Estimation on Coverage Accuracy in Nonparametric Inference
    Journal of the American Statistical Association, 2018
    Co-Authors: Sebastian Calonico, Matias D. Cattaneo, Max H Farrell
    Abstract:

    Nonparametric methods play a central role in modern empirical work. While they provide inference procedures that are more robust to parametric misspecification bias, they may be quite sensitive to tuning parameter choices. We study the effects of bias correction on confidence interval coverage in the context of kernel density and local polynomial regression estimation, and prove that bias correction can be preferred to undersmoothing for minimizing coverage error and increasing robustness to tuning parameter choice. This is achieved using a novel, yet simple, Studentization, which leads to a new way of constructing kernel-based bias-corrected confidence intervals. In addition, for practical cases, we derive coverage error optimal bandwidths and discuss easy-to-implement bandwidth selectors. For interior points, we show that the MSE-optimal bandwidth for the original point estimator (before bias correction) delivers the fastest coverage error decay rate after bias correction when second-order (equivalent) kernels are employed, but is otherwise suboptimal because it is too "large".

  • On the Effect of Bias Estimation on Coverage Accuracy in Nonparametric Inference
    arXiv: Statistics Theory, 2015
    Co-Authors: Sebastian Calonico, Matias D. Cattaneo, Max H Farrell
    Abstract:

    Nonparametric methods play a central role in modern empirical work. While they provide inference procedures that are more robust to parametric misspecification bias, they may be quite sensitive to tuning parameter choices. We study the effects of bias correction on confidence interval coverage in the context of kernel density and local polynomial regression estimation, and prove that bias correction can be preferred to undersmoothing for minimizing coverage error and increasing robustness to tuning parameter choice. This is achieved using a novel, yet simple, Studentization, which leads to a new way of constructing kernel-based bias-corrected confidence intervals. In addition, for practical cases, we derive coverage error optimal bandwidths and discuss easy-to-implement bandwidth selectors. For interior points, we show that the MSE-optimal bandwidth for the original point estimator (before bias correction) delivers the fastest coverage error decay rate after bias correction when second-order (equivalent) kernels are employed, but is otherwise suboptimal because it is too "large". Finally, for odd-degree local polynomial regression, we show that, as with point estimation, coverage error adapts to boundary points automatically when appropriate Studentization is used; however, the MSE-optimal bandwidth for the original point estimator is suboptimal. All the results are established using valid Edgeworth expansions and illustrated with simulated data. Our findings have important consequences for empirical work, as they indicate that bias-corrected confidence intervals, coupled with appropriate standard errors, have smaller coverage error and are less sensitive to tuning parameter choices in practically relevant cases where additional smoothness is available.
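The coverage-error phenomenon these abstracts study can be seen in a small simulation. The sketch below is illustrative only, not the papers' robust bias-correction construction: the bandwidths, sample size, and plug-in variance formula Var[f̂(x)] ≈ f̂(x)·R(K)/(nh) are standard textbook choices. It measures the empirical coverage of a naive 95% interval for a Gaussian kernel density estimate of the standard normal density at zero: a large bandwidth leaves smoothing bias inside the interval and it undercovers badly, while an undersmoothed bandwidth brings coverage close to nominal.

```python
import numpy as np

RK = 1.0 / (2.0 * np.sqrt(np.pi))    # roughness of the Gaussian kernel
Z95 = 1.959964                       # two-sided 95% normal quantile
TRUE_F = 1.0 / np.sqrt(2.0 * np.pi)  # N(0,1) density at x = 0

def empirical_coverage(n, h, reps=2000, seed=0):
    """Fraction of replications whose naive 95% CI covers the true density at 0."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(reps):
        data = rng.standard_normal(n)
        u = data / h                  # (x - X_i) / h at the evaluation point x = 0
        fhat = np.exp(-0.5 * u * u).sum() / (n * h * np.sqrt(2.0 * np.pi))
        se = np.sqrt(fhat * RK / (n * h))   # plug-in standard error
        hits += abs(fhat - TRUE_F) <= Z95 * se
    return hits / reps

# Large bandwidth: smoothing bias dominates and the interval undercovers.
# Undersmoothed bandwidth: less bias, coverage error far smaller.
cov_large = empirical_coverage(n=500, h=0.5)
cov_small = empirical_coverage(n=500, h=0.13)
print(f"coverage at h=0.50: {cov_large:.3f}")
print(f"coverage at h=0.13: {cov_small:.3f}")
```

The gap between the two coverage rates is the tuning-parameter sensitivity the abstracts describe; bias correction with appropriate Studentization is the papers' route to shrinking it without undersmoothing.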

Andy Weiss - One of the best experts on this subject based on the ideXlab platform.

  • Zero Banks: Coverage Error and Bias in RDD Samples Based on Hundred Banks with Listed Numbers
    Public Opinion Quarterly, 2009
    Co-Authors: John Boyle, Michael Bucuvalas, Linda Piekarski, Andy Weiss
    Abstract:

    List-assisted random digit dialing (RDD) is commonly used for sampling telephone households in the United States. The sampling frame is landline one hundred-series banks with one or more listed telephone numbers. The exclusion of banks without listed numbers from this truncated design has been justified by a 1995 study which found only 3.7 percent of working household numbers in unlisted banks with no significant demographic biases (Brick et al. 1995).

  • Zero Banks: Coverage Error and Bias in RDD Samples Based on Hundred Banks with Listed Numbers
    Social Science Research Network, 2009
    Co-Authors: John Boyle, Michael Bucuvalas, Linda Piekarski, Andy Weiss
    Abstract:

    List-assisted random digit dialing (RDD) is commonly used for sampling telephone households in the United States. The sampling frame is landline one hundred-series banks with one or more listed telephone numbers. The exclusion of banks without listed numbers from this truncated design has been justified by a 1995 study which found only 3.7 percent of working household numbers in unlisted banks with no significant demographic biases [Brick et al. 1995 (“Bias in List-Assisted Telephone Samples.” Public Opinion Quarterly 59:218-235)]. A recent study [Fahimi, Kulp, and Brick 2008 (“Bias in List-Assisted 100-Series RDD Sampling.” Survey Practice, September 28, 2008)] re-examined the coverage of landline households in listed banks. The authors concluded that “the coverage loss for designs based on the 1+ listed banks is closer to 20 percent than 4 percent” today. Such coverage error calls into question the acceptability of current RDD sampling procedures for landline households and, in combination with cell phone coverage issues, the very future of telephone surveys. The current study attempted to replicate the Fahimi study using a different sample vendor and more elaborate procedures to establish household status and characteristics of households in unlisted banks. Based on a national RDD sample of 10,000 numbers from 1+ listed banks and 27,175 numbers from unlisted banks, we found that 95 percent of landline households are still located in 1+ listed banks. However, while the coverage error from unlisted telephone banks is only slightly higher today than a decade ago, there is now a measurable bias in the excluded households toward younger, lower-income, minority, and rental households. This bias will be particularly problematic for telephone samples that also do not include cell-phone-only households.
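The frame mechanics behind zero-bank coverage error can be sketched with synthetic data. Every number below (bank count, households per bank, listing probability) is invented for illustration and is not one of the study's estimates: the point is only that screening banks on "at least one listed number" mechanically excludes every household whose bank happens to have no listings.

```python
import numpy as np

rng = np.random.default_rng(1)

n_banks = 100_000
# Hypothetical model: landline households per 100-number bank ~ Poisson,
# and each household's number is independently listed with probability 0.6.
households = rng.poisson(3.0, n_banks)
listed = rng.binomial(households, 0.6)

one_plus = listed >= 1                 # banks retained by the 1+ listed design
coverage_loss = 1.0 - households[one_plus].sum() / households.sum()
print(f"share of households in excluded zero banks: {coverage_loss:.1%}")
```

Coverage loss alone understates the problem the abstract identifies: bias arises when the excluded households differ systematically (younger, lower-income, renters), which this sketch does not model.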

Sebastian Calonico - One of the best experts on this subject based on the ideXlab platform.

  • nprobust: Nonparametric Kernel-Based Estimation and Robust Bias-Corrected Inference
    Journal of Statistical Software, 2019
    Co-Authors: Sebastian Calonico, Matias D. Cattaneo, Max H Farrell
    Abstract: (identical to the entry listed above under Max H Farrell.)

  • Coverage Error Optimal Confidence Intervals
    arXiv: Econometrics, 2018
    Co-Authors: Sebastian Calonico, Matias D. Cattaneo, Max H Farrell
    Abstract: (identical to the entry listed above under Max H Farrell.)

  • On the Effect of Bias Estimation on Coverage Accuracy in Nonparametric Inference
    Journal of the American Statistical Association, 2018
    Co-Authors: Sebastian Calonico, Matias D. Cattaneo, Max H Farrell
    Abstract: (identical to the entry listed above under Max H Farrell.)

  • On the Effect of Bias Estimation on Coverage Accuracy in Nonparametric Inference
    arXiv: Statistics Theory, 2015
    Co-Authors: Sebastian Calonico, Matias D. Cattaneo, Max H Farrell
    Abstract: (identical to the entry listed above under Max H Farrell.)
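The "check" function that the Coverage Error Optimal Confidence Intervals abstract uses to quantify coverage error loss is, in its standard quantile-regression form, ρ_τ(u) = u·(τ − 1{u < 0}). A minimal sketch of how it weights over- versus under-coverage, with τ chosen arbitrarily for illustration:

```python
def check_loss(u: float, tau: float) -> float:
    """Quantile 'check' function rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (1.0 if u < 0.0 else 0.0))

# u = empirical coverage minus nominal level. Positive u (over-coverage) is
# weighted by tau, negative u (under-coverage) by 1 - tau, so a small tau
# penalizes under-coverage more heavily; tau = 0.5 recovers symmetric loss.
loss_under = check_loss(-0.03, 0.1)   # under-coverage by 3 points, weight 0.9
loss_over = check_loss(+0.03, 0.1)    # over-coverage by 3 points, weight 0.1
print(loss_under, loss_over)
```

Ranking interval estimators by this loss, uniformly over a class of data generating processes, is what makes the framework's notion of "coverage error optimal" preference-dependent rather than one-size-fits-all.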

Sheinchung Chow - One of the best experts on this subject based on the ideXlab platform.

  • Modified Large-Sample Confidence Intervals for Linear Combinations of Variance Components: Extension, Theory, and Application
    Journal of the American Statistical Association, 2004
    Co-Authors: Yonghee Lee, Jun Shao, Sheinchung Chow
    Abstract:

    We consider the problem of setting a confidence interval or bound for a linear combination of variance components related to a multivariate normal distribution, which includes important applications such as comparing variance components and testing the bioequivalence between two drug products. The lack of an exact confidence interval for a general linear combination of variance components spurred the development of a modified large-sample (MLS) method that was shown to be superior to many other approximation methods. But the existing MLS method requires the use of independent estimators of variance components. Using a chi-squared representation of a quadratic form of a multivariate normal vector, we extend the MLS method to situations in which estimators of variance components are dependent. Using Edgeworth and Cornish–Fisher expansions, we explicitly derive the second-order asymptotic coverage error of the MLS confidence bound. Our results show that the MLS confidence bound is not second-order accurate in general.

  • Modified Large-Sample Confidence Intervals for Linear Combinations of Variance Components
    Journal of the American Statistical Association, 2004
    Co-Authors: Yonghee Lee, Jun Shao, Sheinchung Chow
    Abstract: (identical to the abstract of the preceding entry.)
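As a point of comparison for approximate intervals on variance-component combinations, the sketch below checks by Monte Carlo the coverage of the classical Satterthwaite chi-square interval for a sum of two independently estimated variance components. This is not the MLS method of the paper: all parameter values are arbitrary, and chi-square quantiles come from the Wilson–Hilferty approximation so the sketch stays numpy-only.

```python
import numpy as np

def chi2_quantile(z, df):
    """Wilson-Hilferty approximation to the chi-square quantile at normal quantile z."""
    c = 2.0 / (9.0 * df)
    return df * (1.0 - c + z * np.sqrt(c)) ** 3

def satterthwaite_coverage(sig1=1.0, sig2=2.0, df1=10, df2=20,
                           reps=20_000, z=1.959964, seed=0):
    """Empirical coverage of the nominal 95% Satterthwaite interval for sig1 + sig2."""
    rng = np.random.default_rng(seed)
    theta = sig1 + sig2
    # Independent unbiased estimators: s_i^2 ~ sigma_i^2 * chi2(df_i) / df_i.
    s1 = sig1 * rng.chisquare(df1, reps) / df1
    s2 = sig2 * rng.chisquare(df2, reps) / df2
    that = s1 + s2
    nu = that**2 / (s1**2 / df1 + s2**2 / df2)    # Satterthwaite effective df
    lo = that * nu / chi2_quantile(+z, nu)        # chi-square pivot interval
    hi = that * nu / chi2_quantile(-z, nu)
    return np.mean((lo <= theta) & (theta <= hi))

cov = satterthwaite_coverage()
print(f"empirical coverage of nominal 95% interval: {cov:.3f}")
```

For positive coefficients the Satterthwaite interval is typically close to nominal; the MLS method was developed precisely because approximations like this degrade for general linear combinations, which is what the paper's second-order coverage error expansions quantify.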

Matias D. Cattaneo - One of the best experts on this subject based on the ideXlab platform.

  • nprobust: Nonparametric Kernel-Based Estimation and Robust Bias-Corrected Inference
    Journal of Statistical Software, 2019
    Co-Authors: Sebastian Calonico, Matias D. Cattaneo, Max H Farrell
    Abstract: (identical to the entry listed above under Max H Farrell.)

  • Coverage Error Optimal Confidence Intervals
    arXiv: Econometrics, 2018
    Co-Authors: Sebastian Calonico, Matias D. Cattaneo, Max H Farrell
    Abstract: (identical to the entry listed above under Max H Farrell.)

  • On the Effect of Bias Estimation on Coverage Accuracy in Nonparametric Inference
    Journal of the American Statistical Association, 2018
    Co-Authors: Sebastian Calonico, Matias D. Cattaneo, Max H Farrell
    Abstract: (identical to the entry listed above under Max H Farrell.)

  • On the Effect of Bias Estimation on Coverage Accuracy in Nonparametric Inference
    arXiv: Statistics Theory, 2015
    Co-Authors: Sebastian Calonico, Matias D. Cattaneo, Max H Farrell
    Abstract: (identical to the entry listed above under Max H Farrell.)