Unit Nonresponse

The Experts below are selected from a list of 1,215 Experts worldwide, ranked by the ideXlab platform.

Bart Orriens - One of the best experts on this subject based on the ideXlab platform.

  • Personality as a Predictor of Unit Nonresponse in an Internet Panel
    Sociological Methods & Research, 2020
    Co-Authors: Albert Cheng, Gema Zamarro, Bart Orriens
    Abstract:

    Unit Nonresponse in panel data sets is often a source of bias. Why certain individuals attrite from longitudinal studies and how to minimize this phenomenon have been examined by researchers. However, this research has typically focused on data sets collected via telephone, postal mail, or face-to-face interviews. Moreover, this research usually focuses on using demographic characteristics such as educational attainment or income to explain variation in the incidence of Unit Nonresponse. We make two contributions to the existing literature. First, we examine the incidence of Unit Nonresponse in an Internet panel, a relatively new, and hence understudied, approach to gathering longitudinal data. Second, we hypothesize that personality traits, which typically remain unobserved and unmeasured in many data sets, affect the likelihood of Unit Nonresponse. Using data from an Internet panel that includes self-reported measures of personality in its baseline survey, we find that conscientiousness and openness to ...

  • Personality as a Predictor of Unit Nonresponse in an Internet Panel
    Sociological Methods & Research, 2018
    Co-Authors: Albert Cheng, Gema Zamarro, Bart Orriens
    Abstract:

    Unit Nonresponse in panel data sets is often a source of bias. Why certain individuals attrite from longitudinal studies and how to minimize this phenomenon have been examined by researchers. However, this research has typically focused on data sets collected via telephone, postal mail, or face-to-face interviews. Moreover, this research usually focuses on using demographic characteristics such as educational attainment or income to explain variation in the incidence of Unit Nonresponse. We make two contributions to the existing literature. First, we examine the incidence of Unit Nonresponse in an Internet panel, a relatively new, and hence understudied, approach to gathering longitudinal data. Second, we hypothesize that personality traits, which typically remain unobserved and unmeasured in many data sets, affect the likelihood of Unit Nonresponse. Using data from an Internet panel that includes self-reported measures of personality in its baseline survey, we find that conscientiousness and openness to experience predict the incidence of Unit Nonresponse in subsequent survey waves, even after controlling for cognitive ability and demographic characteristics that are usually available and used by researchers to correct for Unit Nonresponse. We also test the potential to use paradata as proxies for personality traits related to Unit Nonresponse. Although we show that these proxies are correlated with personality traits and predict Unit Nonresponse in the same way as self-reported measures of personality traits, it is also possible that they capture other idiosyncrasies related to future survey completion. Our results suggest that obtaining explicit measures of personality traits or finding better proxies for them could be valuable for more fully addressing the potential bias that may arise as a result of Unit Nonresponse.

  • Personality as a Predictor of Unit Nonresponse in Panel Data: An Analysis of an Internet-Based Survey
    SSRN Electronic Journal, 2016
    Co-Authors: Albert Cheng, Gema Zamarro, Bart Orriens
    Abstract:

    Unit Nonresponse or attrition in panel data sets is often a source of nonrandom measurement error. Why certain individuals attrite from longitudinal studies and how to minimize this phenomenon have been examined by researchers. However, this research has typically focused on data sets collected via telephone, postal mail, or face-to-face interviews. Moreover, this research usually focuses on using demographic characteristics such as educational attainment or income to explain variation in the incidence of Unit Nonresponse. We make two contributions to the existing literature. First, we examine the incidence of Unit Nonresponse in an internet panel, a relatively new, and hence understudied, approach to gathering longitudinal data. Second, we hypothesize that personality traits, which typically remain unobserved and unmeasured in many data sets, affect the likelihood of Unit Nonresponse. Using data from an internet panel that includes self-reported measures of personality in its baseline survey, we find that conscientiousness and openness to experience predict the incidence of Unit Nonresponse in subsequent survey waves, even after controlling for cognitive ability and demographic characteristics that are usually available and used by researchers to correct for panel attrition. We also test the potential to use paradata as proxies for personality traits. Although we show that these proxies predict panel attrition in the same way as self-reported measures of personality traits, it is unclear to what extent they capture particular personality traits versus other individual circumstances related to future survey completion. Our results suggest that obtaining explicit measures of personality traits or finding better proxies for them are crucial to more fully address the potential bias that may arise as a result of panel attrition.
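
To make the kind of model these papers describe concrete, the sketch below fits a logit of next-wave unit nonresponse on baseline Big Five scores, demographic controls, and a paradata proxy. It is a minimal sketch on synthetic data: the variable names (e.g., item_skip_rate) and coefficients are hypothetical placeholders, not the authors' actual data or specification.

```python
# Sketch (not the authors' specification): logit of wave-2 unit nonresponse
# on personality traits, demographics, and a paradata proxy, on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2_000

df = pd.DataFrame({
    # Baseline self-reported, standardized Big Five scores (synthetic)
    "conscientiousness": rng.normal(size=n),
    "openness": rng.normal(size=n),
    # Demographics typically available for nonresponse adjustment
    "age": rng.integers(18, 80, size=n),
    "college": rng.integers(0, 2, size=n),
    # Hypothetical paradata proxy from the baseline interview:
    # share of items the panelist skipped
    "item_skip_rate": rng.beta(1, 20, size=n),
})

# Synthetic attrition process: less conscientious, less open, and
# skip-prone panelists are more likely to drop out of the next wave.
logit_p = (-1.5
           - 0.4 * df["conscientiousness"]
           - 0.2 * df["openness"]
           + 5.0 * df["item_skip_rate"])
df["attrited"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit(
    "attrited ~ conscientiousness + openness + age + college + item_skip_rate",
    data=df,
).fit()
print(model.summary())
```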

J. N. K. Rao - One of the best experts on this subject based on the ideXlab platform.

  • Jackknife Variance Estimation under Imputation for Estimators Using Poststratification Information
    Journal of the American Statistical Association, 2000
    Co-Authors: W. Yung, J. N. K. Rao
    Abstract:

Poststratified estimators are commonly used in sample surveys to improve the efficiency of estimators and to ensure calibration to known poststrata counts. Similarly, generalized regression estimators are used to handle two or more poststratifiers with known marginal counts. In addition, weighting adjustment within weighting classes is used to handle Unit Nonresponse, and imputation within imputation classes is used to handle item Nonresponse. For the full response case, asymptotic consistency of the jackknife variance estimator under stratified multistage sampling is established using mild regularity conditions on “residuals” similar to those of Scott and Wu for ratio and regression estimation under simple random sampling. A jackknife linearization variance estimator, obtained by linearizing the jackknife variance estimator, is also given. For Unit Nonresponse, the general case of poststrata cutting across weighting classes is considered, and a jackknife variance estimator and the corresponding ... (A minimal numerical sketch of the delete-one jackknife for a poststratified estimator appears after this list.)

  • On Variance Estimation with Imputed Survey Data
    Journal of the American Statistical Association, 1996
    Co-Authors: J. N. K. Rao
    Abstract:

Unit Nonresponse and item Nonresponse both occur frequently in surveys. Unit Nonresponse is customarily handled by weighting adjustment, whereas item Nonresponse is usually treated by some form of imputation. In particular, deterministic or stochastic imputation is often used to assign values for missing item responses. We provide an account of some recent work on jackknife variance estimation based on adjusted imputed values, using only a single imputation and hence a single completed data set. We also present linearized versions of the jackknife variance estimators. We study both stratified simple random sampling and stratified multistage sampling. Existing computer programs for jackknife and linearization variance estimation can be modified to implement the proposed variance estimators without requiring the creation and permanent retention of supplemental data sets. But for secondary analyses, the completed data set must include information on response status for each item as well as on the im... (A sketch of the adjusted-imputed-values jackknife for the simplest case appears after this list.)

  • On Variance Estimation With Imputed Survey Data: Comment
    Journal of the American Statistical Association, 1996
    Co-Authors: J. N. K. Rao, D. R. Judkins, D. A. Binder, J. L. Eltinge, Donald B. Rubin, R. E. Fay
    Abstract:

    Unit Nonresponse and item Nonresponse both occur frequently in surveys. Unit Nonresponse is customarily handled by weighting adjustment, whereas item Nonresponse is usually treated by some form of imputation. In particular, deterministic or stochastic imputation is often used to assign values for missing item responses. We provide an account of some recent work on jackknife variance estimation based on adjusted imputed values, using only a single imputation and hence a single completed data set. We also present linearized versions of the jackknife variance estimators. We study both stratified simple random sampling and stratified multistage sampling. Existing computer programs for jackknife and linearization variance estimation can be modified to implement the proposed variance estimators without requiring the creation and permanent retention of supplemental data sets. But for secondary analyses, the completed data set must include information on response status for each item as well as on the imputation class.
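
As a concrete illustration of the delete-one jackknife for a poststratified estimator discussed in Yung and Rao (2000) above, the sketch below computes the jackknife variance of a poststratified mean under simple random sampling with full response. It is a minimal sketch on synthetic data, assuming known poststratum shares; the paper itself treats stratified multistage sampling, generalized regression estimators, and nonresponse adjustments.

```python
# Minimal sketch: delete-one jackknife variance for a poststratified mean
# under simple random sampling with full response and known poststratum shares.
import numpy as np

def poststratified_mean(y, g, pop_shares):
    """Poststratified estimator: sum over poststrata of W_g * ybar_g."""
    return sum(w * y[g == k].mean() for k, w in pop_shares.items())

def jackknife_variance(y, g, pop_shares):
    n = len(y)
    theta_hat = poststratified_mean(y, g, pop_shares)
    reps = np.empty(n)
    for j in range(n):                      # delete one unit at a time
        keep = np.ones(n, dtype=bool)
        keep[j] = False
        reps[j] = poststratified_mean(y[keep], g[keep], pop_shares)
    return (n - 1) / n * np.sum((reps - theta_hat) ** 2)

rng = np.random.default_rng(1)
pop_shares = {"A": 0.6, "B": 0.4}                    # known poststratum shares
g = rng.choice(["A", "B"], size=300, p=[0.5, 0.5])   # sampled shares differ
y = np.where(g == "A", rng.normal(10, 2, 300), rng.normal(14, 2, 300))

print(poststratified_mean(y, g, pop_shares), jackknife_variance(y, g, pop_shares))
```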
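
The adjusted-imputed-values idea in Rao's variance-estimation work can be sketched for the simplest case: mean imputation within a single imputation class under simple random sampling, a single completed data set, and a delete-one jackknife in which the imputed values are re-adjusted whenever a deleted unit is a respondent (and left unchanged when it is a nonrespondent). This is an illustrative sketch on synthetic data, not the estimators for stratified multistage designs developed in the paper.

```python
# Sketch of the adjusted-imputed-values jackknife: mean imputation within a
# single imputation class under simple random sampling, single completed data set.
import numpy as np

rng = np.random.default_rng(2)
n = 200
y = rng.normal(50, 10, n)
respondent = rng.random(n) < 0.7            # ~30% item nonresponse

ybar_r = y[respondent].mean()               # donor mean from respondents
y_imp = np.where(respondent, y, ybar_r)     # single completed data set
theta_hat = y_imp.mean()

reps = np.empty(n)
for j in range(n):
    keep = np.ones(n, dtype=bool)
    keep[j] = False
    if respondent[j]:
        # Re-impute as if the imputation had been redone without unit j
        adj = y[respondent & keep].mean()
    else:
        # Deleting a nonrespondent leaves the imputed values unchanged
        adj = ybar_r
    y_rep = np.where(respondent, y, adj)[keep]
    reps[j] = y_rep.mean()

v_jack = (n - 1) / n * np.sum((reps - theta_hat) ** 2)
print(theta_hat, v_jack)
```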

Ulrich Kohler - One of the best experts on this subject based on the ideXlab platform.

  • Unit Nonresponse biases in estimates of SARS-CoV-2 prevalence
    Survey research methods, 2020
    Co-Authors: Julia C. Post, Fabian Class, Ulrich Kohler
    Abstract:

Since COVID-19 became a pandemic, many studies are being conducted to get a better understanding of the disease itself and its spread. One crucial indicator is the prevalence of SARS-CoV-2 infections. Since this measure is an important foundation for political decisions, its estimate must be reliable and unbiased. This paper presents reasons for biases in prevalence estimates due to Unit Nonresponse in typical studies. Since it is difficult to avoid bias in situations with mostly unknown Nonresponse mechanisms, we propose the maximum amount of bias as one measure to assess the uncertainty due to Nonresponse. An interactive web application is presented that calculates the limits of such a conservative Unit Nonresponse confidence interval (CUNCI). (An illustrative worst-case bounds calculation in this spirit appears after this list.)

  • Surveys from inside: An assessment of Unit Nonresponse bias with internal criteria
    Survey research methods, 2007
    Co-Authors: Ulrich Kohler
    Abstract:

The article uses the so-called “internal criteria of representativeness” to assess Unit Nonresponse bias in five European comparative survey projects. It then goes on to investigate several ideas about why Unit Nonresponse bias might vary between surveys and countries. It is proposed that Unit Nonresponse bias is caused either by country characteristics or by survey methodology. The empirical evidence presented speaks more in favour of the latter than the former. Among the survey characteristics, the features that strengthen the leverage to control interviewers’ behaviour have top priority.
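
A minimal sketch of the workflow this abstract describes: compute, for each survey-country cell, the gap between a sample estimate and a benchmark known from inside the survey, then compare the size of that gap across countries and across survey-methodology features. The data, column names, and the interviewer_control flag below are hypothetical placeholders chosen to mimic the qualitative pattern reported (methodology matters more than country), not the article's actual criteria or results.

```python
# Hypothetical workflow sketch: nonresponse bias measured against an
# internally known benchmark, compared across survey and country features.
import pandas as pd

cells = pd.DataFrame({
    "survey":              ["S1", "S1", "S2", "S2", "S3", "S3"],
    "country":             ["DE", "FR", "DE", "FR", "DE", "FR"],
    "observed_share":      [0.49, 0.51, 0.44, 0.56, 0.42, 0.45],
    "internal_benchmark":  [0.50, 0.50, 0.50, 0.50, 0.50, 0.50],
    # hypothetical flag: 1 if the survey tightly controls interviewer fieldwork
    "interviewer_control": [1, 1, 0, 0, 0, 0],
})

# Bias of the sample estimate relative to the internally known benchmark
cells["abs_bias"] = (cells["observed_share"] - cells["internal_benchmark"]).abs()

# Does bias track the survey's fieldwork control more than the country?
print(cells.groupby("interviewer_control")["abs_bias"].mean())
print(cells.groupby("country")["abs_bias"].mean())
```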
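
For the Post, Class and Kohler (2020) entry above, the worst-case reasoning behind a conservative unit nonresponse interval can be illustrated as follows: nonrespondents are assumed to be either all uninfected or all infected, which bounds the prevalence in the invited sample, and a sampling margin is added at each bound. This is a hedged approximation with made-up counts; the exact CUNCI construction is implemented in the authors' web application and may differ.

```python
# Illustrative worst-case bounds in the spirit of the "maximum amount of bias":
# nonrespondents are assumed either all negative or all positive.
import math

def conservative_prevalence_interval(n_invited, n_respondents, n_positive,
                                     z=1.96):
    """Bounds on prevalence in the invited sample plus a Wald-type margin."""
    n_missing = n_invited - n_respondents
    p_low = n_positive / n_invited                  # all nonrespondents negative
    p_high = (n_positive + n_missing) / n_invited   # all nonrespondents positive
    se_low = math.sqrt(p_low * (1 - p_low) / n_invited)
    se_high = math.sqrt(p_high * (1 - p_high) / n_invited)
    return max(0.0, p_low - z * se_low), min(1.0, p_high + z * se_high)

# Made-up counts: 5,000 invited, 2,800 responded, 42 positives among respondents
print(conservative_prevalence_interval(5_000, 2_800, 42))
```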