Assessment Center - Explore the Science & Experts | ideXlab


Assessment Center

The Experts below are selected from a list of 106,320 Experts worldwide, ranked by the ideXlab platform.

Filip Lievens – One of the best experts on this subject based on the ideXlab platform.

  • Current Theory and Practice of Assessment Centers: The Importance of Trait Activation
    Oxford Handbooks Online, 2009
    Co-Authors: Filip Lievens, Liesbet De Koster, Eveline Schollaert

    Abstract:

    Assessment Centers have always had a strong link with practice. This link is so strong that the theoretical basis of the workings of an Assessment Center is sometimes questioned. This article posits that trait activation theory might be fruitfully used to explain how job-relevant candidate behavior is elicited and rated in Assessment Centers. Trait activation theory is a recent theory that focuses on the person–situation interaction to explain behavior based on responses to trait-relevant cues found in situations. These observable responses serve as the basis for behavioral ratings on dimensions used in a variety of Assessments such as performance appraisal and interviews, but also in Assessment Centers. The article starts by explaining the basic tenets behind the Assessment Center method and trait activation theory. It shows how trait activation theory might have key implications for current and future Assessment Center research. The article also provides various directions for future Assessment Center studies.

  • Predicting cross-cultural training performance: The validity of personality, cognitive ability, and dimensions measured by an Assessment Center and a behavior description interview
    Journal of Applied Psychology, 2003
    Co-Authors: Filip Lievens, Michael M Harris, Etienne Van Keer, Claire Bisqueret

    Abstract:

    This study examined the validity of a broad set of predictors for selecting European managers for a cross-cultural training program in Japan. The selection procedure assessed cognitive ability, personality, and dimensions measured by Assessment Center exercises and a behavior description interview. Results show that the factor Openness was significantly related to cross-cultural training performance, whereas cognitive ability was significantly correlated with language acquisition. The dimensions of adaptability, teamwork, and communication as measured by a group discussion exercise provided incremental variance in both criteria, beyond cognitive ability and personality. In general, these results are consistent with the literature on domestic selection, although there are some important differences.

  • Dimension and exercise variance in Assessment Center scores: A large-scale evaluation of multitrait-multimethod studies
    Journal of Applied Psychology, 2001
    Co-Authors: Filip Lievens, James M Conway

    Abstract:

    This study addresses 3 questions regarding Assessment Center construct validity: (a) Are Assessment Center ratings best thought of as reflecting dimension constructs (dimension model), exercises (exercise model), or a combination? (b) To what extent do dimensions or exercises account for variance? (c) Which design characteristics increase dimension variance? To this end, a large set of multitrait-multimethod studies (N = 34) was analyzed, showing that Assessment Center ratings were best represented (i.e., in terms of fit and admissible solutions) by a model with correlated dimensions and exercises specified as correlated uniquenesses. In this model, dimension variance equaled exercise variance. Significantly more dimension variance was found when fewer dimensions were used and when assessors were psychologists. Use of behavioral checklists, a lower dimension-exercise ratio, and similar exercises also increased dimension variance.
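    The additive dimension-plus-exercise structure described above can be sketched with a toy simulation. This is a minimal illustration with made-up variance values; the study itself fit confirmatory factor models to multitrait-multimethod matrices, not this simplified model:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000                                 # simulated candidates
sd_dim, sd_ex, sd_err = 1.0, 1.0, 1.0   # hypothetical equal variance shares

D = rng.normal(0, sd_dim, (n, 4))       # dimension (trait) standings per candidate
E = rng.normal(0, sd_ex, (n, 3))        # exercise (method) effects per candidate

def rating(d, e):
    """One assessor rating of dimension d observed in exercise e."""
    return D[:, d] + E[:, e] + rng.normal(0, sd_err, n)

# Monotrait-heteromethod: the same dimension rated in two different exercises.
r_conv = np.corrcoef(rating(0, 0), rating(0, 1))[0, 1]
# Heterotrait-monomethod: two dimensions rated in the same exercise.
r_meth = np.corrcoef(rating(0, 0), rating(1, 0))[0, 1]

# With equal variance components, both correlations approach
# sd_dim**2 / (sd_dim**2 + sd_ex**2 + sd_err**2) = 1/3.
print(f"same-dimension, cross-exercise r: {r_conv:.2f}")
print(f"same-exercise, cross-dimension r: {r_meth:.2f}")
```

    Under this toy model, equal dimension and exercise variance shows up as roughly equal convergent (same-dimension) and method (same-exercise) correlations, which is the pattern that a model with equal dimension and exercise variance describes.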

Matthew S Fleisher – One of the best experts on this subject based on the ideXlab platform.

  • Further evidence for the validity of Assessment Center dimensions: A meta-analysis of the incremental criterion-related validity of dimension ratings
    Journal of Applied Psychology, 2008
    Co-Authors: John P Meriac, David J. Woehr, Brian J Hoffman, Matthew S Fleisher

    Abstract:

    This study investigates the incremental variance in job performance explained by Assessment Center (AC) dimensions over and above personality and cognitive ability. The authors extend previous research by using meta-analysis to examine the relationships between AC dimensions, personality, cognitive ability, and job performance. The results indicate that the 7 summary AC dimensions postulated by W. Arthur, Jr., E. A. Day, T. L. McNelly, & P. S. Edens (2003) are distinguishable from popular individual difference constructs and explain a sizeable proportion of variance in job performance beyond cognitive ability and personality.
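    Incremental validity of this kind is typically quantified as the gain in R² when AC dimension ratings enter a regression after cognitive ability and personality. A minimal sketch on synthetic data follows; the coefficients are invented for illustration and are not the meta-analytic estimates:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

g = rng.normal(size=n)               # cognitive ability
p = rng.normal(size=n)               # a personality score
a = 0.3 * g + rng.normal(size=n)     # AC dimension, partly overlapping with g
perf = 0.4 * g + 0.2 * p + 0.3 * a + rng.normal(size=n)  # job performance

def r_squared(X, y):
    """R-squared of an OLS fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

r2_base = r_squared(np.column_stack([g, p]), perf)     # ability + personality
r2_full = r_squared(np.column_stack([g, p, a]), perf)  # plus the AC dimension
print(f"incremental delta-R2 from the AC dimension: {r2_full - r2_base:.3f}")
```

    The positive gain in R² is what "explain a sizeable proportion of variance in job performance beyond cognitive ability and personality" means operationally; the meta-analysis estimates that gain across studies rather than in a single sample.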

Pamela S. Edens – One of the best experts on this subject based on the ideXlab platform.

  • A meta-analysis of the criterion-related validity of Assessment Center dimensions
    Personnel Psychology, 2003
    Co-Authors: Winfred Arthur Jr., Theresa L. McNelly, Eric Anthony Day, Pamela S. Edens

    Abstract:

    We used meta-analytic procedures to investigate the criterion-related validity of Assessment Center dimension ratings. By focusing on dimension-level information, we were able to assess the extent to which specific constructs account for the criterion-related validity of Assessment Centers. From a total of 34 articles that reported dimension-level validities, we collapsed 168 Assessment Center dimension labels into an overriding set of 6 dimensions: (a) consideration/awareness of others, (b) communication, (c) drive, (d) influencing others, (e) organizing and planning, and (f) problem solving. Based on this set of 6 dimensions, we extracted 258 independent data points. Results showed a range of estimated true criterion-related validities from .25 to .39. A regression-based composite consisting of 4 out of the 6 dimensions accounted for the criterion-related validity of Assessment Center ratings and explained more variance in performance (20%) than Gaugler, Rosenthal, Thornton, and Bentson (1987) were able to explain using the overall Assessment Center rating (14%).
