ERIC Number: ED409368
Record Type: Non-Journal
Publication Date: 1995-Apr-21
Reference Count: N/A
The Robustness of the Standard Error of Summarized, Corrected Validity Coefficients to Non-Independence and Non-Normality of Primary Data.
Lambert, Richard G.; Curlette, William L.
Validity generalization meta-analysis (VG) examines the extent to which the validity of an instrument can be transported across settings. VG offers correction and summarization procedures designed in part to remove the effects of statistical artifacts from estimates of the association between criterion and predictor. A random effects model is used to estimate the variability of a distribution, "P," of population parameters, "p." When the variance of this distribution is estimated to be small, validity is said to generalize across situations. It is common for an admissible validity study to contribute more than one correlation to a meta-analysis; the original VG meta-analysis (Pearlman, Schmidt, and Hunter, 1980) located 3,368 validity coefficients in 698 studies. In addition, VG is often applied to instruments used to predict success on highly complex jobs, and such measures often have positively skewed distributions of predictor and criterion scores (Hunter, 1990). This study used Monte Carlo simulation to generate situations with nonnormal distributions and dependency between effect sizes. Specifically, this effort tested the robustness of VG, as applied with the Raju et al. (1991) standard error of corrected correlations, to violations of the assumptions of independence and normality of primary data. Results of 10,000 replications generated in each of 3,024 combinations of conditions indicate that averaging correlations at the level of the primary study greatly underestimates the variance of P, while skewness leads to overestimates of the variance of P. (Contains nine tables and seven figures.) (Author/SLD)
Publication Type: Reports - Evaluative; Speeches/Meeting Papers
Education Level: N/A
Authoring Institution: N/A
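The random-effects logic summarized in the abstract can be sketched in a few lines. The snippet below is an illustrative simulation, not the paper's procedure: it uses the simple Hunter-Schmidt "bare bones" estimate of Var(P) (observed variance of r minus expected sampling-error variance) rather than the Raju et al. (1991) corrected standard error, and the study counts, sample sizes, and parameter values are invented. It does, however, reproduce the abstract's first finding in miniature: when each study's dependent coefficients are averaged before analysis, the estimated variance of P shrinks well below its true value.

```python
import math
import random

def bare_bones_var_rho(rs, ns):
    """Hunter-Schmidt 'bare bones' estimate of Var(P): the N-weighted
    observed variance of the r's minus the expected sampling-error
    variance of a single r evaluated at the mean correlation."""
    total_n = sum(ns)
    rbar = sum(n * r for r, n in zip(rs, ns)) / total_n
    s2_r = sum(n * (r - rbar) ** 2 for r, n in zip(rs, ns)) / total_n
    se2 = sum(n * (1 - rbar ** 2) ** 2 / (n - 1) for n in ns) / total_n
    return s2_r - se2

def sample_r(rho, n, rng):
    """Draw one sample correlation via the Fisher-z approximation."""
    z = rng.gauss(math.atanh(rho), 1 / math.sqrt(n - 3))
    return math.tanh(z)

rng = random.Random(0)
n_per_study, coeffs_per_study = 100, 3
# True population correlations: mean 0.30, Var(P) = 0.01 (illustrative)
true_rhos = [rng.gauss(0.30, 0.10) for _ in range(300)]

# Each study contributes several coefficients sharing one true rho_i,
# mimicking the non-independence of multiple correlations per study.
all_rs, all_ns, avg_rs, avg_ns = [], [], [], []
for rho_i in true_rhos:
    rs_i = [sample_r(rho_i, n_per_study, rng) for _ in range(coeffs_per_study)]
    all_rs += rs_i
    all_ns += [n_per_study] * coeffs_per_study
    avg_rs.append(sum(rs_i) / coeffs_per_study)  # average at the study level
    avg_ns.append(n_per_study)

est_all = bare_bones_var_rho(all_rs, all_ns)  # coefficients entered separately
est_avg = bare_bones_var_rho(avg_rs, avg_ns)  # coefficients averaged per study
```

With these assumed settings, `est_avg` comes out markedly smaller than `est_all` (and smaller than the true 0.01), because averaging shrinks the observed variance while the sampling-error correction still subtracts the full single-coefficient term.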