ERIC Number: ED338699
Record Type: RIE
Publication Date: 1991-Jul
Pages: 4
Abstractor: N/A
Reference Count: N/A
The Case for Validity Generalization. ERIC/TM Digest.
Rafilson, Fred
An important issue in educational and employment settings is the degree to which evidence of validity obtained in one situation can be generalized to another situation without further study of validity in the new situation. Theory, procedures, and applications concerning validity generalization are addressed. Meta-analytic techniques make possible a comparative process to determine whether the criterion-related validity of a test is relatively stable or whether the test is valid only in certain situations. The criterion-related validity of a test in a local situation is usually inferred only if the findings reach statistical significance. A common procedure for conducting a meta-analysis to determine the degree to which validity findings can be generalized involves: (1) estimating the population validity by computing the mean of the observed sample validities; (2) correcting the observed validities by removing the effects of statistical artifacts; and (3) finding the variance of the corrected observed validities. If the variance of the corrected observed validities is nearly zero, then the validity generalizes and can be transported to other situations or locations. Three models currently exist for assessing validity generalization: the correlation model, the covariance model, and the regression slope model. Validity generalization studies are usually used to draw scientific conclusions about the relationships between variables and to support the use of a test in a new situation. Four references are listed. (SLD)
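The three-step procedure described in the abstract can be sketched in code. The digest itself supplies no formulas, so the sketch below uses standard "bare-bones" meta-analysis expressions (sample-size-weighted mean validity, sampling-error variance as the only statistical artifact corrected); the function name and inputs are illustrative, not taken from the source.

```python
def validity_generalization(rs, ns):
    """Bare-bones validity generalization sketch (illustrative only).

    rs: observed validity coefficients from k local studies
    ns: the matching sample sizes
    Returns (mean validity, observed variance, residual variance).
    """
    k = len(rs)
    total_n = sum(ns)

    # Step 1: estimate the population validity as the sample-size-weighted
    # mean of the observed sample validities.
    r_bar = sum(n * r for r, n in zip(rs, ns)) / total_n

    # Weighted variance of the observed validities across studies.
    var_obs = sum(n * (r - r_bar) ** 2 for r, n in zip(rs, ns)) / total_n

    # Step 2: remove the effect of a statistical artifact -- here, only
    # sampling error; other artifacts (range restriction, criterion
    # unreliability) are omitted for brevity.
    var_err = ((1 - r_bar ** 2) ** 2) * k / total_n

    # Step 3: the variance of the corrected observed validities. A value
    # near zero suggests the validity generalizes across situations.
    var_res = max(var_obs - var_err, 0.0)
    return r_bar, var_obs, var_res
```

For example, four local studies with validities 0.30, 0.25, 0.35, 0.28 and sample sizes 100, 150, 120, 200 would be analyzed with `validity_generalization([0.30, 0.25, 0.35, 0.28], [100, 150, 120, 200])`; if the returned residual variance is close to zero, the observed spread is attributable to sampling error rather than true situational differences.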
Publication Type: ERIC Publications; Reports - Evaluative; ERIC Digests in Full Text
Education Level: N/A
Audience: N/A
Language: English
Sponsor: Office of Educational Research and Improvement (ED), Washington, DC.
Authoring Institution: ERIC Clearinghouse on Tests, Measurement, and Evaluation, Washington, DC.
Identifiers: ERIC Digests; Validity Generalization