ERIC Number: ED337463
Record Type: RIE
Publication Date: 1990-May
Reference Count: N/A
Generalizing Criterion-Related Validity Evidence for Certification Requirements across Situations and Specialty Areas.
Kane, Michael T.
Developing good criterion measures of professional performance is difficult. If criterion-related validity evidence for certification requirements could not be generalized beyond the specific context in which it was obtained, gathering that evidence would probably not be worth the effort. This paper examines two possible approaches to the generalization of criterion-related evidence for certification requirements. The first, validity generalization (meta-analysis), provides a statistical technique for generalizing the results of particular studies. However, the criterion problem remains; generalizing fluff (criterion-related evidence based on weak or inappropriate criteria) merely results in a more general kind of fluff. The second approach uses substantive models as the basis for generalizing validity data; this approach offers several advantages, including greater emphasis on the nature of the criterion and, possibly, some help in developing better criteria. Four reasons for using substantive models instead of statistical meta-analysis models are discussed, and four major conclusions are considered. A 27-item list of references is included. (Author/SLD)
Descriptors: Certification, Comparative Analysis, Concurrent Validity, Criterion Referenced Tests, Generalization, Licensing Examinations (Professions), Meta Analysis, Models, Occupational Tests, Test Validity
ACT Research Report Series, P.O. Box 168, Iowa City, IA 52243.
Publication Type: Reports - Evaluative
Education Level: N/A
Authoring Institution: American Coll. Testing Program, Iowa City, IA.
Identifiers: Validity Generalization