ERIC Number: ED380498
Record Type: Non-Journal
Publication Date: 1994-Oct
Pages: 40
Abstractor: N/A
An Alternative Method for Scoring Adaptive Tests. Research Report RR-94-48.
Stocking, Martha L.
Modern applications of computerized adaptive testing (CAT) are typically grounded in item response theory (IRT; Lord, 1980). While the IRT foundations of adaptive testing provide a number of scoring approaches that may seem natural and efficient to psychometricians, these approaches can be harder for test takers, test score users, and interested regulatory institutions to comprehend. An alternative method, based on the more familiar equated number-correct score and identical to that used to score and equate many conventional tests, is explored and compared with one that relies more directly on IRT. The conclusion is that scoring adaptive tests with the familiar number-correct score, accompanied by the equating necessary to adjust for intentional differences in adaptive test difficulty, is a statistically viable, although slightly less efficient, method of adaptive test scoring. To enhance the prospects for informed public debate about adaptive testing, this more familiar approach may be preferable: public attention would then likely focus on the issue most central to adaptive testing, namely the adaptive nature of the test itself. (Contains 35 references, 2 tables, and 3 figures.) (Author)
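The contrast the abstract draws can be sketched in code. The snippet below is an illustrative toy, not the report's method: it uses hypothetical 2PL item parameters and a four-item response pattern to compute (a) an IRT-based score, a maximum-likelihood estimate of ability theta found by grid search, and (b) the familiar number-correct score, which is then placed on the theta scale by inverting the test characteristic curve, a simple stand-in for the equating adjustment the abstract describes.

```python
import math

# Hypothetical 2PL item parameters (a = discrimination, b = difficulty)
# and responses (1 = correct); illustrative values, not from the report.
items = [(1.2, -0.5), (0.8, 0.0), (1.5, 0.4), (1.0, 1.0)]
responses = [1, 1, 0, 1]

GRID = [g / 100.0 for g in range(-400, 401)]  # theta grid, -4.0 to 4.0

def p_correct(theta, a, b):
    """2PL probability of answering an item correctly."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def mle_theta(items, responses):
    """IRT-style score: maximum-likelihood theta via grid search."""
    def loglik(theta):
        ll = 0.0
        for (a, b), u in zip(items, responses):
            p = p_correct(theta, a, b)
            ll += math.log(p) if u else math.log(1.0 - p)
        return ll
    return max(GRID, key=loglik)

def tcc(theta, items):
    """Test characteristic curve: expected number-correct at theta."""
    return sum(p_correct(theta, a, b) for a, b in items)

def theta_from_number_correct(score, items):
    """Map a number-correct score to theta by inverting the TCC."""
    return min(GRID, key=lambda t: abs(tcc(t, items) - score))

number_correct = sum(responses)
print("number-correct score:", number_correct)
print("IRT MLE theta:       ", mle_theta(items, responses))
print("TCC-inverted theta:  ", theta_from_number_correct(number_correct, items))
```

Because each test taker sees a differently difficult item set in CAT, the same number-correct score maps to a different theta for each form, which is why the abstract stresses that equating must accompany number-correct scoring.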
Publication Type: Reports - Evaluative
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: Educational Testing Service, Princeton, NJ.