ERIC Number: ED478974
Record Type: Non-Journal
Publication Date: 2003-Apr
Reference Count: N/A
A Study of Equating in NAEP. NAEP Validity Studies. Working Paper Series.
Hedges, Larry V.; Vevea, Jack L.
A computer simulation study was conducted to investigate how much uncertainty equating error adds to National Assessment of Educational Progress estimates under three equating methods, while varying a number of factors that might affect equating accuracy. Data from past NAEP administrations were used to guide the simulations, and error due to equating was estimated empirically. Factors investigated were: number of items in the scale, proportion of items in the scale taken by each student, proportion of items in each administration that are common, proportion of each item "type" in each scale, proportion of each item type among common items used for equating, scale linking strategy, and change in ability from wave 1 to wave 2. Common item scale linking performed very well, even under circumstances that were far from ideal. Findings suggest that the merits of less biased measurements may outweigh the problems caused by slight adjustments to previously reported scores. It is recommended that long-term trend lines be periodically reanalyzed using methods, such as multiple-group item response theory, that can minimize such biases. (Contains 29 tables, 42 figures, and 9 references.) (SLD)
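As background on the kind of scale linking the abstract describes: a common approach to common-item linking (mean-sigma linking, one of several methods in the equating literature; the report itself compares three methods not detailed here) estimates a linear transformation from the new form's scale to the old form's scale using the common items' difficulty estimates. A minimal illustrative sketch, with hypothetical item parameters rather than NAEP data:

```python
import statistics

def mean_sigma_link(common_old, common_new):
    """Mean-sigma common-item linking.

    Given difficulty (b-parameter) estimates for the same common items
    calibrated on an old form and a new form, return the slope `a` and
    intercept `b` of the linear map  old_scale = a * new_scale + b.
    """
    mu_old = statistics.mean(common_old)
    mu_new = statistics.mean(common_new)
    sd_old = statistics.pstdev(common_old)
    sd_new = statistics.pstdev(common_new)
    a = sd_old / sd_new        # slope: ratio of scale spreads
    b = mu_old - a * mu_new    # intercept: aligns the scale means
    return a, b

# Hypothetical difficulty estimates for four common items
old_b = [-1.0, -0.2, 0.3, 1.1]   # calibration on the old form
new_b = [-0.8, 0.0, 0.5, 1.3]    # same items, new-form calibration
a, b = mean_sigma_link(old_b, new_b)
# Any value x on the new scale maps to a * x + b on the old scale.
```

This is only a sketch of one standard linking technique, included to make the abstract's "common item scale linking" concrete; the study's simulations used NAEP-based data and multiple linking strategies.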
Descriptors: Computer Simulation, Equated Scores, Error of Measurement, Item Response Theory, National Surveys, Validity
ED Pubs, P.O. Box 1398, Jessup, MD 20794-1398. Tel: 877-433-7827 (Toll Free); Web site:
Publication Type: Reports - Research
Education Level: N/A
Authoring Institution: National Center for Education Statistics (ED), Washington, DC.
Identifiers - Assessments and Surveys: National Assessment of Educational Progress