Peer reviewed
ERIC Number: EJ1168459
Record Type: Journal
Publication Date: 2017-Dec
Pages: 13
Abstractor: As Provided
ISSN: EISSN-2330-8516
An Empirical Investigation of the Potential Impact of Item Misfit on Test Scores. Research Report. ETS RR-17-60
Kim, Sooyeon; Robin, Frederic
ETS Research Report Series, Dec 2017
In this study, we examined the potential impact of item misfit on the reported scores of an admission test from the subpopulation invariance perspective. The target population of the test consisted of three major subgroups from different geographic regions. We used the logistic regression function to estimate item parameters of the operational items based on the empirical data accumulated over 3 years. A new set of item parameter estimates, derived separately from each subgroup's data, was compared to the original (i.e., operational) item parameter estimates to assess the degree of item misfit attributable to subgroup membership. Using the new set of item parameter estimates for each subgroup, we also updated the conversion tables, which were derived from the original item parameter estimates, and compared them to the original conversions to determine whether score invariance was achieved at the scaled score level. Score invariance was not fully achieved. Even so, the magnitude of the reported score differences (systematic error, or bias) caused by subgroup dependence was smaller than the standard error of measurement (random error) of the test. This study suggests a practical remedy for enhancing the level of score invariance of the test.
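The calibration step described in the abstract (fitting a logistic item response function, then refitting on a subgroup to check parameter invariance) can be sketched as follows. This is an illustrative two-parameter logistic (2PL) fit on simulated data using plain gradient ascent; it is not the operational ETS procedure, and all variable names and values are assumptions.

```python
import numpy as np

def irf_2pl(theta, a, b):
    """2PL item response function: P(correct | ability theta),
    with discrimination a and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def fit_2pl(theta, responses, lr=0.05, n_iter=3000):
    """Estimate (a, b) by maximizing the Bernoulli log-likelihood
    with simple gradient ascent (abilities treated as known)."""
    a, b = 1.0, 0.0
    for _ in range(n_iter):
        p = irf_2pl(theta, a, b)
        resid = responses - p            # dloglik/dlogit for each examinee
        a += lr * np.mean(resid * (theta - b))   # chain rule: dlogit/da
        b += lr * np.mean(resid * (-a))          # chain rule: dlogit/db
    return a, b

# Simulate one item answered by 5,000 examinees.
rng = np.random.default_rng(0)
theta = rng.normal(size=5000)            # examinee abilities
true_a, true_b = 1.2, 0.5
responses = (rng.random(5000) < irf_2pl(theta, true_a, true_b)).astype(float)

# "Operational" calibration on the full sample.
a_hat, b_hat = fit_2pl(theta, responses)

# Subgroup recalibration: refit on a subset and compare, as the
# study does across geographic subgroups (here a hypothetical split).
sub = theta > 0
a_sub, b_sub = fit_2pl(theta[sub], responses[sub])
misfit = abs(a_sub - a_hat) + abs(b_sub - b_hat)
```

In the study's framing, a large gap between the pooled and subgroup estimates signals item misfit, which is then propagated to the score conversion tables to see whether the resulting scaled-score differences exceed the test's standard error of measurement.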
Educational Testing Service, Rosedale Road, MS19-R, Princeton, NJ 08541. Tel: 609-921-9000; Fax: 609-734-5410.
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A