ERIC Number: ED334212
Record Type: Non-Journal
Publication Date: 1991-Apr
Pages: 34
Abstractor: N/A
Reference Count: N/A
The Degree of Person Misfit on a Nationally Standardized Achievement Test.
Yoes, Michael E.; Ho, Kevin T.
It is demonstrated how two students with the same number-correct score, produced by dramatically different response patterns, can receive the same ability estimate. The present study investigated the incidence of person misfit in a nationally standardized achievement test constructed using the Rasch item response theory model. It was hypothesized that if guessing occurs frequently and the Rasch model is not robust enough to handle it, the percentage of examinees with aberrant response patterns (i.e., persons misfitting the model) should be high. Using student data for grades 3 through 12 from the science, reading comprehension, and spelling subtests of the spring 1988 administration of the Stanford Achievement Test (STAT) (Eighth Edition), response patterns were analyzed with three indicators of person fit (unweighted Rasch mean square fit, INFIT, and standardized likelihood). Data for 45,442 examinees on the science subtest, 47,040 on the reading comprehension subtest, and 46,332 on the spelling subtest were analyzed for both the STAT and simulated data sets. The percentages of students in the national standardization sample identified by each indicator, and by all three, as potentially misfitting the Rasch model are presented along with a baseline analysis of model-fitting simulation data. Results indicate that the percentage of students identified as misfitting the model is small. While the utility and choice of person fit indicators may be arguable, the data show the usefulness of the Rasch model for educational applications. Seven data tables and five graphs are included. (RLC)
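The phenomenon the abstract opens with, and the fit statistics it names, can be illustrated with a short sketch. In the Rasch model the raw score is a sufficient statistic for ability, so two opposite response patterns with the same number-correct receive identical ability estimates; person-fit mean squares then distinguish them. This is a minimal illustration only: the item difficulties and response patterns below are hypothetical, and the fit formulas are the standard unweighted and information-weighted mean squares, not necessarily the exact indices the study computed.

```python
import math

def prob(theta, b):
    # Rasch probability of a correct response to an item of difficulty b
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def ml_ability(responses, difficulties, iters=50):
    # Newton-Raphson maximum-likelihood ability estimate; only the
    # raw score sum(responses) enters the estimating equation
    raw = sum(responses)
    theta = 0.0
    for _ in range(iters):
        ps = [prob(theta, b) for b in difficulties]
        grad = raw - sum(ps)                      # score residual
        info = sum(p * (1 - p) for p in ps)       # test information
        theta += grad / info
    return theta

def infit(responses, difficulties, theta):
    # information-weighted mean square (INFIT)
    ps = [prob(theta, b) for b in difficulties]
    num = sum((x - p) ** 2 for x, p in zip(responses, ps))
    den = sum(p * (1 - p) for p in ps)
    return num / den

def outfit(responses, difficulties, theta):
    # unweighted mean square fit (OUTFIT)
    ps = [prob(theta, b) for b in difficulties]
    return sum((x - p) ** 2 / (p * (1 - p))
               for x, p in zip(responses, ps)) / len(ps)

# hypothetical 10-item test, difficulties ordered easy -> hard
b = [-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0, 2.5]
# expected pattern: the five easiest items answered correctly
expected = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
# aberrant pattern: same raw score of 5, but the easy items are
# missed and the hardest items are "guessed" correctly
aberrant = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]

t1 = ml_ability(expected, b)
t2 = ml_ability(aberrant, b)
# identical ability estimates, but mean squares near 1 for the
# expected pattern versus well above 1 for the aberrant one
print(round(t1, 3), round(t2, 3))
print(round(infit(expected, b, t1), 2), round(infit(aberrant, b, t2), 2))
```

Mean squares have expectation near 1 under model fit; values well above 1 flag aberrant patterns such as lucky guessing, which is exactly what the study's indicators screen for.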
Publication Type: Reports - Research; Speeches/Meeting Papers
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Assessments and Surveys: Stanford Achievement Tests