ERIC Number: ED319800
Record Type: Non-Journal
Publication Date: 1990-Apr
Pages: 24
Abstractor: N/A
ISBN: N/A
ISSN: N/A
EISSN: N/A
Increasing Score Reliability with Item-Pattern Scoring: An Empirical Study in Several Score Metrics.
Yen, Wendy M.; Candell, Gregory L.
Reliabilities are compared for two types of test score data: number-correct scores and item response patterns. Item-pattern scoring using three-parameter item response theory takes into account both how many and which items a student answers correctly, and is therefore expected theoretically to yield greater reliability than number-correct scoring. Empirical reliabilities of scores based on the two types of data were compared within each of three score metrics: (1) scale score; (2) number correct; and (3) grade equivalent. The reliabilities were based on at least 900 students in grades 4 through 8, in five content areas (reading comprehension, language expression, spelling, mathematics concepts and applications, and science), at 10 testing times, for 50 replications in each score metric. Item-pattern data produced more reliable scores than did number-correct data in 49 of the replications in the scale score metric, 50 in the number-correct metric, and 49 in the grade-equivalent metric. The increases in reliability produced by item-pattern scoring were equivalent, on average, to those expected from a 15% to 20% increase in test length. Eight data tables and three figures are included. (SLD)
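The contrast between the two scoring approaches can be illustrated with a minimal sketch, not the authors' procedure: under the three-parameter logistic (3PL) model, item-pattern scoring estimates ability from the likelihood of the full response pattern, so two examinees with the same number-correct score but different patterns can receive different scores. The item parameters, the 1.7 scaling constant, and the grid-search estimator below are illustrative assumptions.

```python
import numpy as np

def p_3pl(theta, a, b, c):
    """3PL probability of a correct response at ability theta."""
    return c + (1.0 - c) / (1.0 + np.exp(-1.7 * a * (theta - b)))

def number_correct(responses):
    """Number-correct score: counts how many items are right, ignores which."""
    return int(np.sum(responses))

def item_pattern_theta(responses, a, b, c, grid=np.linspace(-4, 4, 801)):
    """Item-pattern score: maximum-likelihood ability estimate that uses
    which items were answered correctly, not just how many."""
    p = p_3pl(grid[:, None], a, b, c)                      # shape (grid, items)
    loglik = responses * np.log(p) + (1 - responses) * np.log(1 - p)
    return grid[np.argmax(loglik.sum(axis=1))]

# Illustrative 5-item test (hypothetical parameters).
a = np.array([1.2, 0.8, 1.5, 1.0, 0.9])    # discrimination
b = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])  # difficulty
c = np.array([0.2, 0.2, 0.2, 0.2, 0.2])    # lower asymptote (guessing)

easy_right = np.array([1, 1, 1, 0, 0])     # answered the easier items correctly
hard_right = np.array([0, 0, 1, 1, 1])     # answered the harder items correctly

# Same number-correct score, different item-pattern scores.
print(number_correct(easy_right), number_correct(hard_right))
print(item_pattern_theta(easy_right, a, b, c),
      item_pattern_theta(hard_right, a, b, c))
```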
Publication Type: Reports - Research; Speeches/Meeting Papers
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A