ERIC Number: ED382650
Record Type: Non-Journal
Publication Date: 1994-Mar
Pages: 26
Abstractor: N/A
Reference Count: N/A
Models for Scoring Missing Responses to Multiple-Choice Items. Program Statistics Research Technical Report No. 94-1.
Longford, Nicholas T.
This study is a critical evaluation of the rules for coding and scoring missing responses to multiple-choice items in educational tests. The focus is on tests in which the test-takers have little or no motivation; in such tests, omitting and not reaching items (as classified by the currently adopted operational rules) are quite frequent. Data from the 1991 National Assessment of Educational Progress (NAEP) Reading Assessment of 17-year-olds are used in analyses and illustrative examples. Alternative scoring rules based on hypothesized behavior of the test-takers are proposed. The approach for incorporating information about missing responses relies on a model relating the knowledge categories (knows or does not know) to the response categories (correct, incorrect, omitted, not reached, multiple). A computational algorithm is described that requires no new technology to be developed. Two tables and two figures describe the scoring approach. (Contains 2 references.) (Author/SLD)
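The abstract describes a model relating a latent knowledge state to five observed response categories. A minimal sketch of that kind of latent-class scoring model is given below; the conditional probabilities, function names, and prior are illustrative assumptions for exposition, not values or methods taken from the report itself.

```python
# Hypothetical sketch: a latent knowledge state ("knows" / "does not know")
# emits one of five observed response categories, and a response is scored
# by the posterior probability of knowing.  All probabilities are assumed
# numbers for illustration, not estimates from the report.

CATEGORIES = ["correct", "incorrect", "omitted", "not_reached", "multiple"]

# Assumed P(category | knows) and P(category | does not know).
P_GIVEN_KNOW = {"correct": 0.90, "incorrect": 0.02, "omitted": 0.04,
                "not_reached": 0.03, "multiple": 0.01}
P_GIVEN_NOT = {"correct": 0.20, "incorrect": 0.50, "omitted": 0.15,
               "not_reached": 0.10, "multiple": 0.05}


def posterior_know(category, prior_know):
    """P(knows | observed category) by Bayes' rule."""
    num = P_GIVEN_KNOW[category] * prior_know
    den = num + P_GIVEN_NOT[category] * (1.0 - prior_know)
    return num / den


def expected_score(responses, prior_know=0.5):
    """Score a response vector as the sum of posterior knowledge
    probabilities, so omitted or not-reached items contribute partial
    credit rather than being forced to 0 or a fixed fraction."""
    return sum(posterior_know(c, prior_know) for c in responses)
```

Under this sketch, an omitted response yields a small but nonzero posterior probability of knowing, which is one way the information in missing responses could be incorporated into a score.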
Publication Type: Reports - Evaluative
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: Educational Testing Service, Princeton, NJ.
Identifiers - Assessments and Surveys: National Assessment of Educational Progress