ERIC Number: ED349319
Record Type: Non-Journal
Publication Date: 1991-Apr
Reference Count: N/A
Quantitative Comparisons of Difficulty, Discrimination and Reliability of Machine-Scored Completion Items and Tests (in the MDT Un-Cued Answer-Bank Format) in Contrast with Statistics from Comparable Multiple Choice Questions: The First Round of Results.
Anderson, Paul S.; Hyers, Albert D.
Three descriptive statistics (difficulty, discrimination, and reliability) of multiple-choice (MC) test items were compared to those of a new (1980s) format of machine-scored questions. The new method, answer-bank multi-digit testing (MDT), uses alphabetized lists of up to 1,000 alternatives and approximates the completion style of assessment items. Five data sets were analyzed, covering more than 500 postsecondary students in total. Each student answered between 13 and 30 pairs (110 pairs in total) of identical questions in both MC and MDT formats. Quantitative measures, including correlations and tests of the significance of differences in means, reveal that, in comparison with MC questions, MDT items are: (1) more difficult; (2) consistently better discriminators (higher validity); and (3) consistently greater contributors to test reliability. The evidence firmly indicates that the MDT answer-bank response format is superior on these quantitative measures to the MC format for questions where either could be used appropriately. The MDT format could be widely used in education to improve measurement without raising costs. One table and four appendixes are included. Appendixes 1 and 2 provide background material and a description of the data sets. Appendixes 3 and 4 contain 15 tables of statistics describing the data sets. (Author/SLD)
Publication Type: Reports - Research; Speeches/Meeting Papers
Education Level: N/A
Authoring Institution: N/A