ERIC Number: EJ928237
Record Type: Journal
Publication Date: 2011
Abstractor: As Provided
Reference Count: 29
Do Examinees Understand Score Reports for Alternate Methods of Scoring Computer Based Tests?
Whittaker, Tiffany A.; Williams, Natasha J.; Dodd, Barbara G.
Educational Assessment, v16 n2 p69-89 2011
This study assessed the interpretability of scaled scores based on either number correct (NC) scoring for a paper-and-pencil test or one of two methods of scoring computer-based tests: an item pattern (IP) scoring method and a method based on equated NC scoring. The equated NC scoring method for computer-based tests was proposed as an alternative to item response theory (IRT) IP scoring to improve consumers' understanding of reported scores. Results indicated that a test taker's performance level and the method used to report scores may adversely affect participants' correct interpretation of test scores. The IP score report made it difficult for participants to interpret the reported scores. However, the equated NC scoring method did not yield score reports that were any easier to understand than those based on the IP scoring method. (Contains 9 tables.)
Descriptors: Computer Assisted Testing, Scoring, Test Interpretation, Equated Scores, Item Response Theory, Undergraduate Students
Routledge. Available from: Taylor & Francis, Ltd. 325 Chestnut Street Suite 800, Philadelphia, PA 19106. Tel: 800-354-1420; Fax: 215-625-2940; Web site: http://www.tandf.co.uk/journals
Publication Type: Journal Articles; Reports - Research; Tests/Questionnaires
Education Level: Higher Education; Postsecondary Education
Authoring Institution: N/A