Peer reviewed
ERIC Number: EJ928237
Record Type: Journal
Publication Date: 2011
Pages: 21
Abstractor: As Provided
ISBN: N/A
ISSN: ISSN-1062-7197
EISSN: N/A
Do Examinees Understand Score Reports for Alternate Methods of Scoring Computer Based Tests?
Whittaker, Tiffany A.; Williams, Natasha J.; Dodd, Barbara G.
Educational Assessment, v16 n2 p69-89 2011
This study assessed the interpretability of scaled scores based on either number correct (NC) scoring for a paper-and-pencil test or one of two methods of scoring computer-based tests: an item pattern (IP) scoring method and a method based on equated NC scoring. The equated NC scoring method for computer-based tests was proposed as an alternative to item response theory (IRT) IP scoring to improve consumers' understanding of reported scores. Results indicated that a test taker's performance level and the method used to report scores may adversely affect participants' correct interpretation of test scores. The IP score report made it difficult for participants to interpret the reported scores. The equated NC scoring method did not yield score reports that were any easier to understand than those from the IP scoring method. (Contains 9 tables.)
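For readers unfamiliar with the two scoring approaches named in the abstract, the short sketch below illustrates the general distinction between number-correct scoring and IRT-based item-pattern scoring. It is not taken from the article: the 2PL model, the item parameters, and the response pattern are illustrative assumptions, and the article's actual scoring and equating procedures may differ.

# Minimal sketch (assumptions, not the article's method): number-correct (NC)
# scoring vs. item-pattern (IP) scoring under a hypothetical 2PL IRT model.
import numpy as np

# Hypothetical 2PL item parameters: discrimination (a) and difficulty (b)
a = np.array([1.2, 0.8, 1.5, 1.0, 0.9])
b = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])

# One examinee's responses (1 = correct, 0 = incorrect)
responses = np.array([1, 1, 0, 1, 0])

# NC score: simply the count of correct responses
nc_score = responses.sum()

# IP score: ability (theta) estimate that best explains the full response
# pattern, found here by a grid search over the 2PL log-likelihood
thetas = np.linspace(-4, 4, 801)
p = 1.0 / (1.0 + np.exp(-a * (thetas[:, None] - b)))  # P(correct | theta)
log_lik = (responses * np.log(p) + (1 - responses) * np.log(1 - p)).sum(axis=1)
theta_hat = thetas[np.argmax(log_lik)]

print(f"NC score: {nc_score} of {len(responses)}")
print(f"IP (maximum-likelihood theta) estimate: {theta_hat:.2f}")

The point of the contrast: an NC score depends only on how many items were answered correctly, whereas an IP score depends on which items were answered correctly, which is one reason the article examines whether examinees can correctly interpret reports based on each method.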
Routledge. Available from: Taylor & Francis, Ltd. 325 Chestnut Street Suite 800, Philadelphia, PA 19106. Tel: 800-354-1420; Fax: 215-625-2940; Web site: http://www.tandf.co.uk/journals
Publication Type: Journal Articles; Reports - Research; Tests/Questionnaires
Education Level: Higher Education; Postsecondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A