ERIC Number: ED581626
Record Type: Non-Journal
Publication Date: 2018-Mar
Abstractor: As Provided
Comparability of Computer-Based and Paper-Based Science Assessments
Herrmann-Abell, Cari F.; Hardcastle, Joseph; DeBoer, George E.
Grantee Submission, Paper presented at the Annual International NARST Conference (Atlanta, GA, Mar 10-13, 2018)
We compared students' performance on a paper-based test (PBT) and three computer-based tests (CBTs). The three computer-based tests used different test navigation and answer selection features, allowing us to examine how these features affect student performance. The study sample consisted of 9,698 fourth through twelfth grade students from across the U.S. who were randomly assigned to take a test in one of the four modes. The CBT modes differed in whether students could skip questions and move freely through the test, and in whether students could click directly on an answer choice or had to click a radio button at the bottom of the screen. Rasch analysis was used to estimate item difficulties and student performance levels. Student performance level was then used as an outcome in hierarchical linear models to estimate mode effects. We found that student performance was unaffected by whether the test was paper-based or computer-based. A comparison of student performance on the three CBTs indicated that restricting test navigation did not affect student performance, but allowing students to select an answer choice by clicking directly on it improved student performance. Our findings show that CBTs can be considered equivalent to PBTs, and the results can also inform best practices for the design of other CBTs.
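The Rasch analysis mentioned in the abstract is based on the dichotomous Rasch model, in which the probability of a correct response depends only on the difference between a student's ability and an item's difficulty. A minimal illustrative sketch of that item response function follows; the function name and values are hypothetical and not taken from the paper:

```python
import math

def rasch_probability(theta, b):
    """Dichotomous Rasch model item response function:
    P(X = 1 | theta, b) = exp(theta - b) / (1 + exp(theta - b)),
    where theta is the student's ability estimate and b is the
    item's difficulty, both on the same logit scale."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals difficulty, the probability of success is 0.5;
# higher ability relative to difficulty raises the probability.
print(rasch_probability(1.2, 1.2))  # -> 0.5
print(rasch_probability(2.0, 0.0) > 0.5)  # -> True
```

In practice, studies like this one fit the model to the full response matrix with specialized software to obtain the item difficulties and person measures jointly; the sketch above only shows the functional form those estimates satisfy.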
Publication Type: Speeches/Meeting Papers; Reports - Research
Education Level: Elementary Secondary Education; Grade 4; Grade 5; Grade 6; Grade 7; Grade 8; Grade 9; Grade 10; Grade 11; Grade 12
Sponsor: Institute of Education Sciences (ED)
Authoring Institution: N/A
IES Funded: Yes
Grant or Contract Numbers: R305A120138