ERIC Number: ED442836
Record Type: Non-Journal
Publication Date: 2000-Apr-26
Pages: 40
Abstractor: N/A
Reference Count: N/A
An Investigation of the Cognitive Equivalence of Computerized and Paper-and-Pencil Reading Comprehension Test Items.
Kobrin, Jennifer L.
The comparability of computerized and paper-and-pencil tests was examined from a cognitive perspective, using verbal protocols, rather than psychometric methods, as the primary mode of inquiry. Reading comprehension items from the Graduate Record Examinations were completed by 48 college juniors and seniors; half took the computerized test first followed by the paper-and-pencil version, and half took the paper-and-pencil test before the computerized test. Participants were asked to think aloud as they answered the test questions. The verbal protocols were transcribed and coded for interpretation. There was a greater frequency of reading comprehension utterances during the paper-and-pencil test, but these were largely accounted for by the use of physical aids to identify important information in the passage. Many participants said that they felt disadvantaged during the computerized test by not being able to write on the passage and test questions. However, the frequently used strategy of marking the test did not appear to produce any cognitive benefits. There was slight evidence of a working memory load while answering the questions on the computerized tests, but overall there were few mode differences, and the magnitude of those differences was very small. Nearly all participants used the same overall test-taking strategy on both test formats. The first test administered, which was less interesting and more difficult, exposed more of the mode effects than the more interesting second test. An appendix contains a chart of coding categories at the utterance level. (Contains 10 tables and 30 references.) (SLD)
Publication Type: Reports - Research; Speeches/Meeting Papers
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Assessments and Surveys: Graduate Record Examinations