ERIC Number: ED516345
Record Type: Non-Journal
Publication Date: 2010
Pages: 124
Abstractor: As Provided
Reference Count: 0
ISBN: ISBN-978-1-1097-6061-3
ISSN: N/A
Comparability of Paper-and-Pencil and Computer-Based Cognitive and Non-Cognitive Measures in a Low-Stakes Testing Environment
Rowan, Barbara E.
ProQuest LLC, Ph.D. Dissertation, James Madison University
Computerized versions of paper-and-pencil tests (PPT) have emerged over the past few decades, and some practitioners use both formats concurrently. Computerizing a PPT, however, may not yield equivalent scores across the two administration modes. Comparability studies are required to determine whether scores are equivalent before they are treated as such; these studies support fairer testing and more valid interpretations, regardless of the administration mode used. The purpose of this study was to examine whether scores from paper-based and computer-based versions of a cognitive and a non-cognitive measure were equivalent and could be used interchangeably. Previous research on test score comparability relied on simple methodology that provided insufficient evidence of score equivalence. This study, by contrast, demonstrated a set of methodological best practices, providing a more rigorous and accurate analysis of the degree of measurement invariance across groups. The computer-based test (CBT) and PPT contained identical content and varied only in administration mode. Participants took the tests in only one format, and the administration occurred under low-stakes conditions. Confirmatory factor analyses were conducted to confirm the established factor structure for both the cognitive and the non-cognitive measures, and reliability and mean differences were examined for each subscale. Configural, metric, and scalar invariance were tested across groups for both measures. Because of its potential impact on measurement invariance, differential item functioning (DIF) was tested for the cognitive measure; items exhibiting DIF were removed from the data set, and measurement invariance across test modes was re-evaluated. Results indicate that both the cognitive and the non-cognitive measures were metric invariant (essentially tau-equivalent) across groups, and the DIF items did not affect the degree of measurement invariance found for the cognitive measure. Therefore, the same construct was measured to the same degree in both modes, but scores are not equivalent without rescaling. Measurement invariance is a localized issue; comparability must therefore be evaluated for each instrument. Practitioners cannot assume that scores obtained from the PPT and CBT will be equivalent. How the test scores are used will determine what changes must be made to tests that show less than strict measurement invariance. [The dissertation citations contained here are published with the permission of ProQuest LLC. Further reproduction is prohibited without permission. Copies of dissertations may be obtained by telephone at 1-800-521-0600. Web page: http://www.proquest.com/en-US/products/dissertations/individuals.shtml.]
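The invariance sequence described in the abstract (configural, then metric, then scalar) is typically evaluated by comparing increasingly constrained nested CFA models. The sketch below is a minimal illustration of that nested-model comparison using chi-square difference tests; the fit statistics are hypothetical placeholders, not values from the dissertation, and in practice they would come from the output of SEM software.

```python
from scipy.stats import chi2

# Hypothetical fit statistics for nested multi-group CFA models
# (one group per administration mode). Values are illustrative only.
models = {
    "configural": {"chisq": 412.3, "df": 180},  # same structure, all parameters free across groups
    "metric":     {"chisq": 428.9, "df": 196},  # factor loadings constrained equal across groups
    "scalar":     {"chisq": 471.6, "df": 212},  # loadings and intercepts constrained equal
}

def chisq_difference_test(less_constrained, more_constrained):
    """Chi-square difference (likelihood-ratio) test between two nested models."""
    d_chisq = more_constrained["chisq"] - less_constrained["chisq"]
    d_df = more_constrained["df"] - less_constrained["df"]
    p = chi2.sf(d_chisq, d_df)
    return d_chisq, d_df, p

# Step through the invariance hierarchy; a non-significant difference means the
# added equality constraints do not significantly worsen fit, so that level holds.
for weaker, stronger in [("configural", "metric"), ("metric", "scalar")]:
    d_chisq, d_df, p = chisq_difference_test(models[weaker], models[stronger])
    verdict = "invariance holds" if p > 0.05 else "invariance rejected"
    print(f"{weaker} -> {stronger}: d_chisq={d_chisq:.1f}, d_df={d_df}, p={p:.3f} ({verdict})")
```

With these placeholder numbers the metric step holds while the scalar step is rejected, which mirrors the pattern of findings reported in the abstract: the same constructs are measured across modes, but scores cannot be treated as interchangeable without rescaling.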
ProQuest LLC. 789 East Eisenhower Parkway, P.O. Box 1346, Ann Arbor, MI 48106. Tel: 800-521-0600; Web site: http://www.proquest.com/en-US/products/dissertations/individuals.shtml
Publication Type: Dissertations/Theses - Doctoral Dissertations
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A