ERIC Number: EJ1092254
Record Type: Journal
Publication Date: 2007
Abstractor: As Provided
Reference Count: 29
Administering Defining Issues Test Online: Do Response Modes Matter?
Xu, Yuejin; Iran-Nejad, Asghar; Thoma, Stephen J.
Journal of Interactive Online Learning, v6 n1 p10-27 Spr 2007
The purpose of the study was to determine the comparability of an online version of the Defining Issues Test 2 (DIT2) to the original paper-and-pencil version. This study employed methods from both Classical Test Theory (CTT) and Item Response Theory (IRT). Findings from the CTT analyses supported the reliability and discriminant validity of both versions. Findings from the IRT analyses confirmed that both versions showed comparable goodness of fit to the Rasch model. However, the IRT analyses also found some variation in item difficulties and in the patterns of item functioning between the two versions. The study further examined students' satisfaction with the DIT2-taking experience in the two survey response modes. The ANOVA results indicated that the online version of the DIT2 was comparable to the paper-and-pencil version in terms of ease of use.
Descriptors: Computer Assisted Testing, Test Format, Comparative Analysis, Test Theory, Item Response Theory, Test Reliability, Test Validity, Statistical Analysis, Test Items, Difficulty Level, Student Attitudes, Student Characteristics, Graduate Students, Undergraduate Students, Correlation, Likert Scales, Surveys
University of Alabama. 152 Rose Administration, P.O. Box 870104, Tuscaloosa, AL 35487. Web site: http://www.ncolr.org/jiol
Publication Type: Journal Articles; Reports - Research
Education Level: Higher Education; Postsecondary Education
Authoring Institution: N/A
Identifiers - Assessments and Surveys: Defining Issues Test