ERIC Number: EJ1043691
Record Type: Journal
Publication Date: 2014
Abstractor: As Provided
Comparison of Integrated Testlet and Constructed-Response Question Formats
Slepkov, Aaron D.; Shiell, Ralph C.
Physical Review Special Topics - Physics Education Research, v10 n2 p020120-1-020120-15 Jul-Dec 2014
Constructed-response (CR) questions are a mainstay of introductory physics textbooks and exams. However, because of the time, cost, and scoring reliability constraints associated with this format, CR questions are increasingly being replaced by multiple-choice (MC) questions in formal exams. The integrated testlet (IT) is a recently developed question structure designed to provide a proxy of the pedagogical advantages of CR questions while procedurally functioning as a set of MC questions. ITs utilize an answer-until-correct response format that provides immediate confirmatory or corrective feedback, and thus allow not only the granting of partial credit in cases of initially incorrect reasoning but also the building of cumulative question structures. Here, we report on a study that directly compares the functionality of ITs and CR questions in introductory physics exams. To do this, CR questions were converted to concept-equivalent ITs, and both sets of questions were deployed in midterm and final exams. We find that both question types provide adequate discrimination between stronger and weaker students, with CR questions discriminating slightly better than the ITs. There is some indication that any difference in discriminatory power may result from the baseline score for guessing that is inherent in MC testing. Meanwhile, an analysis of interrater scoring of the CR questions raises serious concerns about the reliability of the granting of partial credit when this traditional assessment technique is used in a realistic (but nonoptimized) setting. Furthermore, we show evidence that partial credit is granted in a valid manner in the ITs. Thus, together with consideration of the vastly reduced costs of administering IT-based examinations compared to CR-based examinations, our findings indicate that ITs are viable replacements for CR questions in formal examinations where it is desirable both to assess concept integration and to reward partial knowledge, while scoring examinations efficiently.
Descriptors: Science Tests, Physics, Responses, Multiple Choice Tests, Test Items, Introductory Courses, Discriminant Analysis, Interrater Reliability, Comparative Testing, Scoring, Psychometrics, Test Reliability, Test Validity, Credits, Foreign Countries, College Students
American Physical Society. One Physics Ellipse 4th Floor, College Park, MD 20740-3844. Tel: 301-209-3200; Fax: 301-209-0865; e-mail: firstname.lastname@example.org; Web site: http://prst-per.aps.org
Publication Type: Journal Articles; Reports - Research
Education Level: Higher Education; Postsecondary Education
Authoring Institution: N/A
Identifiers - Location: Canada
Grant or Contract Numbers: N/A