Peer reviewed
ERIC Number: EJ920494
Record Type: Journal
Publication Date: 2010-Nov
Pages: 5
Abstractor: As Provided
Reference Count: 24
ISSN: ISSN-0021-9584
Using Item Response Theory to Assess Changes in Student Performance Based on Changes in Question Wording
Schurmeier, Kimberly D.; Atwood, Charles H.; Shepler, Carrie G.; Lautenschlager, Gary J.
Journal of Chemical Education, v87 n11 p1268-1272 Nov 2010
Five years of longitudinal data from general chemistry student assessments at the University of Georgia have been analyzed using item response theory (IRT). Our analysis indicates that minor changes in question wording on exams can produce significant differences in student performance on assessment questions. The analysis encompasses data from over 6100 students, yielding very small statistical uncertainty. IRT provided new insight into student performance on our assessments that is also important to the chemical education community. In this paper, IRT, in conjunction with computerized testing, shows how nuances in question wording affect student performance on assessments. (Contains 1 table and 3 figures.)
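The abstract's core idea can be illustrated with the standard two-parameter logistic (2PL) IRT model, in which the probability of a correct response depends on student ability and item difficulty. This is a minimal sketch, not code from the paper; the discrimination and difficulty values are hypothetical, and a wording change is modeled here simply as a shift in the item's difficulty parameter:

```python
import math

def icc_2pl(theta, a, b):
    """Two-parameter logistic item characteristic curve:
    probability that a student with ability theta answers correctly,
    given item discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical items: same concept, slightly different wording,
# modeled as a small shift in the IRT difficulty parameter b.
original = icc_2pl(theta=0.0, a=1.2, b=0.0)
reworded = icc_2pl(theta=0.0, a=1.2, b=0.4)

print(f"P(correct), original wording: {original:.3f}")  # 0.500
print(f"P(correct), reworded item:    {reworded:.3f}")  # 0.382
```

Even a modest shift in estimated difficulty changes an average student's success probability noticeably, which is the kind of wording-driven effect the study reports detecting across its question pool.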
Division of Chemical Education, Inc and ACS Publications Division of the American Chemical Society. 1155 Sixteenth Street NW, Washington, DC 20036. Tel: 800-227-5558; Tel: 202-872-4600.
Publication Type: Journal Articles; Reports - Descriptive
Education Level: Higher Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Location: Georgia