ERIC Number: EJ1037003
Record Type: Journal
Publication Date: 2014
Pages: 9
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: 1554-9178
Extending Item Response Theory to Online Homework
Kortemeyer, Gerd
Physical Review Special Topics - Physics Education Research, v10 n1 p010118-1-010118-9 Jan-Jun 2014
Item response theory (IRT) is becoming an increasingly important tool for analyzing "big data" gathered from online educational venues. However, the mechanism was originally developed for traditional exam settings, and several of its assumptions are violated when it is deployed in the online realm. For a large-enrollment physics course for scientists and engineers, the study compares outcomes from IRT analyses of exam and homework data, and then investigates the effects of each confounding factor introduced in the online realm. It is found that IRT yields the correct trends for learner ability and meaningful item parameters, yet overall agreement with exam data is moderate. It is also found that learner ability and item discrimination are robust over a wide range with respect to model assumptions and introduced noise. Item difficulty is also robust, but over a narrower range.
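To illustrate the kind of analysis the abstract describes, the following is a minimal sketch (not the paper's actual code) of a two-parameter logistic (2PL) IRT model, with a maximum-likelihood ability estimate obtained by grid search over synthetic response data. All function names, parameter values, and the grid-search approach are illustrative assumptions; production IRT analyses typically use dedicated packages and joint or marginal maximum-likelihood estimation.

```python
import numpy as np

def irt_2pl(theta, a, b):
    """Two-parameter logistic item response function: probability that a
    learner of ability theta answers correctly an item with
    discrimination a and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def estimate_ability(responses, a, b, grid=None):
    """Maximum-likelihood ability estimate by grid search: choose the
    theta on a grid that maximizes the log-likelihood of the observed
    0/1 response vector, given known item parameters a and b."""
    if grid is None:
        grid = np.linspace(-4.0, 4.0, 801)
    # Broadcast to a (grid points x items) probability matrix.
    p = irt_2pl(grid[:, None], a[None, :], b[None, :])
    loglik = (responses * np.log(p) + (1 - responses) * np.log(1 - p)).sum(axis=1)
    return grid[np.argmax(loglik)]

# Simulate one learner of known ability answering 100 items (hypothetical
# parameter ranges), then recover the ability from the responses.
rng = np.random.default_rng(0)
a = rng.uniform(0.5, 2.0, size=100)   # item discriminations
b = rng.normal(0.0, 1.0, size=100)    # item difficulties
true_theta = 1.0
responses = (rng.random(100) < irt_2pl(true_theta, a, b)).astype(int)
theta_hat = estimate_ability(responses, a, b)
```

With enough items, `theta_hat` should land close to `true_theta`; the "guessing" confound the descriptors mention would correspond to adding a lower asymptote (a 3PL model), which this sketch omits for brevity.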
Descriptors: Item Response Theory, Online Courses, Electronic Learning, Homework, Robustness (Statistics), Physics, Large Group Instruction, Test Items, Models, Computer Assisted Testing, Maximum Likelihood Statistics, Academic Ability, Problem Solving, Difficulty Level, Guessing (Tests), Duplication, Item Analysis
American Physical Society. One Physics Ellipse 4th Floor, College Park, MD 20740-3844. Tel: 301-209-3200; Fax: 301-209-0865; e-mail: assocpub@aps.org; Web site: http://prst-per.aps.org
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A