Peer reviewed
ERIC Number: EJ963552
Record Type: Journal
Publication Date: 2012-May
Pages: 22
Abstractor: As Provided
Reference Count: 39
ISSN: ISSN-0146-6216
The Performance of IRT Model Selection Methods with Mixed-Format Tests
Whittaker, Tiffany A.; Chang, Wanchen; Dodd, Barbara G.
Applied Psychological Measurement, v36 n3 p159-180 May 2012
When tests consist of multiple-choice and constructed-response items, researchers are confronted with the question of which item response theory (IRT) model combination will appropriately represent the data collected from these mixed-format tests. This simulation study examined the performance of six model selection criteria, including the likelihood ratio test, Akaike's information criterion (AIC), corrected AIC, Bayesian information criterion, Hannan and Quinn's information criterion, and consistent AIC, with respect to correct model selection among a set of three competing mixed-format IRT models (i.e., one-parameter logistic/partial credit [1PL/PC], two-parameter logistic/generalized partial credit [2PL/GPC], and three-parameter logistic/generalized partial credit [3PL/GPC]). The criteria were able to correctly select less parameterized IRT models, including the PC, 1PL, and 1PL/PC models. In contrast, the criteria were less able to correctly select more parameterized IRT models, including the GPC, 3PL, and 3PL/GPC models. Implications of the findings and recommendations are discussed. (Contains 9 figures, 3 tables, and 4 notes.)
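The information criteria named in the abstract all penalize a model's maximized log-likelihood by its parameter count, differing only in the size of the penalty; the model with the smallest criterion value is selected. The sketch below shows the standard formulas and an illustrative comparison of the three model combinations. All numbers (log-likelihoods, parameter counts, sample size) are hypothetical, not taken from the study.

```python
import math

def information_criteria(log_lik, k, n):
    """Standard formulas for the penalized-likelihood criteria compared in
    the study, given a model's maximized log-likelihood (log_lik), number
    of free parameters (k), and sample size (n)."""
    aic = -2 * log_lik + 2 * k
    aicc = aic + (2 * k * (k + 1)) / (n - k - 1)          # corrected AIC
    bic = -2 * log_lik + k * math.log(n)
    caic = -2 * log_lik + k * (math.log(n) + 1)           # consistent AIC
    hqic = -2 * log_lik + 2 * k * math.log(math.log(n))   # Hannan-Quinn
    return {"AIC": aic, "AICc": aicc, "BIC": bic, "CAIC": caic, "HQIC": hqic}

# Hypothetical fits (log-likelihood, parameter count) for the three
# competing mixed-format model combinations from the abstract:
fits = {
    "1PL/PC":  (-10450.0, 41),
    "2PL/GPC": (-10380.0, 80),
    "3PL/GPC": (-10355.0, 100),
}
n = 1000  # hypothetical number of examinees

for name, (ll, k) in fits.items():
    crits = information_criteria(ll, k, n)
    print(name, {c: round(v, 1) for c, v in crits.items()})

# For each criterion, the smallest value wins; criteria with heavier
# penalties (e.g., BIC, CAIC) can prefer a less parameterized model
# than AIC does for the same fits.
best_by_aic = min(fits, key=lambda m: information_criteria(*fits[m], n)["AIC"])
best_by_bic = min(fits, key=lambda m: information_criteria(*fits[m], n)["BIC"])
```

Note how the criteria can disagree on the same data: with these hypothetical fits, AIC's light penalty favors the more parameterized 3PL/GPC combination, while BIC's heavier log(n) penalty favors 1PL/PC.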
SAGE Publications. 2455 Teller Road, Thousand Oaks, CA 91320. Tel: 800-818-7243; Tel: 805-499-9774; Fax: 800-583-2665.
Publication Type: Journal Articles; Reports - Research
Education Level: Elementary Secondary Education; Grade 12; Grade 4; Grade 8
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Assessments and Surveys: National Assessment of Educational Progress