Peer reviewed
ERIC Number: EJ1005375
Record Type: Journal
Publication Date: 2013
Pages: 22
Abstractor: As Provided
Reference Count: 45
ISSN: ISSN-1530-5058
Assessing the Item Response Theory with Covariate (IRT-C) Procedure for Ascertaining Differential Item Functioning
Tay, Louis; Vermunt, Jeroen K.; Wang, Chun
International Journal of Testing, v13 n3 p201-222 2013
We evaluate the item response theory with covariates (IRT-C) procedure for assessing differential item functioning (DIF) without preknowledge of anchor items (Tay, Newman, & Vermunt, 2011). This procedure begins with a fully constrained baseline model, and candidate items are tested for uniform and/or nonuniform DIF using the Wald statistic. Candidate items are selected in turn based on high unconditional bivariate residual (UBVR) values. This iterative process continues until no further DIF is detected or the Bayes information criterion (BIC) increases. We expand on the procedure and examine the use of conditional bivariate residuals (CBVR) to flag items for DIF; aside from the BIC, alternative stopping criteria are also considered. Simulation results showed that the IRT-C approach for assessing DIF performed well, with the use of CBVR yielding slightly better power and Type I error rates than UBVR. Additionally, using no information criterion yielded higher power than using the BIC, although Type I error rates were generally well controlled in both cases. Across the simulation conditions, the IRT-C procedure produced results similar to the Mantel-Haenszel and MIMIC procedures. (Contains 4 tables.)
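The iterative loop the abstract describes (free the highest-residual candidate item, Wald-test it for DIF, stop when no DIF is detected or the BIC worsens) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the residual, Wald, and BIC functions below are simulated stand-ins, whereas a real IRT-C analysis would refit a latent-variable model with covariates at each step and compute these quantities from the fitted model.

```python
# Hedged sketch of the iterative DIF-flagging loop described in the abstract.
# All statistics are simulated stand-ins; items 0 and 1 play the role of
# true-DIF items so the loop has something to find.
import random

random.seed(0)

N_ITEMS = 10
ALPHA = 0.05

def bivariate_residual(item, dif_set):
    # Stand-in for the conditional bivariate residual (CBVR) of an item,
    # conditioning on items already freed for DIF. True-DIF items that are
    # not yet freed get inflated residuals; all others get noise.
    base = 5.0 if (item in (0, 1) and item not in dif_set) else 0.5
    return base + random.random()

def wald_p_value(item):
    # Stand-in for the Wald test of uniform/nonuniform DIF on one item.
    return 0.001 if item in (0, 1) else 0.5 + 0.4 * random.random()

def bic(dif_set):
    # Stand-in BIC: improves while true-DIF items are freed, worsens otherwise.
    return 1000.0 - 50.0 * len(dif_set & {0, 1}) + 20.0 * len(dif_set - {0, 1})

dif_items = set()            # items freed to have group-specific parameters
best_bic = bic(dif_items)
while True:
    # Pick the not-yet-freed item with the largest bivariate residual.
    candidates = sorted(set(range(N_ITEMS)) - dif_items,
                        key=lambda i: bivariate_residual(i, dif_items),
                        reverse=True)
    item = candidates[0]
    if wald_p_value(item) >= ALPHA:   # no further DIF detected: stop
        break
    trial_bic = bic(dif_items | {item})
    if trial_bic > best_bic:          # BIC stopping criterion: stop
        break
    dif_items.add(item)
    best_bic = trial_bic

print(sorted(dif_items))              # items flagged for DIF
```

With these stand-in functions the loop frees items 0 and 1 and then stops, since every remaining item fails the Wald test. Dropping the `trial_bic > best_bic` check corresponds to the "no information criterion" variant the abstract compares against the BIC rule.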
Routledge. Available from: Taylor & Francis, Ltd. 325 Chestnut Street Suite 800, Philadelphia, PA 19106. Tel: 800-354-1420; Fax: 215-625-2940; Web site:
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A