50 Years of ERIC
The Education Resources Information Center (ERIC) is celebrating its 50th birthday! First opened on May 15, 1964, ERIC continues a long tradition of ongoing innovation and enhancement.

Learn more about the history of ERIC here (PDF).

Showing all 4 results
Peer reviewed
Direct link
Passos, Valeria Lima; Berger, Martijn P. F.; Tan, Frans E. S. – Journal of Educational and Behavioral Statistics, 2008
During the early stage of computerized adaptive testing (CAT), item selection criteria based on Fisher's information often produce less stable latent trait estimates than the Kullback-Leibler global information criterion. Robustness against early stage instability has been reported for the D-optimality criterion in a polytomous CAT with the…
Descriptors: Computer Assisted Testing, Adaptive Testing, Evaluation Criteria, Item Analysis
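The Fisher-information item selection that the abstract contrasts with Kullback-Leibler global information can be sketched for a dichotomous two-parameter logistic (2PL) model. This is a generic illustration, not the study's polytomous setup; the item names and parameter values below are invented for the example:

```python
import math

def p_2pl(theta, a, b):
    """Probability of a correct response under a 2PL IRT model."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def fisher_info(theta, a, b):
    """Fisher information of a 2PL item at ability theta: a^2 * P * (1 - P)."""
    p = p_2pl(theta, a, b)
    return a * a * p * (1.0 - p)

def select_item(theta_hat, items):
    """Maximum-information criterion: administer the item that is most
    informative at the current point estimate of ability."""
    return max(items, key=lambda it: fisher_info(theta_hat, it["a"], it["b"]))

# Hypothetical remaining item pool:
items = [
    {"name": "easy",   "a": 1.0, "b": -1.5},
    {"name": "medium", "a": 1.2, "b":  0.0},
    {"name": "hard",   "a": 0.9, "b":  1.5},
]
print(select_item(0.1, items)["name"])  # the item with difficulty nearest the estimate wins
```

The early-stage instability the abstract mentions follows from this sketch: when `theta_hat` rests on only a few responses, the criterion maximizes information at what may be the wrong ability level.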
Peer reviewed
Holman, Rebecca; Berger, Martijn P. F. – Journal of Educational and Behavioral Statistics, 2001
Studied calibration designs that maximize the determinants of Fisher's information matrix on the item parameters for sets of polytomously scored items. Analyzed these items using a number of item response theory models. Results show that for the data and models used, a D-optimal calibration design for an answer or set of answers can reduce the…
Descriptors: Item Response Theory, Research Design, Test Items
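The D-optimality idea in the abstract, maximizing the determinant of the Fisher information matrix for the item parameters, can be sketched for a single 2PL item. The designs compared below are illustrative, not those analyzed in the study:

```python
import math

def item_info_matrix(theta, a, b):
    """2x2 Fisher information matrix for the (a, b) parameters of a 2PL
    item from one examinee at ability theta: p(1-p) * g g^T, where
    g = (theta - b, -a) is the gradient of the logit a*(theta - b)."""
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    w = p * (1.0 - p)
    d = theta - b
    return [[w * d * d, -w * a * d],
            [-w * a * d, w * a * a]]

def design_det(thetas, a, b):
    """D-optimality value: determinant of the information matrix summed
    over the abilities in the calibration sample."""
    m = [[0.0, 0.0], [0.0, 0.0]]
    for t in thetas:
        info = item_info_matrix(t, a, b)
        for i in range(2):
            for j in range(2):
                m[i][j] += info[i][j]
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

# A sample spread around the item's difficulty identifies both parameters;
# one concentrated at a single ability cannot separate them (determinant 0):
spread = [-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0]
concentrated = [0.0] * 9
print(design_det(spread, 1.0, 0.0) > design_det(concentrated, 1.0, 0.0))  # True
```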
Peer reviewed
Moerbeek, Mirjam; van Breukelen, J. P.; Berger, Martijn P. F. – Journal of Educational and Behavioral Statistics, 2000
Discusses the optimal level of randomization, the optimal allocation of units, and the budget for obtaining a certain power on a test of no treatment effect for populations with two or three levels of nesting and continuous outcomes. Focuses on the estimator of the regression coefficient associated with the treatment condition. (SLD)
Descriptors: Estimation (Mathematics), Power (Statistics), Regression (Statistics), Research Design
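The power question in the abstract can be sketched for the simplest case: a two-level design with randomization at the cluster level and a z-test of no treatment effect. The variance formula is the standard one for this design, not necessarily the paper's exact derivation, and all numbers below are illustrative:

```python
from math import sqrt
from statistics import NormalDist

def treatment_var(k, n, sigma2, tau2):
    """Variance of the treatment-effect estimator when k clusters
    (half per arm) of n units each are randomized, with within-cluster
    variance sigma2 and between-cluster variance tau2."""
    return 4.0 * (n * tau2 + sigma2) / (k * n)

def power(delta, k, n, sigma2, tau2, alpha=0.05):
    """Approximate power of the two-sided z-test of no treatment effect
    for a true effect of size delta."""
    se = sqrt(treatment_var(k, n, sigma2, tau2))
    z_crit = NormalDist().inv_cdf(1.0 - alpha / 2.0)
    return 1.0 - NormalDist().cdf(z_crit - abs(delta) / se)

# Once tau2 dominates, adding clusters raises power far more than
# enlarging the clusters themselves:
print(power(0.3, k=40, n=10, sigma2=0.8, tau2=0.2))  # baseline design
print(power(0.3, k=80, n=10, sigma2=0.8, tau2=0.2))  # double the clusters
print(power(0.3, k=40, n=40, sigma2=0.8, tau2=0.2))  # quadruple the cluster size
```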
Peer reviewed
Berger, Martijn P. F.; Veerkamp, Wim J. J. – Journal of Educational and Behavioral Statistics, 1997
Some alternative criteria for item selection in adaptive testing are proposed that take into account uncertainty in the ability estimates. A simulation study shows that the likelihood weighted information criterion is a good alternative to the maximum information criterion. Another good alternative uses a Bayesian expected a posteriori estimator.…
Descriptors: Ability, Adaptive Testing, Bayesian Statistics, Computer Assisted Testing
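The likelihood-weighted information criterion the abstract recommends can be sketched for a 2PL model: instead of evaluating Fisher information only at the point estimate of ability, the candidate item's information is integrated against the likelihood of the responses observed so far. The grid, administered items, scores, and candidate pool below are invented for the example:

```python
import math

def p_2pl(theta, a, b):
    """Probability of a correct response under a 2PL IRT model."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def fisher_info(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = p_2pl(theta, a, b)
    return a * a * p * (1.0 - p)

def likelihood(theta, answered, scores):
    """Likelihood of the observed 0/1 scores on the administered items."""
    L = 1.0
    for (a, b), x in zip(answered, scores):
        p = p_2pl(theta, a, b)
        L *= p if x else (1.0 - p)
    return L

def lw_information(item, answered, scores, grid):
    """Likelihood-weighted information: weight the candidate item's Fisher
    information by the likelihood over a grid of ability values, so early
    uncertainty in the ability estimate is taken into account."""
    a, b = item
    return sum(likelihood(t, answered, scores) * fisher_info(t, a, b) for t in grid)

grid = [i / 10.0 for i in range(-40, 41)]      # crude quadrature grid over [-4, 4]
answered = [(1.0, 0.0), (1.2, 0.5)]            # (a, b) of items already administered
scores = [1, 0]                                # observed responses to those items
pool = [(1.0, -1.0), (1.0, 0.2), (1.0, 1.5)]   # remaining candidate items
best = max(pool, key=lambda it: lw_information(it, answered, scores, grid))
print(best)
```

With only two responses the likelihood is broad, so the criterion favors items informative over the whole plausible ability range rather than at a single point, which is the robustness property the simulation study reports.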