50 Years of ERIC
The Education Resources Information Center (ERIC) is celebrating its 50th birthday! First opened on May 15, 1964, ERIC continues its long tradition of ongoing innovation and enhancement.

Learn more about the history of ERIC here.

Showing all 6 results
Peer reviewed
Direct link
Gierl, Mark J.; Lai, Hollis – Educational Measurement: Issues and Practice, 2013
Changes to the design and development of educational assessments are resulting in unprecedented demand for a large and continuous supply of content-specific test items. One way to address this growing demand is with automatic item generation (AIG). AIG is the process of using item models to generate test items with the aid of computer…
Descriptors: Educational Assessment, Test Items, Automation, Computer Assisted Testing
Peer reviewed
Direct link
Roberts, Mary Roduta; Gierl, Mark J. – Educational Measurement: Issues and Practice, 2010
This paper presents a framework to provide a structured approach for developing score reports for cognitive diagnostic assessments ("CDAs"). Guidelines for reporting and presenting diagnostic scores are based on a review of current educational test score reporting practices and literature from the area of information design. A sample diagnostic…
Descriptors: Diagnostic Tests, Scores, Technical Writing, Cognitive Tests
Peer reviewed
Direct link
Leighton, Jacqueline P.; Gierl, Mark J. – Educational Measurement: Issues and Practice, 2007
The purpose of this paper is to define and evaluate the categories of cognitive models underlying at least three types of educational tests. We argue that while all educational tests may be based--explicitly or implicitly--on a cognitive model, the categories of cognitive models underlying tests often range in their development and in the…
Descriptors: Identification (Psychology), Misconceptions, Measurement, Inferences
Peer reviewed
Direct link
Gierl, Mark J. – Educational Measurement: Issues and Practice, 2005
In this paper I describe and illustrate the Roussos-Stout (1996) multidimensionality-based DIF analysis paradigm, with emphasis on its implication for the selection of a matching and studied subtest for DIF analyses. Standard DIF practice encourages an exploratory search for matching subtest items based on purely statistical criteria, such as a…
Descriptors: Models, Test Items, Test Bias, Statistical Analysis
Peer reviewed
Gierl, Mark J.; Bisanz, Jeffrey; Bisanz, Gay L.; Boughton, Keith A.; Khaliq, Shameem Nyla – Educational Measurement: Issues and Practice, 2001
Describes some recent advances in the study of differential group performance and illustrates some of the ways that substantive analyses can be integrated with multidimensional, bundles-based statistical methods using test specifications to organize the analyses. Highlights some future directions for the analysis of differential item functioning.…
Descriptors: Achievement Tests, Item Bias, Statistical Analysis
Peer reviewed
Gierl, Mark J.; Leighton, Jacqueline P.; Hunka, Stephen M. – Educational Measurement: Issues and Practice, 2000
Discusses the logic of the rule-space model (K. Tatsuoka, 1983) as it applies to test development and analysis. The rule-space model is a statistical method for classifying examinees' test item responses into a set of attribute-mastery patterns associated with different cognitive skills. Directs readers to a tutorial that may be downloaded. (SLD)
Descriptors: Item Analysis, Item Response Theory, Test Construction, Test Items