Publication Date
| Date Range | Count |
| --- | --- |
| In 2015 | 0 |
| Since 2014 | 0 |
| Since 2011 (last 5 years) | 1 |
| Since 2006 (last 10 years) | 3 |
| Since 1996 (last 20 years) | 6 |
Descriptor
| Descriptor | Count |
| --- | --- |
| Test Items | 3 |
| Educational Assessment | 2 |
| Educational Testing | 2 |
| Models | 2 |
| Statistical Analysis | 2 |
| Achievement Tests | 1 |
| Automation | 1 |
| Cognitive Measurement | 1 |
| Cognitive Processes | 1 |
| Cognitive Tests | 1 |
Source
| Source | Count |
| --- | --- |
| Educational Measurement:… | 6 |
Author
| Author | Count |
| --- | --- |
| Gierl, Mark J. | 6 |
| Leighton, Jacqueline P. | 2 |
| Bisanz, Gay L. | 1 |
| Bisanz, Jeffrey | 1 |
| Boughton, Keith A. | 1 |
| Hunka, Stephen M. | 1 |
| Khaliq, Shameem Nyla | 1 |
| Lai, Hollis | 1 |
| Roberts, Mary Roduta | 1 |
Publication Type
| Type | Count |
| --- | --- |
| Journal Articles | 6 |
| Reports - Descriptive | 2 |
| Reports - Evaluative | 2 |
| Guides - Classroom - Learner | 1 |
| Reports - Research | 1 |
Showing all 6 results
Gierl, Mark J.; Lai, Hollis – Educational Measurement: Issues and Practice, 2013
Changes to the design and development of our educational assessments are resulting in an unprecedented demand for a large and continuous supply of content-specific test items. One way to address this growing demand is with automatic item generation (AIG). AIG is the process of using item models to generate test items with the aid of computer…
Descriptors: Educational Assessment, Test Items, Automation, Computer Assisted Testing
Roberts, Mary Roduta; Gierl, Mark J. – Educational Measurement: Issues and Practice, 2010
This paper presents a framework to provide a structured approach for developing score reports for cognitive diagnostic assessments ("CDAs"). Guidelines for reporting and presenting diagnostic scores are based on a review of current educational test score reporting practices and literature from the area of information design. A sample diagnostic…
Descriptors: Diagnostic Tests, Scores, Technical Writing, Cognitive Tests
Leighton, Jacqueline P.; Gierl, Mark J. – Educational Measurement: Issues and Practice, 2007
The purpose of this paper is to define and evaluate the categories of cognitive models underlying at least three types of educational tests. We argue that while all educational tests may be based--explicitly or implicitly--on a cognitive model, the categories of cognitive models underlying tests often range in their development and in the…
Descriptors: Identification (Psychology), Misconceptions, Measurement, Inferences
Gierl, Mark J. – Educational Measurement: Issues and Practice, 2005
In this paper I describe and illustrate the Roussos-Stout (1996) multidimensionality-based DIF analysis paradigm, with emphasis on its implication for the selection of a matching and studied subtest for DIF analyses. Standard DIF practice encourages an exploratory search for matching subtest items based on purely statistical criteria, such as a…
Descriptors: Models, Test Items, Test Bias, Statistical Analysis
Peer reviewed
Gierl, Mark J.; Bisanz, Jeffrey; Bisanz, Gay L.; Boughton, Keith A.; Khaliq, Shameem Nyla – Educational Measurement: Issues and Practice, 2001
Describes some recent advances in the study of differential group performance and illustrates some of the ways that substantive analyses can be integrated with multidimensional, bundles-based statistical methods using test specifications to organize the analyses. Highlights some future directions for the analysis of differential item functioning.…
Descriptors: Achievement Tests, Item Bias, Statistical Analysis
Peer reviewed
Gierl, Mark J.; Leighton, Jacqueline P.; Hunka, Stephen M. – Educational Measurement: Issues and Practice, 2000
Discusses the logic of the rule-space model (K. Tatsuoka, 1983) as it applies to test development and analysis. The rule-space model is a statistical method for classifying examinees' test item responses into a set of attribute-mastery patterns associated with different cognitive skills. Directs readers to a tutorial that may be downloaded. (SLD)
Descriptors: Item Analysis, Item Response Theory, Test Construction, Test Items
