Peer reviewed
ERIC Number: EJ1073524
Record Type: Journal
Publication Date: 2015-Oct
Pages: 24
Abstractor: As Provided
ISSN: ISSN-0013-1644
Using a Model of Analysts' Judgments to Augment an Item Calibration Process
Hauser, Carl; Thum, Yeow Meng; He, Wei; Ma, Lingling
Educational and Psychological Measurement, v75 n5 p826-849 Oct 2015
When conducting item reviews, analysts evaluate an array of statistical and graphical information to assess the fit of a field test (FT) item to an item response theory model. The process can be tedious, particularly when the number of human reviews (HR) to be completed is large. Furthermore, such a process leads to decisions that are susceptible to human errors. A key finding from behavioral decision-making research has shown that a parametric model of human decision making often outperforms the decision maker himself. We exploit this finding by seeking a model to mimic how analysts integrate FT item level statistics and graphical performance plots to predict the analyst's assignment of the item's status. The procedure suggests a set of rules that achieves a desired level of classification accuracy, separating situations in which the evidence supports firm decisions from those situations that would likely benefit from HRs. Implementation of the decision rules accounts for an estimated 65% reduction in calibrations requiring HRs.
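The triage logic the abstract describes can be sketched as a simple probabilistic classifier: a parametric model predicts the analyst's accept/reject judgment from item statistics, and only items whose predicted probability falls in an uncertain middle band are routed to human review. The feature names (infit, point-biserial), weights, and thresholds below are illustrative assumptions for the sketch, not the authors' fitted model or decision rules.

```python
import math

# Hypothetical logistic-model weights predicting P(analyst accepts item).
# These values are invented for illustration only.
WEIGHTS = {"intercept": 2.0, "infit": -3.0, "point_biserial": 4.0}

def p_accept(item):
    """Predicted probability that an analyst would accept the item."""
    z = (WEIGHTS["intercept"]
         + WEIGHTS["infit"] * (item["infit"] - 1.0)  # deviation from ideal fit of 1.0
         + WEIGHTS["point_biserial"] * item["point_biserial"])
    return 1.0 / (1.0 + math.exp(-z))

def triage(item, lo=0.2, hi=0.9):
    """Make a firm call when the evidence is strong; otherwise
    flag the item calibration for human review (HR)."""
    p = p_accept(item)
    if p >= hi:
        return "accept"
    if p <= lo:
        return "reject"
    return "human_review"

# A well-fitting item gets a firm decision; a borderline one goes to HR.
good = {"infit": 1.02, "point_biserial": 0.45}
odd = {"infit": 1.40, "point_biserial": 0.15}
print(triage(good), triage(odd))  # → accept human_review
```

Under this kind of rule, the reduction in HRs reported in the abstract corresponds to the share of items falling outside the uncertain band, at the cost of accepting the model's classification error on those firm decisions.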
SAGE Publications. 2455 Teller Road, Thousand Oaks, CA 91320. Tel: 800-818-7243; Tel: 805-499-9774; Fax: 800-583-2665.
Publication Type: Journal Articles; Reports - Research
Education Level: Elementary Secondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A