ERIC Number: EJ1255534
Record Type: Journal
Publication Date: 2020
Pages: 14
Abstractor: As Provided
ISBN: N/A
ISSN: ISSN-0022-0655
EISSN: N/A
Examining the Precision of Cut Scores within a Generalizability Theory Framework: A Closer Look at the Item Effect
Clauser, Brian E.; Kane, Michael; Clauser, Jerome C.
Journal of Educational Measurement, v57 n2 p216-229 Sum 2020
An Angoff standard setting study generally yields judgments on a number of items by a number of judges (who may or may not be nested in panels). Variability associated with judges (and possibly panels) contributes error to the resulting cut score. The variability associated with items plays a more complicated role. To the extent that the mean item judgments directly reflect empirical item difficulties, the variability in Angoff judgments over items would not add error to the cut score, but to the extent that the mean item judgments do not correspond to the empirical item difficulties, variability in mean judgments over items would add error to the cut score. In this article, we present two generalizability-theory-based analyses of the proportion of the item variance that contributes to error in the cut score. For one approach, variance components are estimated on the probability (or proportion-correct) scale of the Angoff judgments, and for the other, the judgments are transformed to the theta scale of an item response theory model before estimating the variance components. The two analyses yield somewhat different results, but both indicate that it is not appropriate to simply ignore the item variance component in estimating the error variance.
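The abstract's core idea can be sketched numerically. The following is an illustrative simulation only, not the article's actual analysis: it generates hypothetical Angoff probability judgments from a judges-by-items crossed design, estimates variance components with a standard two-way random-effects ANOVA, and then compares the error variance of the cut score (the grand mean) when item variance is ignored versus when a hypothetical proportion `k` of the item variance is treated as error. All effect sizes, the seed, and `k` are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_judges, n_items = 10, 40

# Hypothetical Angoff judgments on the probability scale:
# grand mean + judge effect + item effect + judge-by-item noise
judge_eff = rng.normal(0.0, 0.03, n_judges)
item_eff = rng.normal(0.0, 0.10, n_items)
ratings = (0.60
           + judge_eff[:, None]
           + item_eff[None, :]
           + rng.normal(0.0, 0.05, (n_judges, n_items)))

# Two-way crossed ANOVA (one observation per judge-item cell)
grand = ratings.mean()
ms_j = n_items * ((ratings.mean(axis=1) - grand) ** 2).sum() / (n_judges - 1)
ms_i = n_judges * ((ratings.mean(axis=0) - grand) ** 2).sum() / (n_items - 1)
resid = (ratings
         - ratings.mean(axis=1, keepdims=True)
         - ratings.mean(axis=0, keepdims=True)
         + grand)
ms_ji = (resid ** 2).sum() / ((n_judges - 1) * (n_items - 1))

# Variance component estimates
var_ji = ms_ji                        # judge-by-item interaction (confounded with residual)
var_j = (ms_j - ms_ji) / n_items      # judge component
var_i = (ms_i - ms_ji) / n_judges     # item component

# Error variance of the cut score if item variance is ignored entirely
se2_no_items = var_j / n_judges + var_ji / (n_judges * n_items)

# If a proportion k of the item variance reflects judgments that do NOT
# track empirical item difficulties, that share adds to the error
# (the article's central point); k here is purely hypothetical.
k = 0.5
se2_with_items = se2_no_items + k * var_i / n_items

print(f"item variance component: {var_i:.5f}")
print(f"error variance, items ignored:  {se2_no_items:.6f}")
print(f"error variance, items included: {se2_with_items:.6f}")
```

Because the simulated item effects are relatively large (SD 0.10 on the probability scale), including even half of the item component visibly inflates the standard error of the cut score, which is the qualitative pattern the article formalizes.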
Descriptors: Cutting Scores, Generalization, Decision Making, Standard Setting, Evaluators, Item Analysis, Error of Measurement, Difficulty Level, Probability, Item Response Theory, Guidelines
Wiley-Blackwell. 350 Main Street, Malden, MA 02148. Tel: 800-835-6770; Tel: 781-388-8598; Fax: 781-388-8232; e-mail: cs-journals@wiley.com; Web site: http://www.wiley.com/WileyCDA
Publication Type: Journal Articles; Reports - Evaluative
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A