Peer reviewed
ERIC Number: EJ1273553
Record Type: Journal
Publication Date: 2020
Pages: 16
Abstractor: As Provided
ISBN: N/A
ISSN: ISSN-0895-7347
EISSN: N/A
Validating Rubric Scoring Processes: An Application of an Item Response Tree Model
Myers, Aaron J.; Ames, Allison J.; Leventhal, Brian C.; Holzman, Madison A.
Applied Measurement in Education, v33 n4 p293-308 2020
When rating performance assessments, raters may assign different scores to the same performance when their application of the rubric does not align with the intended application of the scoring criteria. Because performance assessment score interpretation assumes raters apply rubrics as the rubric developers intended, misalignment between raters' scoring processes and the intended scoring processes may lead to invalid inferences from these scores. In an effort to standardize raters' scoring processes, an alternative scoring method was used in which the rubric developers' intended scoring processes are made explicit by requiring raters to respond to a series of selected-response statements resembling a decision tree. To determine whether raters scored essays as intended under a traditional rubric and under the alternative scoring method, an IRT model with a tree-like structure (IRTree) was specified to depict the intended scoring processes and fit to data from each scoring method. Results suggest raters using the alternative method may be better able to rate as intended, making the method a viable alternative to traditional rubric scoring. Implications of the IRTree model are discussed.
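For readers unfamiliar with IRTree analyses, the usual data-preparation step is to recode each polytomous rubric score into a set of binary "pseudo-items," one per internal node of the decision tree, before fitting node-level IRT models. The sketch below is a minimal, hypothetical illustration of that recoding only (it is not taken from the article, and the three-category linear tree and variable names are assumptions chosen for brevity).

```python
# Minimal sketch (assumed example, not the article's code): recode polytomous
# rubric scores into binary pseudo-items for an IRTree analysis.
import numpy as np

# mapping[score][node] gives the node outcome; np.nan means the node is not reached.
# Assumed linear tree over scores 0-2: node 1 separates 0 vs. {1, 2}; node 2 separates 1 vs. 2.
MAPPING = np.array([
    [0.0, np.nan],   # score 0: "fail" node 1, node 2 not reached
    [1.0, 0.0],      # score 1: "pass" node 1, "fail" node 2
    [1.0, 1.0],      # score 2: "pass" node 1, "pass" node 2
])

def recode_to_pseudo_items(scores: np.ndarray, mapping: np.ndarray = MAPPING) -> np.ndarray:
    """Expand an (n_ratings,) vector of rubric scores into an
    (n_ratings, n_nodes) matrix of binary pseudo-item responses."""
    return mapping[scores.astype(int)]

if __name__ == "__main__":
    scores = np.array([0, 2, 1, 2])
    print(recode_to_pseudo_items(scores))
    # [[ 0. nan]
    #  [ 1.  1.]
    #  [ 1.  0.]
    #  [ 1.  1.]]
```

The resulting pseudo-item matrix is what a node-level IRT model would then be fit to; the specific tree structure and estimation approach used in the study are described in the article itself.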
Routledge. Available from: Taylor & Francis, Ltd. 530 Walnut Street Suite 850, Philadelphia, PA 19106. Tel: 800-354-1420; Tel: 215-625-8900; Fax: 215-207-0050; Web site: http://www.tandf.co.uk/journals
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A