ERIC Number: ED401323
Record Type: Non-Journal
Publication Date: 1996-Apr
Pages: 22
Abstractor: N/A
Reference Count: N/A
ISBN: N/A
ISSN: N/A
Conditional Standard Errors, Reliability and Decision Consistency of Performance Levels Using Polytomous IRT.
Wang, Tianyou; And Others
M. J. Kolen, B. A. Hanson, and R. L. Brennan (1992) presented a procedure for assessing the conditional standard error of measurement (CSEM) of scale scores using a strong true-score model. They also investigated ways of using a nonlinear transformation from the number-correct raw score to the scale score to equalize the conditional standard error along the reported score scale. Kolen, L. Zeng, and Hanson (in press) presented a similar procedure for assessing CSEM using item response theory (IRT) techniques. This paper extends that procedure to tests with polytomous items using a polytomous IRT model approach. A polytomous IRT-based procedure for assessing the decision consistency of performance-level classifications based on alternate test forms is also described. The general approach for assessing CSEM and reliability is to obtain the probability distribution of the level score conditioned on a given theta and then compute the conditional mean and conditional standard deviation (or variance) of the scale scores or the level scores. The CSEM of the level score is the conditional standard deviation. Data from the American College Testing Program's Work Keys assessment, with samples of 7,097; 2,035; and 1,793 examinees, illustrate the use of the procedures. Model fit, classification consistency, and reliability were evaluated and found acceptable. Results suggest that the new level scores have higher reliability and classification consistency than the old level scores, indicating the usefulness of these polytomous IRT-based procedures. (Contains 4 tables, 5 figures, and 16 references.) (SLD)
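The general approach described in the abstract can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes a generalized partial credit model (one common polytomous IRT model) with hypothetical item parameters, builds the conditional distribution of the summed score at a given theta by convolving the per-item category-score distributions, and takes the conditional standard deviation as the CSEM.

```python
import numpy as np

def gpcm_probs(theta, a, b):
    """Category probabilities for one item under the generalized
    partial credit model; a is discrimination, b the step parameters."""
    steps = np.concatenate(([0.0], a * (theta - np.asarray(b))))
    num = np.exp(np.cumsum(steps))
    return num / num.sum()

def conditional_score_dist(theta, items):
    """Distribution of the summed category score conditioned on theta,
    built by convolving the item-level category distributions."""
    dist = np.array([1.0])
    for a, b in items:
        dist = np.convolve(dist, gpcm_probs(theta, a, b))
    return dist

def csem(theta, items):
    """Conditional mean and conditional standard error (CSEM)
    of the summed score at a given theta."""
    dist = conditional_score_dist(theta, items)
    scores = np.arange(len(dist))
    mean = float(np.dot(scores, dist))
    var = float(np.dot((scores - mean) ** 2, dist))
    return mean, var ** 0.5

# Hypothetical 3-item test, each item scored 0..3.
items = [(1.0, [-1.0, 0.0, 1.0]),
         (0.8, [-0.5, 0.5, 1.5]),
         (1.2, [-1.5, -0.5, 0.5])]
mean, se = csem(0.0, items)
print(f"conditional mean = {mean:.3f}, CSEM = {se:.3f}")
```

Applying a nonlinear raw-to-scale transformation before taking the conditional moments, or mapping the summed score to performance levels before computing the distribution, extends this same computation to scale scores and level scores as the abstract describes.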
Publication Type: Reports - Evaluative; Speeches/Meeting Papers
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Assessments and Surveys: Work Keys (ACT)