ERIC Number: EJ812993
Record Type: Journal
Publication Date: 2004-Mar
Abstractor: As Provided
Reference Count: 0
Detecting Score Drift in a High-Stakes Performance-Based Assessment
McKinley, Danette W.; Boulet, John R.
Advances in Health Sciences Education, v9 n1 p29-38 Mar 2004
Although studies have examined the effects of a variety of factors on the comparability of scores obtained from standardized patient examinations (SPEs), little research has specifically investigated the challenge of detecting drift in case difficulty estimates over time, particularly for large-scale, performance-based assessments. The purpose of the current study was to investigate the use of a procedure to detect drift in the difficulty estimates for a large-scale, high-stakes SPE. The results of this investigation suggest that, for particular performance tasks, there was some variation in mean scores over time. These findings indicate that, although it is feasible to create a bank of case-SP means and link scores back to these fixed estimates, special attention must be paid to the standardization of exam materials over time. This is essential to ensure the comparability of scores and pass-fail decisions for candidates who are assessed on multiple test forms throughout the year.
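The drift-detection idea described in the abstract — banking a fixed mean for each case-SP pairing and checking later administrations against it — can be illustrated with a minimal sketch. This is not the authors' actual procedure; the banked values, sample scores, and the two-standard-error flagging rule are all assumptions for illustration only.

```python
# Hypothetical sketch of flagging drift in case means against banked
# estimates. The banked mean/SD, the candidate scores, and the z_crit
# threshold are illustrative assumptions, not values from the study.
import math

def flag_drift(banked_mean, banked_sd, current_scores, z_crit=2.0):
    """Flag a case if its current mean score deviates from the banked
    mean by more than z_crit standard errors of the mean."""
    n = len(current_scores)
    current_mean = sum(current_scores) / n
    se = banked_sd / math.sqrt(n)        # standard error of the mean
    z = (current_mean - banked_mean) / se
    return abs(z) > z_crit, z

# A case whose current mean sits close to its banked value is not flagged;
# one that has drifted well below the banked mean is.
stable, _ = flag_drift(banked_mean=75.0, banked_sd=8.0,
                       current_scores=[70, 74, 76, 73, 77, 75, 74, 76])
drifted, _ = flag_drift(banked_mean=75.0, banked_sd=8.0,
                        current_scores=[62, 60, 58, 61, 63, 59, 60, 62])
```

In practice an operational program would apply such a check per case-SP combination across test forms, which is what makes the standardization of exam materials over time so critical.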
Descriptors: Item Response Theory, Scores, High Stakes Tests, Performance Based Assessment, Standardized Tests, Comparative Analysis
Springer. 233 Spring Street, New York, NY 10013. Tel: 800-777-4643; Tel: 212-460-1500; Fax: 212-348-4505; e-mail: email@example.com; Web site: http://www.springerlink.com
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Authoring Institution: N/A