ERIC Number: EJ1092442
Record Type: Journal
Publication Date: 2016
Abstractor: As Provided
Maintaining Equivalent Cut Scores for Small Sample Test Forms
Dwyer, Andrew C.
Journal of Educational Measurement, v53 n1 p3-22 Spr 2016
This study examines the effectiveness of three approaches for maintaining equivalent performance standards across test forms with small samples: (1) common-item equating, (2) resetting the standard, and (3) rescaling the standard. Rescaling the standard (i.e., applying common-item equating methodology to standard setting ratings to account for systematic differences between standard setting panels) has received almost no attention in the literature. Identity equating was also examined to provide context. Data from a standard setting form of a large national certification test (N examinees = 4,397; N panelists = 13) were split into content-equivalent subforms with common items, and resampling methodology was used to investigate the error introduced by each approach. Common-item equating (circle-arc and nominal weights mean) was evaluated at sample sizes of 10, 25, 50, and 100 examinees. The standard setting approaches (resetting and rescaling the standard) were evaluated by resampling panelists (N = 8) and by simulating panelists (N = 8, 13, and 20). Results were inconclusive regarding the relative effectiveness of resetting and rescaling the standard. Small-sample equating, however, consistently produced new form cut scores that were less biased and less prone to random error than new form cut scores based on resetting or rescaling the standard.
Descriptors: Cutting Scores, Equivalency Tests, Test Format, Academic Standards, National Competency Tests, Error of Measurement, Test Bias, Evaluation Research, Evaluation Methods, Test Reliability, Test Validity
Wiley-Blackwell. 350 Main Street, Malden, MA 02148. Tel: 800-835-6770; Tel: 781-388-8598; Fax: 781-388-8232; e-mail: firstname.lastname@example.org; Web site: http://www.wiley.com/WileyCDA
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A