Peer reviewed
Direct link
ERIC Number: EJ1089562
Record Type: Journal
Publication Date: 2016
Pages: 20
Abstractor: As Provided
ISSN: ISSN-1530-5058
Item Calibration Samples and the Stability of Achievement Estimates and System Rankings: Another Look at the PISA Model
Rutkowski, Leslie; Rutkowski, David; Zhou, Yan
International Journal of Testing, v16 n1 p1-20 2016
Using an empirically based simulation study, we show that typically used methods of choosing an item calibration sample have significant impacts on achievement bias and system rankings. We examine whether recent PISA accommodations, especially for lower-performing participants, can mitigate some of this bias. Our findings indicate that standard operational methods, while not ideal, recover underlying proficiency reasonably well and generally outperform methods that more completely include all participants. Translating results onto the PISA scale, the calibration sample can induce bias of up to 12.49 points, which is important given that standard errors are around three points. Although ranking correlations are at least 0.95, we note the policy implications of slight ranking changes. Our findings indicate that limited accommodations targeted at low-achieving educational systems do not outperform either of the other methods considered. Research that further explores accommodations for heterogeneous populations is recommended.
Routledge. Available from: Taylor & Francis, Ltd., 325 Chestnut Street, Suite 800, Philadelphia, PA 19106. Tel: 800-354-1420; Fax: 215-625-2940; Web site:
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Assessments and Surveys: Program for International Student Assessment
Grant or Contract Numbers: N/A