ERIC Number: EJ1025252
Record Type: Journal
Publication Date: 2014
Pages: 19
Abstractor: As Provided
Reference Count: 21
ISBN: N/A
ISSN: 1530-5058
Using Automated Essay Scores as an Anchor When Equating Constructed Response Writing Tests
Almond, Russell G.
International Journal of Testing, v14 n1 p73-91 2014
Assessments consisting of only a few extended constructed response items (essays) are not typically equated using anchor test designs, as each form usually contains too few essay prompts to allow meaningful equating. This article explores the idea that output from an automated scoring program designed to measure writing fluency (a common objective of many writing prompts) can be used in place of a more traditional anchor. The linear-logistic equating method used in this article is a variant of the Tucker linear equating method appropriate for the limited score range typical of essays. The procedure is applied to historical data. Although the procedure yields only small improvements over identity equating (not equating prompts), it provides a viable alternative and a mechanism for checking that identity equating is appropriate. This may be particularly useful for measuring rater drift or for equating mixed-format tests.
Descriptors: Automation, Equated Scores, Writing Tests, Essay Tests, Scoring, College Entrance Examinations
Routledge. Available from: Taylor & Francis, Ltd., 325 Chestnut Street, Suite 800, Philadelphia, PA 19106. Tel: 800-354-1420; Fax: 215-625-2940; Web site: http://www.tandf.co.uk/journals
Publication Type: Journal Articles; Reports - Research
Education Level: Higher Education; Postsecondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers: Graduate Record Examinations

Peer reviewed