ERIC Number: EJ1062843
Record Type: Journal
Publication Date: 2013
Pages: 9
Abstractor: ERIC
ISBN: N/A
ISSN: N/A
EISSN: 2161-4210
Available Date: N/A
Assessing Writing in MOOCs: Automated Essay Scoring and Calibrated Peer Review™
Balfour, Stephen P.
Research & Practice in Assessment, v8 p40-48 Sum 2013
Two of the largest Massive Open Online Course (MOOC) organizations have chosen different methods for scoring and providing feedback on the essays students submit. EdX, the non-profit MOOC federation founded by MIT and Harvard, recently announced that it will use a machine-based Automated Essay Scoring (AES) application to assess written work in its MOOCs. Coursera, a Stanford MOOC startup, has been skeptical of AES applications and has instead committed to a form of human-based "calibrated peer review" to score and provide feedback on student writing. This essay reviews the relevant literature on AES and UCLA's Calibrated Peer Review™ (CPR) product at a high level, outlines the capabilities and limitations of both AES and CPR, and provides a table and framework for comparing these forms of assessment of student writing in MOOCs.
Descriptors: Online Courses, Writing Evaluation, Automation, Scoring, Peer Evaluation, Essays, Higher Education, Literature Reviews
Virginia Assessment Group. Tel: 504-314-2898; Fax: 504-247-1232; e-mail: editor@rpajournal.com; Web site: http://www.rpajournal.com/
Publication Type: Journal Articles; Reports - Evaluative; Information Analyses
Education Level: Higher Education; Postsecondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A
Author Affiliations: N/A