Peer reviewed
ERIC Number: EJ1045988
Record Type: Journal
Publication Date: 2014-Nov
Pages: 16
Abstractor: As Provided
Reference Count: 38
ISSN: EISSN-1492-3831
Evaluating the Validity and Applicability of Automated Essay Scoring in Two Massive Open Online Courses
Reilly, Erin Dawna; Stafford, Rose Eleanore; Williams, Kyle Marie; Corliss, Stephanie Brooks
International Review of Research in Open and Distance Learning, v15 n5 p83-98 Nov 2014
The use of massive open online courses (MOOCs) to expand students' access to higher education has raised questions about the extent to which this course model can provide and assess authentic, higher-level student learning. In response, MOOC platforms have begun using automated essay scoring (AES) systems that allow students to engage in critical writing and free-response activities. However, little research has investigated the validity of such systems in MOOCs. This research examined the effectiveness of an AES tool for scoring writing assignments in two MOOCs. Results indicated some significant differences between instructor grading, AES-Holistic scores, and AES-Rubric Total scores within the two courses. However, the AES system may still be useful depending on instructors' assessment needs and intent. Findings from this research have implications for instructional technology administrators, educational designers, and instructors implementing AES learning activities in MOOCs.
Athabasca University. 1200, 10011 - 109 Street, Edmonton, AB T5J 3S8, Canada. Tel: 780-421-2536; Fax: 780-497-3416
Publication Type: Journal Articles; Reports - Evaluative
Education Level: Higher Education; Postsecondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A