ERIC Number: ED558436
Record Type: Non-Journal
Publication Date: 2013-Aug
Pages: 21
Abstractor: As Provided
Reference Count: 39
ISBN: N/A
ISSN: N/A
Automatic Short Essay Scoring Using Natural Language Processing to Extract Semantic Information in the Form of Propositions. CRESST Report 831
Kerr, Deirdre; Mousavi, Hamid; Iseli, Markus R.
National Center for Research on Evaluation, Standards, and Student Testing (CRESST)
The Common Core assessments emphasize short essay constructed-response items over multiple-choice items because they are more precise measures of understanding. However, such items are too costly and time-consuming to be used in national assessments unless a way to score them automatically can be found. Current automatic essay-scoring techniques are inappropriate for scoring the content of an essay because they rely either on grammatical measures of quality or on machine learning techniques, neither of which identifies statements of meaning (propositions) in the text. In this report, we introduce a novel technique that uses domain-independent, deep natural language processing to automatically extract meaning from student essays in the form of propositions and to match the extracted propositions to the expected response. The empirical results indicate that our technique accurately extracts propositions from student short essays, reaching moderate agreement with human rater scores.
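For illustration only: the record does not include the report's pipeline, but the proposition-matching idea the abstract describes could be sketched as below, assuming spaCy dependency parses stand in for the report's deep NLP and treating a proposition as a rough (subject, verb, object) lemma triple. All function names and the example item are hypothetical, not taken from the report.

import spacy

nlp = spacy.load("en_core_web_sm")

def extract_propositions(text):
    # Collect rough (subject, verb, object) lemma triples from each sentence.
    triples = set()
    for sent in nlp(text).sents:
        for token in sent:
            if token.pos_ != "VERB":
                continue
            subjects = [c for c in token.children if c.dep_ in ("nsubj", "nsubjpass")]
            objects = [c for c in token.children if c.dep_ in ("dobj", "attr")]
            for subj in subjects:
                for obj in objects:
                    triples.add((subj.lemma_.lower(), token.lemma_.lower(), obj.lemma_.lower()))
    return triples

def score_essay(essay, expected):
    # Score = fraction of expected propositions found in the essay.
    found = extract_propositions(essay)
    return sum(1 for p in expected if p in found) / len(expected) if expected else 0.0

# Hypothetical expected response for a physics item.
expected = {("force", "cause", "acceleration")}
print(score_essay("A net force causes acceleration of the object.", expected))  # 1.0 if the parser recovers the triple

This toy matcher requires exact lemma matches; the report's technique is far deeper, handling paraphrase and domain-independent semantic extraction that a triple lookup cannot.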
National Center for Research on Evaluation, Standards, and Student Testing (CRESST). 300 Charles E Young Drive N, GSE&IS Building 3rd Floor, Mailbox 951522, Los Angeles, CA 90095-1522. Tel: 310-206-1532; Fax: 310-825-3883; Web site: http://www.cresst.org
Publication Type: Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: Bill and Melinda Gates Foundation
Authoring Institution: National Center for Research on Evaluation, Standards, and Student Testing
Grant or Contract Numbers: OPP1003019