ERIC Number: ED397076
Record Type: RIE
Publication Date: 1995-Jul
Reference Count: N/A
Evaluating a Prototype Essay Scoring Procedure Using Off-the-Shelf Software.
Kaplan, Randy M.; And Others
The increased use of constructed-response items, such as essays, creates a need for tools that can score these responses automatically, in part or in whole. This study explores one approach to analyzing essay-length natural-language constructed responses. A decision model for scoring essays was developed and evaluated. The model uses off-the-shelf software for checking English grammar and style. The best-performing grammar-checking programs among several commercial candidates were selected to construct a decision model for scoring the essays. Data produced by the selected programs were used to decide on a score for each essay. The performance of the decision model was analyzed with statistical and linguistic methods to gauge its usefulness and practicality in a production scoring setting. A sample of 80 essays was selected from Test of Written English essays prepared for the Test of English as a Foreign Language. Using four grammar-checking programs, 320 analyses were produced. Results indicated that a model could be constructed from the commercial programs and that about 30% of the essays could be scored correctly. Scores derived from the scoring model could be accepted as accurate, but the number of essays scored does not yet warrant its application in a practical setting. Three appendixes contain sample grammar-checker outputs, a categorization of errors from the grammar checkers, and essay analysis data. (Contains 16 tables, 5 figures, and 6 references.) (Author/SLD)
Publication Type: Reports - Evaluative
Education Level: N/A
Authoring Institution: Educational Testing Service, Princeton, NJ.
Identifiers: Commercially Prepared Materials; Decision Models; Grammar Checkers; Test of English as a Foreign Language