Peer reviewed
ERIC Number: EJ1124777
Record Type: Journal
Publication Date: 2016-Jun
Pages: 17
Abstractor: As Provided
ISSN: EISSN-2330-8516
A Review of Evidence Presented in Support of Three Key Claims in the Validity Argument for the "TextEvaluator"® Text Analysis Tool. Research Report. ETS RR-16-12
Sheehan, Kathleen M.
ETS Research Report Series, Jun 2016
The "TextEvaluator"® text analysis tool is a fully automated text complexity evaluation tool designed to help teachers and other educators select texts that are consistent with the text complexity guidelines specified in the Common Core State Standards (CCSS). This paper provides an overview of the TextEvaluator measurement approach and summarizes evidence related to three key claims in the TextEvaluator validity argument: (a) TextEvaluator has succeeded in expanding construct coverage beyond the two dimensions of text variation that are traditionally assessed by readability metrics; (b) the TextEvaluator strategy of estimating distinct prediction models for informational, literary, and mixed texts has succeeded in generating text complexity predictions that exhibit little, if any, genre bias; and (c) TextEvaluator scores are highly correlated with text complexity judgments provided by human experts, including judgments generated via the inheritance method and judgments generated via the exemplar method. Implications with respect to the goal of helping teachers and other educators select texts that are closely aligned with the accelerated text complexity exposure trajectory outlined in the CCSS are discussed.
Educational Testing Service, Rosedale Road, MS19-R, Princeton, NJ 08541. Tel: 609-921-9000; Fax: 609-734-5410.
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A