ERIC Number: EJ1113650
Record Type: Journal
Publication Date: 2016-Sep
Pages: 14
Abstractor: As Provided
ISBN: N/A
ISSN: ISSN-1045-3830
EISSN: N/A
Concurrent Validity and Classification Accuracy of Curriculum-Based Measurement for Written Expression
Furey, William M.; Marcotte, Amanda M.; Hintze, John M.; Shackett, Caroline M.
School Psychology Quarterly, v31 n3 p369-382 Sep 2016
The study presents a critical analysis of written expression curriculum-based measurement (WE-CBM) metrics derived from 3- and 10-min test lengths. Criterion validity and classification accuracy were examined for Total Words Written (TWW), Correct Writing Sequences (CWS), Percent Correct Writing Sequences (%CWS), and Correct Minus Incorrect Writing Sequences (CMIWS). Fourth-grade students (n = 109) from six schools participated in the study. To assess the criterion validity of each metric, total scores from the writing tasks were correlated with the state achievement test's composition subtest. Each index investigated was moderately correlated with the subtest. Correlations increased with the longer sampling period; however, the increases were not statistically significant. The accuracy of each index in distinguishing between proficient and not-proficient writers on the state assessment was analyzed using discriminant function analysis and receiver operating characteristic (ROC) curves. CWS and CMIWS, indices encompassing both production and accuracy, were the most accurate predictors of proficiency. Classification accuracy improved with increased sampling time. When cut scores were set to hold sensitivity above 0.90, specificity for each metric increased with the longer probes. Sensitivity and specificity increased for all metrics with the longer probes when a 25th percentile cut score was used. Visual analyses of the ROC curves revealed where classification improvements were made. The 10-min sample for CWS more accurately identified at-risk students in the center of the distribution. Without measurement guiding decisions, writers in the middle of the distribution are more difficult to classify than those who clearly write well or clearly struggle. The findings have implications for screening using WE-CBM.
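For readers less familiar with ROC-based classification accuracy, the sketch below illustrates the general technique the abstract describes: choosing a cut score on a WE-CBM metric so that sensitivity stays above 0.90, then reading off the resulting specificity. It is not the study's analysis or data; the synthetic CWS scores, group sizes, and use of scikit-learn are assumptions for illustration only.

```python
# Illustrative sketch only: synthetic data, not the study's dataset.
# Shows how a cut score on a WE-CBM metric (here, hypothetical CWS scores)
# might be chosen so that sensitivity for identifying at-risk writers
# stays at or above 0.90, and what specificity that cut score yields.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical Correct Writing Sequences (CWS) scores for 109 students.
# Lower scores indicate greater risk; at_risk = 1 marks students who were
# not proficient on the criterion measure (e.g., a state composition subtest).
cws = np.concatenate([rng.normal(45, 12, 80),   # proficient writers
                      rng.normal(28, 10, 29)])  # not-proficient writers
at_risk = np.concatenate([np.zeros(80), np.ones(29)])

# Because low CWS scores signal risk, negate the scores so that a higher
# decision score corresponds to higher predicted risk.
fpr, tpr, thresholds = roc_curve(at_risk, -cws)

# Pick the first point on the ROC curve where sensitivity reaches 0.90;
# among cut scores meeting that sensitivity, this one maximizes specificity.
idx = np.argmax(tpr >= 0.90)
cut_score = -thresholds[idx]          # convert back to CWS units
sensitivity = tpr[idx]
specificity = 1 - fpr[idx]

print(f"AUC: {roc_auc_score(at_risk, -cws):.2f}")
print(f"Cut score (CWS at or below): {cut_score:.1f}")
print(f"Sensitivity: {sensitivity:.2f}, Specificity: {specificity:.2f}")
```

In a screening context, comparing these sensitivity and specificity values across shorter and longer probes (e.g., 3-min vs. 10-min samples) is one way to see the kind of classification improvement the abstract reports.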
Descriptors: Curriculum Based Assessment, Classification, Accuracy, Test Validity, Grade 4, State Standards, Achievement Tests, Correlation, Criterion Referenced Tests, Screening Tests, Writing Exercises, Writing Ability, Task Analysis, Timed Tests, Language Arts
American Psychological Association. Journals Department, 750 First Street NE, Washington, DC 20002. Tel: 800-374-2721; Tel: 202-336-5510; Fax: 202-336-5502; e-mail: order@apa.org; Web site: http://www.apa.org
Publication Type: Journal Articles; Reports - Research
Education Level: Grade 4; Intermediate Grades; Elementary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A