Peer reviewed
ERIC Number: EJ1153006
Record Type: Journal
Publication Date: 2017-Sep
Pages: 13
Abstractor: As Provided
ISBN: N/A
ISSN: 1045-3830
EISSN: N/A
Technical Adequacy of Growth Estimates from a Computer Adaptive Test: Implications for Progress Monitoring
Van Norman, Ethan R.; Nelson, Peter M.; Parker, David C.
School Psychology Quarterly, v32 n3 p379-391 Sep 2017
Computer adaptive tests (CATs) hold promise to monitor student progress within multitiered systems of support. However, the relationship between how long and how often data are collected and the technical adequacy of growth estimates from CATs has not been explored. Given CAT administration times, it is important to identify optimal data collection schedules to minimize missed instructional time. We used simulation methodology to investigate how the duration and frequency of data collection influenced the reliability, validity, and precision of growth estimates from a math CAT. A progress monitoring dataset of 746 Grade 4, 664 Grade 5, and 400 Grade 6 students from 40 schools in the upper Midwest was used to generate model parameters. Across grades, 53% of students were female and 53% were White. Grade level was not as influential on the technical adequacy of growth estimates as the duration and frequency of data collection. Low-stakes decisions were possible after 14-18 weeks when data were collected weekly (420-540 min of assessment), 20-24 weeks when collected every other week (300-360 min of assessment), and 20-28 weeks (150-210 min of assessment) when data were collected once a month, depending on student grade level. The validity and precision of growth estimates improved when the duration and frequency of progress monitoring increased. Given the amount of time required to obtain technically adequate growth estimates in the present study, results highlight the importance of weighing the potential costs of missed instructional time relative to other types of assessments, such as curriculum-based measures. Implications for practice and research, as well as future directions, are also discussed.
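The abstract's central statistical point, that growth (slope) estimates become more precise as monitoring lasts longer and occurs more often, can be illustrated with a small simulation. This is only a sketch of the general principle: the growth model, slope, noise level, and schedule parameters below are illustrative assumptions, not the study's actual model or values.

```python
import random
import statistics

def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs (weeks)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx

def simulate_slope_sd(weeks, interval, true_slope=1.5, noise_sd=8.0,
                      n_students=2000, seed=0):
    """Empirical SD of estimated growth slopes for one schedule.

    weeks    -- duration of progress monitoring
    interval -- weeks between assessments (1 = weekly, 4 = monthly)
    All parameter values are hypothetical, chosen for illustration.
    """
    rng = random.Random(seed)
    xs = list(range(0, weeks + 1, interval))
    slopes = []
    for _ in range(n_students):
        # Linear true growth plus independent measurement noise.
        ys = [50 + true_slope * x + rng.gauss(0, noise_sd) for x in xs]
        slopes.append(ols_slope(xs, ys))
    return statistics.stdev(slopes)

# Denser and longer schedules yield more precise slope estimates.
weekly_14 = simulate_slope_sd(weeks=14, interval=1)
monthly_14 = simulate_slope_sd(weeks=14, interval=4)
weekly_28 = simulate_slope_sd(weeks=28, interval=1)
```

Under this toy model the spread of estimated slopes shrinks both when assessments are more frequent (weekly vs. monthly over the same 14 weeks) and when monitoring runs longer (28 vs. 14 weeks of weekly data), mirroring the trade-off the study quantifies between assessment time and decision quality.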
American Psychological Association. Journals Department, 750 First Street NE, Washington, DC 20002. Tel: 800-374-2721; Tel: 202-336-5510; Fax: 202-336-5502; e-mail: order@apa.org; Web site: http://www.apa.org
Publication Type: Journal Articles; Reports - Research
Education Level: Grade 4; Grade 5; Grade 6
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A