Peer reviewed
ERIC Number: EJ1062689
Record Type: Journal
Publication Date: 2012
Pages: 13
Abstractor: As Provided
Reference Count: 19
ISSN: EISSN-2161-4210
Generalizability of Student Writing across Multiple Tasks: A Challenge for Authentic Assessment
Hathcoat, John D.; Penn, Jeremy D.
Research & Practice in Assessment, v7 p16-28 Win 2012
Critics of standardized testing have recommended replacing standardized tests with more authentic assessment measures, such as classroom assignments, projects, or portfolios rated by a panel of raters using common rubrics. Little research has examined the consistency of scores across multiple authentic assignments or the implications of this source of error for the generalizability of assessment results. This study provides a framework for conceptualizing measurement error when using authentic assessments and investigates the extent to which student writing performance may generalize across multiple tasks. Results from a generalizability study found that 77% of error variance may be attributable to differences within people across multiple writing assignments. Decision studies indicated that substantive improvements in reliability may be gained by increasing the number of assignments rather than the number of raters. Judgments about relative student performance may require closer scrutiny of task characteristics as a source of measurement error.
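The abstract's decision-study (D-study) claim, that adding assignments improves reliability more than adding raters, follows from how error variance is divided by facet sample sizes in a generalizability coefficient. The sketch below assumes a standard fully crossed person x task x rater design; the variance components are illustrative assumptions chosen so that the person-by-task interaction dominates (as in the study's 77% finding), and are not values reported by Hathcoat and Penn:

```python
# Hedged D-study sketch for a fully crossed person x task x rater design.
# Variance components are ILLUSTRATIVE assumptions, not the study's estimates;
# they are picked so task-related error dominates, echoing the 77% finding.

def g_coefficient(var_p, var_pt, var_pr, var_ptr_e, n_tasks, n_raters):
    """Relative generalizability coefficient: true-score variance over
    true-score variance plus relative error variance, where each error
    component is divided by the number of conditions sampled."""
    relative_error = (var_pt / n_tasks
                      + var_pr / n_raters
                      + var_ptr_e / (n_tasks * n_raters))
    return var_p / (var_p + relative_error)

# Assumed components: person (true score), person x task, person x rater,
# and residual (person x task x rater plus unexplained error).
components = dict(var_p=0.20, var_pt=0.60, var_pr=0.03, var_ptr_e=0.17)

for n_tasks, n_raters in [(1, 1), (1, 3), (4, 1), (4, 3)]:
    g = g_coefficient(**components, n_tasks=n_tasks, n_raters=n_raters)
    print(f"tasks={n_tasks}, raters={n_raters}: G = {g:.2f}")
```

Because the dominant person-by-task component is divided only by the number of tasks, quadrupling tasks raises the coefficient far more than tripling raters under these assumed components, which is the pattern the decision studies describe.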
Virginia Assessment Group. Tel: 504-314-2898; Fax: 504-247-1232
Publication Type: Journal Articles; Reports - Research
Education Level: Higher Education; Postsecondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Location: Oklahoma