ERIC Number: ED400291
Record Type: Non-Journal
Publication Date: 1996-Apr
Pages: 17
Abstractor: N/A
Inter-rater Reliability on Various Types of Assessments Scored by School District Staff.
Myerberg, N. James
The Montgomery County (Maryland) public school system has begun using assessments other than multiple-choice tests, in the belief that these provide school staff with better information about the success of the instructional program. One way such assessments can provide better information is by having teachers score student papers. This, however, can conflict with another goal of the assessment program: high-stakes accountability for schools. An immediate solution to this potential conflict has been to have teachers score the papers in a centralized setting with extensive training and control, including the random assignment of papers. The three tests the system has scored in this way are mathematics short-answer, mathematics extended-answer, and language arts extended-answer tests. Scorers received intensive training and close monitoring. Scoring consistency was evaluated through correlations between scorers and the percentage of large differences between scorers. Reliability results indicate that constant, active monitoring is required to achieve consistent scoring, and that language arts assessments are more difficult to score consistently than mathematics assessments. Attachments present the mathematics scoring rubric and a sample scoring report a rater would receive. (Contains three tables.) (SLD)
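The two consistency measures named in the abstract, correlations between scorers and the percent of large differences between scorers, can be sketched as follows. This is a hypothetical illustration, not the report's actual data or code: the sample ratings, the rubric scale, and the assumption that a "large" difference means two or more rubric points are all invented for the example.

```python
# Hypothetical ratings given by two scorers to the same ten papers
# on an assumed 1-4 rubric scale (not data from the report).
scores_a = [3, 4, 2, 4, 1, 3, 4, 2, 3, 4]
scores_b = [3, 3, 2, 4, 3, 3, 4, 1, 3, 4]

def pearson_r(x, y):
    """Pearson correlation between two scorers' ratings."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

def pct_large_diffs(x, y, threshold=2):
    """Percent of papers where the scorers differ by at least
    `threshold` rubric points ("large" is an assumed cutoff)."""
    n = len(x)
    return 100.0 * sum(1 for a, b in zip(x, y) if abs(a - b) >= threshold) / n

print(round(pearson_r(scores_a, scores_b), 3))   # 0.671
print(pct_large_diffs(scores_a, scores_b))       # 10.0
```

A high correlation alone can mask occasional large disagreements, which is presumably why both measures are reported: the correlation captures overall agreement in ranking papers, while the percent of large differences flags individual papers whose scores would change materially depending on the scorer.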
Publication Type: Reports - Research; Speeches/Meeting Papers
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A