ERIC Number: ED267127
Record Type: RIE
Publication Date: 1986-Apr
Reference Count: 0
Essay Topic Difficulty in Relation to Scoring Models.
Dovell, Patricia; Buhr, Dianne C.
This study examined the difficulty level of essay topics used in the large-scale assessment of writing in relation to five different scoring models, and sought to determine what effects the scoring models would have on passing rates. In model one, the examinee's score is the score assigned by a single reader or the sum of the scores assigned by multiple readers. Model two reports a composite writing score that combines a score from a direct assessment with a score from an indirect assessment. Model three adjusts the essay score or combined score by using the objective measure to compute a regression coefficient. Model four scores polychotomous items, such as essays (e.g., the Rasch model). Model five corrects raw scores for differences among raters, adjusted by generalizability theory. Models one and two produced essentially equivalent pass/fail decisions. Model three was found to be inappropriate for making decisions about individual students because of the essay scores' discrete scale. Models four and five could not be applied because the data did not fit the design. Results showed that each of the models has its advantages, depending upon the purpose of the assessment and the nature of the data. (PN)
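A regression adjustment of the kind described for model three can be sketched as follows. This is a minimal illustrative example, not the study's actual procedure: the calibration data, the mixing weight, and the function names are assumptions introduced here for clarity.

```python
# Hypothetical sketch of model three: adjusting an essay score using an
# objective (indirect) measure via ordinary least squares. All data and
# parameters below are illustrative assumptions, not values from the study.

def ols_fit(x, y):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    var = sum((xi - mean_x) ** 2 for xi in x)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

def adjusted_score(essay, objective, slope, intercept, weight=0.5):
    """Blend the raw essay score with its regression-based prediction
    from the objective measure (weight is an assumed mixing parameter)."""
    predicted = slope * objective + intercept
    return weight * essay + (1 - weight) * predicted

# Illustrative calibration sample: objective-test and essay score pairs.
objective = [20, 25, 30, 35, 40]
essay = [2, 3, 3, 4, 5]

slope, intercept = ols_fit(objective, essay)
composite = adjusted_score(essay=3, objective=30,
                           slope=slope, intercept=intercept)
print(composite)  # blended score on the essay scale
```

Note that, as the study found, the discrete scale of essay scores limits such an adjustment for individual pass/fail decisions, since small regression-based shifts can be swamped by the coarseness of the original scale.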
Publication Type: Speeches/Meeting Papers; Reports - Research
Education Level: N/A
Authoring Institution: N/A
Note: Paper presented at the Annual Meeting of the American Educational Research Association (70th, San Francisco, CA, April 16-20, 1986).