ERIC Number: ED170365
Record Type: Non-Journal
Publication Date: 1979-Apr
Pages: 36
Abstractor: N/A
Reference Count: N/A
Item Writing for Domain-Based Tests of Prose Learning.
Roid, Gale; And Others
Differences among test item writers and among different rules for writing multiple-choice items were investigated. Items testing comprehension of a prose passage were varied according to five factors: (1) information density of the passage; (2) item writer; (3) deletion of nouns, as opposed to adjectives, from the sentence in order to construct the item; (4) verbatim versus paraphrased use of keywords; and (5) use of an algorithm for choosing foil words (distractors) versus the item writer's own choice of foils. Two dependent variables were examined: item difficulty, and variability of item difficulty across item writers for a particular technique. Test forms contained multiple-choice items on a 1,000-word prose passage. One form was administered as a pretest, and a different form was administered as a posttest after instruction (reading the passages); subjects were 423 third- to sixth-grade students. Results indicated that passage density increased item difficulty, particularly when writers had freedom in foil construction, and that it introduced item-writer bias despite other efforts to minimize bias. Variability of item difficulty across different writers was reduced when algorithms for writing linguistic-based items were used. (GDC)
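The item-writing procedure the abstract describes (delete a noun or adjective from a passage sentence to form the stem, then select foils either by an algorithm or at the writer's discretion) can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the authors' actual algorithm: the function name `make_item`, the rule of drawing foils from other nouns in the passage, and the example sentence are all assumptions introduced for illustration.

```python
import random

def make_item(sentence, target, passage_nouns, n_foils=3, rng=None):
    """Build a cloze-style multiple-choice item.

    Deletes `target` from `sentence` to form the stem, then picks foils
    algorithmically from the other nouns in the passage -- one plausible
    version of the 'algorithmic foil choice' condition, assumed here.
    """
    rng = rng or random.Random(0)  # fixed seed so item generation is reproducible
    stem = sentence.replace(target, "_____", 1)
    pool = [w for w in passage_nouns if w != target]  # exclude the answer itself
    foils = rng.sample(pool, n_foils)
    options = foils + [target]
    rng.shuffle(options)
    return {"stem": stem, "options": options, "answer": target}

# Hypothetical example: deleting the noun "dam" from a passage sentence.
item = make_item(
    "The beaver builds a dam across the stream.",
    "dam",
    ["beaver", "stream", "lodge", "river", "mud"],
)
print(item["stem"])
print(item["options"])
```

Fixing the foil-selection rule in code, rather than leaving it to each writer, is one way to reduce the across-writer variability in item difficulty that the study reports.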
Publication Type: Speeches/Meeting Papers; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A