ERIC Number: ED453266
Record Type: Non-Journal
Publication Date: 2001-Apr
Pages: 42
Abstractor: N/A
Reference Count: N/A
Determining the Representation of Constructed Response Items in Mixed-Item Format Exams.
Sykes, Robert C.; Truskosky, Denise; White, Hillory
The purpose of this research was to study the effect of three different ways of increasing the number of points contributed by constructed response (CR) items on the reliability of test scores from mixed-item-format tests. The assumption of unidimensionality that underlies the accuracy of item response theory model-based standard error predictions of reliability was evaluated first for these tests. Large samples of students who had taken mixed-format field tests in mathematics at grades 5 and 8 and writing at grades 3 and 8 were available from a state criterion-referenced testing program. Selecting subsets of items from test-blueprint-representative forms of similar content and difficulty permitted an evaluation of the effects of weighting CR items on total test scores relative to criterion scores of putatively greater generalizability. As expected, there was a precision cost to having fewer, though weighted, CR items across a wide range of ability. The increment in standard error attributable to weighting was predictably smaller in the middle of the scale, where the forms were targeted. The magnitude of the increase in error, and the particular portion of the scale where it occurs, are determined by the locations and amount of information contributed by the deleted CR items relative to those that are retained. Implications of different approaches to weighting are discussed. (Contains 5 tables, 10 figures, and 10 references.) (SLD)
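The abstract's central point, that deleting CR items raises the conditional standard error most near the deleted items' locations and that weighting the retained items cannot recover the lost information, follows directly from the additivity of Fisher information in IRT. The sketch below illustrates this with standard formulas (2PL item information a²P(1−P), SE(θ) = 1/√I(θ)); the item parameters are hypothetical, not taken from the report:

```python
import math

def p_2pl(a, b, theta):
    # 2PL probability of a correct response for an item
    # with discrimination a and location b
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_info(a, b, theta):
    # Fisher information of a dichotomous 2PL item at ability theta
    p = p_2pl(a, b, theta)
    return a * a * p * (1.0 - p)

def cond_se(items, theta):
    # Conditional standard error of the ability estimate:
    # SE(theta) = 1 / sqrt(total test information). Weighting a
    # fixed set of items rescales the score metric but adds no
    # information, so only the items actually administered count.
    total = sum(item_info(a, b, theta) for a, b in items)
    return 1.0 / math.sqrt(total)

# Hypothetical CR items as (a, b) pairs; "reduced" drops the
# item located at b = 1.0, as if it were weighted out of the form.
full = [(1.0, -1.0), (1.2, 0.0), (0.9, 1.0)]
reduced = full[:2]

for theta in (-1.0, 0.0, 1.0):
    se_full = cond_se(full, theta)
    se_red = cond_se(reduced, theta)
    # Information is nonnegative, so deleting an item never lowers SE
    assert se_red >= se_full
    print(f"theta={theta:+.1f}  SE(full)={se_full:.3f}  SE(reduced)={se_red:.3f}")
```

Running this shows the SE gap between the full and reduced forms is largest near θ = 1.0, the location of the deleted item, mirroring the report's finding that where on the scale the error increase occurs depends on where the deleted CR items contributed their information.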
Publication Type: Reports - Research; Speeches/Meeting Papers
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A