ERIC Number: ED427078
Record Type: Non-Journal
Publication Date: 1998-Sep
Reference Count: N/A
Problem Choice by Test Takers: Implications for Comparability and Construct Validity. CSE Technical Report 485.
Linn, Robert L.; Betebenner, Damian W.; Wheeler, Kerry S.
For assessments that present problems requiring extended responses and substantial amounts of time, there is often a desire to allow students to choose which of two or more problems they will respond to. Allowing students to choose may give them a better opportunity to demonstrate what they know and can do. On the other hand, choice raises questions about the comparability of scores obtained by students who respond to different problems. Questions of comparability and validity of scores obtained when students are given a choice among alternative problems were investigated using data for approximately 30,000 students from the Oregon State Assessment Program for Grade 10 Mathematics Assessment administered in spring 1997. The assessment consisted of a multiple-choice section and a pair of extended-response problems. On each of six alternate forms, two problems were presented, and students were instructed to choose one. Data from the six forms were analyzed to evaluate the comparability of scores obtained from responses to different tasks and the validity of the results. It was found that problems differed in popularity and that the scores students obtained differed systematically as a function of problem choice. On the other hand, confirmatory factor analysis results across forms for students choosing different problems suggest similar validity for measuring the underlying constructs across problem choices. It is concluded that while choice may be justified, some form of equating adjustment would be needed before making high-stakes decisions based on student performance on problems where choice is allowed. (Contains 23 tables, 2 figures, and 9 references.) (Author/SLD)
Publication Type: Reports - Research
Education Level: N/A
Sponsor: Office of Educational Research and Improvement (ED), Washington, DC.
Authoring Institution: National Center for Research on Evaluation, Standards, and Student Testing, Los Angeles, CA.; Northern Illinois Univ., Oregon. Lorado Taft Field Campus. Dept. of Outdoor Teacher Education.; Colorado Univ., Boulder.; California Univ., Los Angeles. Center for the Study of Evaluation.
Identifiers - Location: Oregon