ERIC Number: ED393935
Record Type: Non-Journal
Publication Date: 1995-Aug
Pages: 62
Abstractor: N/A
Reference Count: N/A
A Task Type for Measuring the Representational Component of Quantitative Proficiency. GRE Board Professional Report No. 92-05P.
Bennett, Randy Elliot; And Others
Two computer-based categorization tasks were developed and pilot-tested. In study 1, the task asked examinees to sort mathematical word-problem stems according to prototypes. Results with 9 faculty members and 107 undergraduates showed that those who sorted well tended to have higher Graduate Record Examination General Test scores and college grades than those who sorted less proficiently. Examinees generally preferred this task to multiple-choice items and felt that it was a fairer measure of their ability to succeed in graduate school. In study 2, the task involved rating the similarity of item pairs. Five mathematics test developers and 35 undergraduate students participated, and the results were analyzed by individual differences multidimensional scaling. Experts produced more scalable ratings overall and attended primarily to two dimensions. Students used the same two dimensions, with the addition of a third. Students whose ratings were more like the experts' tended to have higher admissions test scores. Examinees preferred multiple-choice questions to the rating task and felt them to be fairer. This research helps identify a new task type for admissions tests and instructional assessment. Appendixes contain the sorting task directions, the study questionnaire, and the similarity rating task directions. (Contains 4 figures, 15 tables, and 15 references.) (SLD)
Publication Type: Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: Graduate Record Examinations Board, Princeton, NJ.
Authoring Institution: Educational Testing Service, Princeton, NJ.
Identifiers - Assessments and Surveys: Graduate Record Examinations