ERIC Number: ED376214
Record Type: RIE
Publication Date: 1994-Jun
Cognitive Analysis of a Science Performance Assessment. Project 2.1 Designs for Assessing Individual and Group Problem Solving. Assessing the Validity of Existing Assessments of Problem-Solving Performance in Science: A Taxonomy of Cognitive Processes.
Baxter, Gail P.; And Others
The degree to which performance assessments meet their dual mandate to evaluate student learning and inform instructional practice is not adequately addressed through traditional concerns for reliability and validity. A possible approach is suggested for examining the cognitive activity students engage in during a performance assessment. The approach is demonstrated with the "Mystery Powders" classroom-based assessment being piloted by several large school districts. Thirty-seven fourth- and fifth-grade students were interviewed while they conducted an investigation to determine properties of various powders. Interview protocols and observations were analyzed, and high and low scorers were described. Results indicate that although performance scores and general understanding were low overall, high scorers could be distinguished on several characteristics. Results support the viability of the approach for analyzing the extent to which performance assessments measure higher-order thinking. Implications for instructional practice are considered. Seven figures and four tables are included. (Contains 7 references.) (Author/SLD)
Descriptors: Cognitive Processes, Cognitive Psychology, Educational Assessment, Educational Practices, Elementary School Students, Evaluation Methods, Grade 4, Grade 5, Intermediate Grades, Interviews, Performance Based Assessment, Pilot Projects, Student Evaluation, Test Reliability, Test Use, Test Validity, Thinking Skills
Publication Type: Reports - Research
Education Level: N/A
Sponsor: Office of Educational Research and Improvement (ED), Washington, DC.; National Science Foundation, Washington, DC.
Authoring Institution: National Center for Research on Evaluation, Standards, and Student Testing, Los Angeles, CA.; Pittsburgh Univ., PA. Learning Research and Development Center.