ERIC Number: EJ1054440
Record Type: Journal
Publication Date: 2013-Nov
Abstractor: As Provided
Next-Generation Environments for Assessing and Promoting Complex Science Learning
Quellmalz, Edys S.; Davenport, Jodi L.; Timms, Michael J.; DeBoer, George E.; Jordan, Kevin A.; Huang, Chun-Wei; Buckley, Barbara C.
Journal of Educational Psychology, v105 n4 p1100-1114 Nov 2013
How can assessments measure complex science learning? Although traditional multiple-choice items can effectively measure declarative knowledge such as scientific facts or definitions, they are considered less well suited for providing evidence of science inquiry practices such as making observations or designing and conducting investigations. Thus, students who perform proficiently in "science" as measured by static, conventional tests may have strong factual knowledge but little ability to apply that knowledge to conduct meaningful investigations. As technology has advanced, interactive, simulation-based assessments promise to capture information about these more complex science practice skills. In the current study, we test whether interactive assessments are more effective than traditional, static assessments at discriminating student proficiency across 3 types of science practices: (a) identifying principles (e.g., recognizing principles), (b) using principles (e.g., applying knowledge to make predictions and generate explanations), and (c) conducting inquiry (e.g., designing experiments). We explore 3 modalities of assessment: "static," most similar to traditional items, in which the system presents still images and does not respond to student actions; "active," in which the system presents dynamic portrayals, such as animations, that students can observe and review; and "interactive," in which the system depicts dynamic phenomena and responds to student actions. We use 3 analyses (a generalizability study, confirmatory factor analysis, and multidimensional item response theory) to evaluate how well each assessment modality distinguishes performance on these 3 types of science practices. The comparison of performance on static, active, and interactive items found that interactive assessments may be more effective than static assessments at discriminating student proficiencies for conducting inquiry.
Descriptors: Science Education, Educational Assessment, Psychometrics, Science Tests, Science Process Skills, Computer Assisted Testing, Computer Simulation, Scientific Concepts, Scientific Principles, Inquiry, Interaction, Multimedia Instruction, Multimedia Materials, Middle School Students, Secondary School Science, Item Response Theory, Evaluation Methods, Program Effectiveness, Factor Analysis, Multitrait Multimethod Techniques
American Psychological Association. Journals Department, 750 First Street NE, Washington, DC 20002. Tel: 800-374-2721; Tel: 202-336-5510; Fax: 202-336-5502; e-mail: firstname.lastname@example.org; Web site: http://www.apa.org
Publication Type: Journal Articles; Reports - Research
Education Level: Middle Schools; Secondary Education; Junior High Schools
Sponsor: National Science Foundation
Authoring Institution: N/A
Grant or Contract Numbers: DRL-0814776; DRL-0733345