ERIC Number: ED517680
Record Type: Non-Journal
Publication Date: 2009
Pages: 202
Abstractor: As Provided
Reference Count: N/A
ISBN: 978-1-1097-6934-0
Analysis of an Informed Peer Review Matching Algorithm and Its Impact on Student Work on Model-Eliciting Activities
Verleger, Matthew Alan
ProQuest LLC, Ph.D. Dissertation, Purdue University
Model-Eliciting Activities (MEAs) are realistic, open-ended, client-driven engineering problems designed to foster students' mathematical modeling abilities. Since 2005, the MEAs used in Purdue University's first-year engineering core course have included a double-blind peer review wherein individuals in the course (peers) are randomly assigned a student team's response to an MEA to review. In 2007, a calibration exercise, whereby students evaluated a prototypical piece of student work and compared their review to that of an expert, was added to the MEA implementation sequence in an attempt to increase the quality of feedback individuals provided during the peer review. At that time the reviewer-reviewee assignment process remained random, so the calibration exercise's value was limited to the self-reflective knowledge a student gained from comparing their responses on the MEA Rubric to those of the expert. This research investigated the impact of informed peer review matching algorithms on the quality of teams' final MEA responses. The algorithms use data from the calibration exercise and Teaching Assistant (TA) marks on a team's first-draft response as measurements of the reviewers' accuracy and the reviewees' degree of assistance needed, in order to make more informed matches. Three informed assignment methods were developed, and one was thoroughly investigated to determine its impact. The violation of multiple critical assumptions underlying the assignment method resulted in no apparent differences between the selected informed assignment method and the blind random assignment method. The failure of those assumptions indicates that the existing training methods and/or the rubric are inadequate for producing sufficiently valid TA and student marks on MEAs. Details of how the assumptions were violated, and what must be done to resolve them in order to better investigate the research question, are discussed.
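The dissertation itself is not reproduced here, so the matching algorithms' actual implementations are unknown. As a purely illustrative sketch, one informed-matching strategy consistent with the abstract's description (pairing the reviewers who scored most accurately on the calibration exercise with the teams whose first drafts received the lowest TA marks) might look like the following; all names and fields are hypothetical:

```python
def informed_match(reviewers, teams):
    """Pair the most accurate reviewers with the teams needing most help.

    reviewers: list of (reviewer_id, calibration_accuracy) tuples, where
               higher accuracy means the reviewer's calibration review was
               closer to the expert's.
    teams:     list of (team_id, ta_first_draft_mark) tuples, where a
               lower mark means more assistance is needed.
    Returns a list of (reviewer_id, team_id) assignments; reviewers are
    distributed round-robin so every team receives reviews.
    """
    # Most accurate reviewers first.
    by_accuracy = sorted(reviewers, key=lambda r: r[1], reverse=True)
    # Teams with the lowest TA marks (greatest need) first.
    by_need = sorted(teams, key=lambda t: t[1])
    return [
        (reviewer_id, by_need[i % len(by_need)][0])
        for i, (reviewer_id, _) in enumerate(by_accuracy)
    ]
```

This greedy pairing is only one of many possible informed assignment schemes; the three methods actually developed in the dissertation may differ substantially.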
[The dissertation citations contained here are published with the permission of ProQuest LLC. Further reproduction is prohibited without permission. Copies of dissertations may be obtained by telephone: 1-800-521-0600. Web page:]
ProQuest LLC. 789 East Eisenhower Parkway, P.O. Box 1346, Ann Arbor, MI 48106. Tel: 800-521-0600; Web site:
Publication Type: Dissertations/Theses - Doctoral Dissertations
Education Level: Higher Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Location: Indiana