Peer reviewed
ERIC Number: EJ790917
Record Type: Journal
Publication Date: 2008-May
Pages: 21
Abstractor: Author
ISBN: N/A
ISSN: 0360-1315
EISSN: N/A
Evaluating Student Response to WeBWorK, a Web-Based Homework Delivery and Grading System
Roth, Vicki; Ivanchenko, Volodymyr; Record, Nicholas
Computers & Education, v50 n4 p1462-1482 May 2008
As the use of web-based homework delivery and checking systems expands, we have a greater need to evaluate how students engage with these systems and how changes in the systems influence student behavior. Reported here is an assessment of WeBWorK, an open-source web-based homework program used largely in postsecondary math and science courses, with recent extension to high school courses. WeBWorK draws from a large library of problems to generate individualized assignments and provides immediate feedback on the correctness of students' answers. WeBWorK accepts entries in multiple-choice, formula, and numerical formats. The entries recorded by the system create an extensive database of student interactions with the system. In our two-part study, we (1) review 2387 student surveys about WeBWorK collected over three semesters across a range of mathematics courses and (2) test a model designed to classify student responses to system feedback, permitting a fine-grained, keystroke-level analysis of the system's records for a cohort of 96 Calculus I students. Based on these data, our goals are, first, to understand how students use the system and respond to its feedback; second, to analyze the impact of upgrades to the system interface on student users; and third, to evaluate the efficiency of training raters to classify student records. We find that the introduction of new features to the WeBWorK interface (improvements generated by our initial assessment) was welcomed by most students and cut the overall number of submitted errors in half by the third year of the study. Importantly, the interface improvements reduced the number of errors that were not pedagogically useful. We also find that our error classification model, called here the Student Response Model, was a good fit for our student sample, as reflected by the inter-rater reliability of the three raters who coded the student records. In sum, our error classification model appears to be useful for analyzing formula entries. To make this assessment model widely available, we designed a system of exercises to train raters to classify student response files for WeBWorK and similar web-based systems.
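[Editor's note: The abstract refers to immediate correctness feedback on numerical and formula entries. WeBWorK itself is implemented in Perl, and its actual answer checkers live in its problem library; the Python sketch below is purely illustrative, with hypothetical function names, of a common technique such systems use: grade a formula entry by evaluating the student's expression and the reference expression at several sample points and comparing numerically within a tolerance.]

    import math
    import random

    def grade_numeric(student_value, correct_value, rel_tol=1e-4):
        """Accept a numeric entry within a relative tolerance of the key."""
        return math.isclose(student_value, correct_value, rel_tol=rel_tol)

    def grade_formula(student_expr, correct_expr, var="x",
                      n_points=5, rel_tol=1e-4):
        """Compare two single-variable formulas by evaluating both at
        random sample points and checking numerical agreement, rather
        than by full symbolic manipulation."""
        # Restrict eval() to a small namespace of math functions.
        safe = {name: getattr(math, name) for name in
                ("sin", "cos", "tan", "exp", "log", "sqrt", "pi", "e")}
        for _ in range(n_points):
            x = random.uniform(0.5, 2.0)  # avoid singular points near 0
            env = dict(safe, **{var: x})
            try:
                s = eval(student_expr, {"__builtins__": {}}, env)
                c = eval(correct_expr, {"__builtins__": {}}, env)
            except Exception:
                return False  # malformed entry: reject with immediate feedback
            if not math.isclose(s, c, rel_tol=rel_tol):
                return False
        return True

    # Algebraically equivalent entries are accepted; wrong ones are not.
    print(grade_formula("x + x", "2*x"))  # True
    print(grade_formula("x", "2*x"))      # False

Sampling at multiple points, rather than matching the entry string literally, is what lets a student submit any algebraically equivalent form of the answer, which is the kind of formula-entry behavior the study's keystroke-level records capture.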
Elsevier. 6277 Sea Harbor Drive, Orlando, FL 32887-4800. Tel: 877-839-7126; Tel: 407-345-4020; Fax: 407-363-1354; e-mail: usjcs@elsevier.com; Web site: http://www.elsevier.com
Publication Type: Journal Articles; Reports - Evaluative
Education Level: High Schools; Higher Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A