Peer reviewed
ERIC Number: EJ930249
Record Type: Journal
Publication Date: 2011
Pages: 14
Abstractor: As Provided
Reference Count: 22
ISBN: N/A
ISSN: 1479-4403
An Automated Individual Feedback and Marking System: An Empirical Study
Barker, Trevor
Electronic Journal of e-Learning, v9 n1 p1-14 2011
The recent National Student Survey showed that feedback to students is an ongoing problem in Higher Education. This paper reports on the extension of our past research into the provision of automated feedback for objective testing. In the research presented here, the system has been further developed to mark practical and essay questions and to provide automated feedback. Recent research at the University of Hertfordshire showed that learners and tutors accept and value our automated feedback approach based on objective tests and Computer Adaptive Testing. The research reported in this paper is an important extension to that work: the automated feedback system developed for objective testing now also covers practical tests and essay-type questions. The system, which can be used in any subject area, is based on a simple marking scheme created by the subject tutor as a text file following a simple template. Marks for each option and a set of feedback statements are held in a database on a computer. As the teacher awards marks for each question, an individual feedback file is created automatically for each learner. Teachers may also add or modify comments for each learner and save additional feedback to the database for later use. Each individual feedback file is then emailed automatically to learners. The paper explains the development of the system and reports testing and evaluation with undergraduate Computer Science students: 350 first-year students (one final practical test), 120 second-year students (one written and one practical test) and 100 final-year students (one final practical test). The time to mark practical and essay-type tests was reduced by more than 30% in all cases compared with previous years. More importantly, it was possible to provide good-quality individual feedback to learners rapidly: feedback was delivered to all learners within three weeks of the test submission date.
For end-of-module tests the system was particularly beneficial, as it had previously proven difficult to provide feedback after modules had ended. The paper presents examples of the feedback provided and explains the development of the system using a user-centred approach based on student and staff evaluation. Comments from staff teaching on these modules, and from a sample of students who took part in this series of evaluations, are presented. The results of these evaluations were very positive and are reported in the paper, showing the changes made to the system at each iteration of the development cycle. The provision of fast, effective feedback is vital, and this system was found to be an important addition to the tools available. (Contains 8 tables and 4 figures.)
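The abstract describes a workflow in which a tutor authors a marking scheme as a plain text file following a simple template, marks and feedback statements are stored per option, and an individual feedback report is assembled automatically as marks are awarded. The paper does not give the actual file format or implementation, so the sketch below is purely illustrative: the scheme syntax, function names, and in-memory dictionary (standing in for the system's database) are all assumptions, not the authors' design.

```python
# Illustrative sketch only: the scheme format and all names below are
# hypothetical stand-ins for the tutor-authored template and database
# described in the abstract.

# A toy marking-scheme "text file": one question per line, each option
# carrying a mark value and a feedback statement.
SCHEME_TEXT = """\
Q1|a=2:Correct use of normalisation.|b=0:Revise normalisation rules.
Q2|a=5:Good query structure.|b=2:Partially correct join.
"""

def parse_scheme(text):
    """Parse the tutor's scheme into {question: {option: (marks, feedback)}}."""
    scheme = {}
    for line in text.strip().splitlines():
        qid, *options = line.split("|")
        scheme[qid] = {}
        for opt in options:
            key, rest = opt.split("=", 1)
            marks, feedback = rest.split(":", 1)
            scheme[qid][key] = (int(marks), feedback)
    return scheme

def build_feedback(scheme, awarded, extra_comments=None):
    """Assemble one learner's feedback report from the options the marker
    selected, plus any free-text tutor comments."""
    lines, total = [], 0
    for qid, option in awarded.items():
        marks, feedback = scheme[qid][option]
        total += marks
        lines.append(f"{qid}: {marks} marks - {feedback}")
    for comment in (extra_comments or []):
        lines.append(f"Tutor comment: {comment}")
    lines.append(f"Total: {total}")
    return "\n".join(lines)

scheme = parse_scheme(SCHEME_TEXT)
report = build_feedback(scheme, {"Q1": "a", "Q2": "b"},
                        ["See chapter 4 for joins."])
print(report)
```

In the system the abstract describes, a report like this would then be emailed to the learner automatically; here it is simply printed.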
Academic Conferences Limited. Curtis Farm, Kidmore End, Nr Reading, RG4 9AY, UK. Tel: +44-1189-724148; Fax: +44-1189-724691; e-mail: info@academic-conferences.org; Web site: http://academic-conferences.org/ejournals.htm
Publication Type: Journal Articles; Reports - Evaluative
Education Level: Higher Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Location: United Kingdom