ERIC Number: EJ834510
Record Type: Journal
Publication Date: 2009-May
Abstractor: As Provided
Reference Count: N/A
The Reliability of Workplace-Based Assessment in Postgraduate Medical Education and Training: A National Evaluation in General Practice in the United Kingdom
Murphy, Douglas J.; Bruce, David A.; Mercer, Stewart W.; Eva, Kevin W.
Advances in Health Sciences Education, v14 n2 p219-232 May 2009
The aim was to investigate the reliability and feasibility of six potential workplace-based assessment methods in general practice training: criterion audit, multi-source feedback from clinical and non-clinical colleagues, patient feedback (the CARE Measure), referral letters, significant event analysis, and video analysis of consultations. The performance of GP registrars (trainees) was evaluated with each tool to assess the tools' reliability and their feasibility, given the raters and number of assessments needed; participant experience of the process was determined by questionnaire. 171 GP registrars and their trainers, drawn from nine deaneries (representing all four countries in the UK), participated. The ability of each tool to differentiate between doctors (reliability) was assessed using generalisability theory. Decision studies were then conducted to determine the number of observations required to achieve an acceptably high reliability for "high-stakes assessment" using each instrument. Finally, descriptive statistics were used to summarise participants' ratings of their experience of using these tools. Multi-source feedback from colleagues and patient feedback on consultations emerged as the two methods most likely to offer a reliable and feasible opinion of workplace performance. Reliability coefficients of 0.8 were attainable with 41 CARE Measure patient questionnaires and six clinical and/or five non-clinical colleagues per doctor when assessed on two occasions. For the other four methods tested, 10 or more assessors were required per doctor to achieve a reliable assessment, making the feasibility of their use in high-stakes assessment extremely low. Participant feedback did not raise any major concerns regarding the acceptability, feasibility, or educational impact of the tools.
The combination of patient and colleague views of doctors' performance, coupled with reliable competence measures, may offer a suitable evidence base on which to monitor progress and completion of doctors' training in general practice.
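The decision-study logic described in the abstract can be illustrated with the classical Spearman–Brown projection, which relates the reliability of a single observation to the reliability of the mean of n observations. This is a minimal sketch, not the authors' analysis (which used generalisability theory on the study data); the single-questionnaire reliability of about 0.089 is a hypothetical value chosen here so that 41 questionnaires project to a reliability of 0.8, consistent with the figure reported.

```python
import math

def projected_reliability(n: int, r1: float) -> float:
    """Spearman-Brown projection: reliability of the mean of n
    observations, given the reliability r1 of one observation."""
    return n * r1 / (1 + (n - 1) * r1)

def observations_for_target(target: float, r1: float) -> int:
    """Smallest number of observations whose projected reliability
    reaches the target (e.g. 0.8 for high-stakes assessment)."""
    n = target * (1 - r1) / (r1 * (1 - target))
    return math.ceil(n)
```

With the assumed single-questionnaire reliability of 0.089, `observations_for_target(0.8, 0.089)` gives 41; tools with lower per-observation reliability require rapidly growing numbers of assessors, which is the feasibility problem the abstract notes for the other four methods.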
Descriptors: Reliability, Graduate Medical Education, Family Practice (Medicine), Vocational Evaluation, Evaluation Methods, Feedback (Response), Graduate Students, Medical Students, Physicians, Student Evaluation, Generalizability Theory, Foreign Countries
Springer. 233 Spring Street, New York, NY 10013. Tel: 800-777-4643; Tel: 212-460-1500; Fax: 212-348-4505; e-mail: firstname.lastname@example.org; Web site: http://www.springerlink.com
Publication Type: Journal Articles; Reports - Research
Education Level: Higher Education; Postsecondary Education
Authoring Institution: N/A
Identifiers - Location: United Kingdom