Peer reviewed
ERIC Number: EJ821766
Record Type: Journal
Publication Date: 2008-Dec
Pages: 16
Abstractor: As Provided
Reference Count: 0
ISSN: ISSN-1382-4996
Undesired Variance Due to Examiner Stringency/Leniency Effect in Communication Skill Scores Assessed in OSCEs
Harasym, Peter H.; Woloschuk, Wayne; Cunning, Leslie
Advances in Health Sciences Education, v13 n5 p617-632 Dec 2008
Physician-patient communication is a clinical skill that can be learned and has a positive impact on patient satisfaction and health outcomes. A concerted effort at all medical schools is now directed at teaching and evaluating this core skill. Student communication skills are often assessed by an Objective Structured Clinical Examination (OSCE). However, it is unknown what sources of error variance are introduced into examinee communication scores by various OSCE components. This study primarily examined the effect different examiners had on the evaluation of students' communication skills assessed at the end of a family medicine clerkship rotation. The communication performance of clinical clerks from the Classes of 2005 and 2006 was assessed using six OSCE stations. Performance was rated at each station using the 28-item Calgary-Cambridge guide. Item Response Theory analysis using a Multifaceted Rasch model was used to partition the various sources of error variance and generate a "true" communication score from which the effects of examiner, case, and item are removed. Variance and reliability of scores were as follows: communication scores (0.20 and 0.87), examiner stringency/leniency (0.86 and 0.91), case (0.03 and 0.96), and item (0.86 and 0.99), respectively. All facet scores were reliable (0.87-0.99). Examiner variance (0.86) was more than four times the examinee variance (0.20). About 11% of the clerks' outcome status shifted using "true" rather than observed/raw scores. There was large variability in examinee scores due to variation in examiner stringency/leniency behaviors that may impact pass/fail decisions. Exploring the benefits of examiner training and employing "true" scores generated using Item Response Theory analyses prior to making pass/fail decisions are recommended.
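The Multifaceted Rasch model named in the abstract adjusts each examinee's score for examiner, case, and item effects on a common logit scale. A minimal dichotomous sketch of that idea is below; the parameter names and values are illustrative assumptions (the study itself rated performance with the 28-item Calgary-Cambridge guide, which would call for a polytomous model):

```python
import math

def rasch_prob(theta, item_diff, examiner_sev, case_diff):
    """Probability of earning credit under a simple many-facet Rasch
    model (dichotomous sketch): the logit is examinee ability minus
    item difficulty, examiner severity, and case difficulty."""
    logit = theta - item_diff - examiner_sev - case_diff
    return 1.0 / (1.0 + math.exp(-logit))

# For the same examinee ability, a lenient examiner (negative
# severity) yields a higher success probability than a stringent one.
p_lenient = rasch_prob(theta=0.5, item_diff=0.0, examiner_sev=-0.8, case_diff=0.1)
p_strict = rasch_prob(theta=0.5, item_diff=0.0, examiner_sev=0.8, case_diff=0.1)
```

Fitting such a model to the full rating matrix is what lets the analysis partition variance by facet and report "true" (examiner-adjusted) scores, which is how 11% of clerks could change outcome status relative to raw scores.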
Springer. 233 Spring Street, New York, NY 10013. Tel: 800-777-4643; Tel: 212-460-1500; Fax: 212-348-4505.
Publication Type: Journal Articles; Reports - Research
Education Level: Higher Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A