Peer reviewed
ERIC Number: EJ1174846
Record Type: Journal
Publication Date: 2018-May
Pages: 16
Abstractor: As Provided
ISBN: N/A
ISSN: 1382-4996
EISSN: N/A
Applying Kane's Validity Framework to a Simulation Based Assessment of Clinical Competence
Tavares, Walter; Brydges, Ryan; Myre, Paul; Prpic, Jason; Turner, Linda; Yelle, Richard; Huiskamp, Maud
Advances in Health Sciences Education, v23 n2 p323-338 May 2018
Assessment of clinical competence is complex and inference-based. Trustworthy and defensible assessment processes must have favourable evidence of validity, particularly where decisions are considered high stakes. We aimed to organize, collect and interpret validity evidence for a high-stakes simulation-based assessment strategy for certifying paramedics, using Kane's validity framework, which some report as challenging to implement. We describe our experience using the framework, identifying challenges, decision points, interpretations and lessons learned. We considered data related to four inferences (scoring, generalization, extrapolation, implications) occurring during assessment and treated validity as a series of assumptions we must evaluate, resulting in several hypotheses and proposed analyses. We then interpreted our findings across the four inferences, judging whether the evidence supported or refuted our proposed uses of the assessment data. Data evaluating "Scoring" included: (a) desirable tool characteristics, with acceptable inter-item correlations; (b) strong item-total correlations; (c) low error variance for items and raters; and (d) strong inter-rater reliability. Data evaluating "Generalization" included: a robust sampling strategy capturing the majority of relevant medical directives, skills and national competencies, and good overall and inter-station reliability. Data evaluating "Extrapolation" included: low correlations between assessment scores by dimension and clinical errors in practice. Data evaluating "Implications" included: low error rates in practice. Interpreting our findings according to Kane's framework, we suggest the evidence for scoring, generalization and implications supports use of our simulation-based paramedic assessment strategy as a certifying exam; however, the extrapolation evidence was weak, suggesting exam scores did not predict clinical error rates. Our analysis represents a worked example others can follow when using Kane's validity framework to evaluate, and iteratively develop and refine, assessment strategies.
Springer. 233 Spring Street, New York, NY 10013. Tel: 800-777-4643; Tel: 212-460-1500; Fax: 212-348-4505; e-mail: service-ny@springer.com; Web site: http://www.springerlink.com
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A