50 Years of ERIC
The Education Resources Information Center (ERIC) is celebrating its 50th birthday! First opened on May 15, 1964, ERIC continues its long tradition of ongoing innovation and enhancement.

Learn more about the history of ERIC here.

Showing all 5 results
Peer reviewed
Direct link
Kreiter, Clarence D.; Green, Joseph; Lenoch, Susan; Saiki, Takuya – Advances in Health Sciences Education, 2013
Given medical education's longstanding emphasis on assessment, it seems prudent to evaluate whether our current research and development focus on testing makes sense. Since any intervention within medical education must ultimately be evaluated based upon its impact on student learning, this report seeks to provide a quantitative accounting of…
Descriptors: Medical Education, Medical Students, Statistical Analysis, Testing
Peer reviewed
Direct link
Kreiter, Clarence D. – Advances in Health Sciences Education, 2007
The academic performance consequences of relying solely on non-cognitive factors for selecting applicants above a GPA and MCAT threshold have not been fully considered in the literature. This commentary considers the impact of using a "threshold approach" on academic performance as assessed with the USMLE Step 1.
Descriptors: Grade Point Average, Medical Schools, Academic Achievement, Cutting Scores
Peer reviewed
Direct link
Didier, Thomas; Kreiter, Clarence D.; Buri, Russell; Solow, Catherine – Advances in Health Sciences Education, 2006
Background: Grading standards vary widely across undergraduate institutions. If, during the medical school admissions process, GPA is considered without reference to the institution attended, it will disadvantage applicants from undergraduate institutions employing rigorous grading standards. Method: A regression-based GPA institutional equating…
Descriptors: Grade Point Average, Medical Schools, Validity, Grading
Peer reviewed
Direct link
Ferguson, Kristi J.; Kreiter, Clarence D. – Advances in Health Sciences Education, 2004
Purpose: To examine the validity of using scores from a clinical evaluation form as an assessment of clinical competence. Method: Investigators collected a longitudinal clinical skills assessment database that included scores reflecting performance on standardized patient interactions, case-based learning performance, scores on multiple-choice…
Descriptors: Generalizability Theory, Medical Students, Validity, Program Effectiveness
Peer reviewed
Direct link
Kreiter, Clarence D.; Yin, Ping; Solow, Catherine; Brennan, Robert L. – Advances in Health Sciences Education, 2004
Purpose: Determining the valid and fair use of the interview for medical school admissions is contingent upon a demonstration of the reproducibility of interview scores. This study seeks to establish the generalizability of interview scores, first assessing the existing research evidence, and then analyzing data from a non-experimental independent…
Descriptors: Evidence, Generalizability Theory, Replication (Evaluation), Reliability