The objective structured clinical examination (OSCE) has substantial validity evidence for assessing learner performance at the "shows how" level of Miller's pyramid.1 However, program directors may perceive the OSCE as a complex, resource- and time-intensive assessment.2,3

An OSCE is a standardized, objective assessment that focuses on clinical skills, attitudes, and problem-solving abilities across the cognitive, psychomotor, and affective domains.1 Typically, learners interact with a simulator (eg, a trained actor or mannequin) augmented with other elements of the medical workplace, ranging from a mock electronic health record or a consultant call to a team huddle. Learners move through a series of time-limited stations and are assessed with a standardized scoring rubric, such as a checklist (analytical scoring) or a global rating scale.2,3 To maintain validity and reliability, OSCEs pair these standardized rubrics with rater training and peer-review feedback.2,3

Rip Out Action Items

Program directors should:

1. Determine whether an OSCE is the most effective approach to assess resident performance gaps.

2. Form an OSCE assessment team and outline roles, responsibilities, and an implementation timeline.

3. Seek to adapt existing OSCEs to meet needs, and start with formative OSCE assessments.

4. Implement ongoing review to improve OSCE station reliability and validity.

OSCEs are used worldwide for both high-stakes summative assessment and formative learning in medical and interprofessional education.2,3 OSCE development processes and case resources are available at little to no cost through peer-reviewed repositories, professional societies, and publications. See the table, Steps in Considering an OSCE, for an example of this decision-making process.

How You Can Start TODAY

1. Is an OSCE a good return on investment for me? Ask: "What can the OSCE assess that other approaches cannot?" Consider using OSCEs when a relatively safe setting is needed to allow learners to practice and obtain feedback on difficult, complex, or "rare but critical" situations.

2. Utilize OSCE station repositories. Adopt or adapt cases and stations from a peer-reviewed source before you write your own; they are more likely to be reliable and valid. Good sources include MedEdPORTAL (http://www.mededportal.org), the Association of Standardized Patient Educators (http://www.aspeducators.org), and OSCE experts' YouTube videos (http://www.youtube.com/user/TheOSCEstation). If you are new to OSCEs, start with a formative, low-stakes OSCE.

3. Institute an OSCE team and specify roles. Form a team with content and OSCE expertise. Adapt or develop an OSCE blueprint (competencies needed versus station types) and an implementation timeline checklist. Finalize stations, including OSCE logistics and the selection and training of station authors, assessors, and standardized patients. Plan for monitoring, scoring, and trainee feedback.

4. Train the assessors. Train all assessors on the standardized scoring rubrics to optimize reliability. Consider whether assessors can be replaced by mannequins (with data streams) or trained actors, weighing both time and cost.

What You Can Do LONG TERM

1. Conduct a needs analysis and a sustainability assessment. Continuously identify assessment gaps appropriate for an OSCE. Ask your Clinical Competency Committee for input, and review your program's annual performance evaluations. As you gain experience, consider shifting from formative to summative OSCE stations. Keep your chair, faculty, and residents engaged in the process as case developers or assessors.

2. Secure institutional buy-in and collaboration. Get buy-in from your designated institutional official (DIO). Collaboratively design and implement multidisciplinary OSCE stations on common competencies that can be used across multiple programs and learner groups.

3. Enhance data analysis. Find a statistician (through a local medical school or a national society) to evaluate your data, particularly if you are using the OSCE for summative (high-stakes) assessment or as an outcome measure for educational interventions whose results may be shared through publication.

4. Solicit feedback. Invite OSCE actors, learners, and assessors to note general and station-specific issues to further clarify goals, define tasks, and improve stations. Invite faculty from other graduate medical education programs and institutions to evaluate the OSCE and offer feedback.

References

1. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(suppl 9):S63-S67.

2. Khan KZ, Ramachandran S, Gaunt K, et al. The objective structured clinical examination (OSCE): AMEE Guide No. 81. Part I: an historical and theoretical perspective. Med Teach. 2013;35(9):e1437-e1446.

3. Khan KZ, Gaunt K, Ramachandran S, et al. The objective structured clinical examination (OSCE): AMEE Guide No. 81. Part II: organisation & administration. Med Teach. 2013;35(9):e1447-e1463.