There have been a number of important stakeholder opinions critical of the Step 2 Clinical Skills Examination (CS) in the United States Medical Licensing Examination (USMLE) licensure sequence. The Resident Program Director (RPD) Awareness survey was convened to gauge perceptions of current and potential Step 2 CS use, attitudes towards the importance of residents' clinical skills, and awareness of a medical student petition against Step 2 CS. This cross-sectional survey yielded 205 responses from a representative sampling of RPDs across various specialties, regions, and program sizes. RPDs view clinical skills as very important and perceive a lack of readiness among entering residents in communication skills and professionalism competencies. Most RPDs use Step 2 CS to screen residency applicants, and there is a desire for more specific information from score reports in these areas. Few of the respondents were aware of a current medical student petition against Step 2 CS. RPDs rely on a nationally standardized assessment of clinical skills as a criterion in applicant selection. These findings are valuable as the USMLE program continues to evolve and pursue its validity research agenda.
A passing outcome on the USMLE Step 2 Clinical Skills (CS), a performance-based assessment using standardized patients, is required for physician licensure in the United States.1 The CS exam assesses examinees in three critical areas: Integrated Clinical Encounter (ICE), Communication and Interpersonal Skills (CIS), and Spoken English Proficiency (SEP). The ICE component consists of data-gathering skills (history-taking and physical exam), documentation of diagnostic impressions, justification of potential diagnoses (clinical reasoning), and selection of diagnostic studies. The CIS component is used to score examinees' patient-centered communication skills, while the SEP component assesses the clarity of examinees' spoken English. Examinees must pass all three components in a single attempt to pass the exam.
USMLE examinations support decisions of medical licensing authorities in the United States.2,3 However, Step 2 CS is also used for non-licensure decision-making purposes by other stakeholders, residency program directors among them. Nearly all medical schools accredited by the Liaison Committee on Medical Education (LCME) require that students pass Step 2 CS prior to graduation.4 More than three-quarters of residency programs use successful Step 2 CS completion as part of their selection process.5 There is increasing evidence to support the use of the exam in predicting future clinical skills performance,6–8 but no studies exist that document the perceptions of residency program directors (RPDs) towards the utility of Step 2 CS.
In 2016, a student-initiated petition was proposed with the stated aim of removing Step 2 CS from the licensure sequence for United States Medical Graduates (USMGs). The petition argued that the cost of the exam was high and the exam did not provide value given the relatively low number of USMGs who fail the exam. It also argued that there was a lack of data linking patient outcomes with exam performance, and that the skills evaluated in the Step 2 CS exam “can be more efficiently evaluated by individual medical schools” (http://endstep2cs.com/petition/). As the movement behind the petition was gaining momentum, leading state medical societies and the American Medical Association (AMA) supported various efforts to either eliminate Step 2 CS or transfer the process of clinical skills assessment to medical schools.9–11 After much consideration, the AMA elected to rescind its policy, titled, “Feasibility and Appropriateness of Transferring Jurisdiction over Required Clinical Skills Examinations to LCME-Accredited and COCA-Accredited Medical Schools (D-295.988),” due to “inadequate stakeholder support…and until a viable alternative can be identified.”12 More recently, the Directors of Clinical Skills Courses organization has supported the value and continuation of Step 2 CS, citing, as potential consequences of eliminating the exam, a devaluing of clinical skills in medical education; forfeiture of the national standard and generalizability; a threat to robust examination psychometrics; an increase in overall costs for examinees; and a failure to protect the public.13 Recent updates on the medical student petition against Step 2 CS have included student concerns that any subjective assessment (conceivably referring to the use of human raters) “can derail a medical student's career plans even if that student has succeeded on every prior examination and has performed well on the wards.”14
The National Board of Medical Examiners (NBME), in response to the criticisms against Step 2 CS, and to inform its USMLE predictive validity research program, commissioned the Resident Program Director Awareness Report in late 2016. This report utilized a set of email and phone surveys with RPDs across the United States. The initial aim was to inform research related to RPD perceptions of Step 2 CS current and potential utility, with a secondary goal of determining RPD awareness of the petition and evaluating RPDs' level of agreement with it. The results of this survey, reported here, were used to evaluate RPD attitudes about the importance of resident clinical skills and whether additional information on Step 2 CS outcomes could be useful for residencies. A better understanding of RPD perceptions about clinical skills in general and Step 2 CS in particular can reveal ways to address student concerns and help identify areas of change that could benefit all users of Step 2 CS outcomes.
NBME commissioned SuAzio Consulting (Newtown Square, PA) for survey development, identification of respondents, delivery of the double-blind survey* over email and phone, and compilation of results. A cross-sectional representative sampling of RPDs across specialties, regions, and program sizes was sought via listings of RPDs in the ACGME program directory. The number of RPDs to be contacted was predetermined to ensure a margin of error of <10% at a 95% confidence level in the ultimate responses. A small number of program contacts were made initially to qualitatively calibrate the terminology and clarity of the questions. Ultimately, 205 RPDs (out of 10,677 total) participated in the survey; these respondents had an average of seven years' experience in their roles and an average of 21 residents per program. The most common specialties were Family Medicine (23), Surgery (19), Pathology (17), Pediatrics (15), Emergency Medicine (13), Neurology (12), Psychiatry (11), Urology (10), and Internal Medicine (10).**
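The stated sampling precision can be checked with the standard margin-of-error formula for a proportion with a finite-population correction. This is a back-of-the-envelope sketch, not the survey's actual power calculation (which is not reported); it assumes the most conservative proportion, p = 0.5.

```python
import math

def margin_of_error(n, N, p=0.5, z=1.96):
    """Margin of error for an estimated proportion at 95% confidence
    (z = 1.96), with a finite-population correction for a sampling
    frame of size N. p = 0.5 gives the widest (most conservative) bound."""
    se = math.sqrt(p * (1 - p) / n)          # standard error of the proportion
    fpc = math.sqrt((N - n) / (N - 1))       # finite-population correction
    return z * se * fpc

# 205 respondents drawn from a frame of 10,677 RPDs
moe = margin_of_error(205, 10_677)
print(f"{moe:.1%}")  # roughly 6.8%, within the stated <10% target
```

With 205 respondents out of 10,677, the worst-case margin of error is about ±6.8% at 95% confidence, consistent with the <10% target described above.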
The survey encompassed two areas: Current Practice and Media Use and Perception. Current Practice contained rating scales, checklists, and open-ended questions on four topics: the importance of clinical skills in general and for sub-elements (7 questions using a scale from 1 to 7, with 1 representing "not important" and 7 representing "most important"); perceived gaps in clinical skills readiness of residents and current measures of clinical skills other than Step 2 CS (13 questions); awareness and perception of the Step 2 CS petition (3 questions); and usage of Step 2 CS and perceived utility of additional score information (8 questions). Media Use and Perception contained 6 questions on the best usage and perceived credibility of Step 2 CS information published across various sources. The focus of this paper is the results of the Current Practice section; the Media Use and Perception responses were used to create respondent subgroups for this study. Respondents were grouped by: (1) size of residency program; (2) RPD years of experience; (3) awareness of the student petition; and (4) professional usage of social media.
Descriptive statistics were calculated for the overall group and for subgroups, where subgroups were based on demographic information. A set of one-way analysis of variance (ANOVA) models was fit using α = .05 for subgroups for all questions that used a numerical rating scale, so that significant differences on these questions due to program size, years of experience, awareness of the petition, and media use could be identified. For each open-ended question, verbatim responses from the sample of respondents were reviewed.
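For readers unfamiliar with the method, a one-way ANOVA compares between-group to within-group variance to test whether subgroup means differ. The following is an illustrative sketch only, with invented rating data; it does not reproduce the authors' analysis.

```python
def one_way_anova_F(*groups):
    """F statistic for a one-way ANOVA: mean square between groups
    divided by mean square within groups."""
    all_vals = [x for g in groups for x in g]
    grand_mean = sum(all_vals) / len(all_vals)
    # between-group sum of squares, df = k - 1
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    df_between = len(groups) - 1
    # within-group sum of squares, df = N - k
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# hypothetical 1-7 importance ratings grouped by program size
small  = [6, 7, 6, 5, 7, 6]
medium = [7, 6, 7, 6, 7, 7]
large  = [7, 7, 7, 6, 7, 7]
F = one_way_anova_F(small, medium, large)
```

The resulting F value would then be compared against the critical F for (k − 1, N − k) degrees of freedom at α = .05 to decide whether subgroup means differ significantly.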
Across respondents, the highest ranked skill was “ability to behave professionally” (49%, mean=6.7). Large programs rated it significantly higher (mean=6.9) than small programs (mean=6.6). For the remaining skills, 26% chose “ability to perform clinical reasoning” as the most important skill, followed by “ability to gather information from patients” (14%), “ability to communicate findings to colleagues” (7%), and “ability to communicate findings to patients” (5%).
Most RPDs (74%) reported observing clinical skills gaps in student readiness. On a scale from 1 to 7, with 7 being “very prepared,” first-year medical students were rated 2.3 on average, and fourth-year medical students were rated 4.7. RPDs who noted awareness of the Step 2 CS petition were more likely to observe gaps in residency (96%) than those who were not aware (71%). The most common types of gaps reported were in the areas of “clinical decisions/reasoning” (60%), “physical exam” (38%), “communication” (26%), and “understanding all patient types/areas of medicine” (21%).
All programs reported using a direct measure of clinical skills other than Step 2 CS, such as direct observation (100%), multisource feedback from faculty and/or residents (94%), simulation labs (61%), OSCEs (49%), and videotaping (20%). Large programs were significantly more likely to use simulation labs, OSCEs, and videotaping than medium or small programs. More than 70% of the RPDs considered their current measurements sufficient to assess clinical skills of residents.
A small subset of RPDs (12%) indicated that they were already aware of the Step 2 CS petition. When asked to rate their agreement with the petition (from 1, "do not agree at all," to 7, "fully agree"), 32% of those aware of the petition did not agree with it (score of 1 to 3), 25% agreed (score of 6 or 7), and the rest were in the middle of the scale. Agreement differed significantly by RPD experience: those with many years of experience were significantly more likely to agree with the petition (mean=5.7) than those with medium or low years of experience (means=3.6, 3.5).
The results indicated that 75% of RPDs use Step 2 CS to help screen residency candidates, but this differed by program size. Smaller programs were likely to use Step 2 CS for screening purposes but were highly unlikely (9%) to use it for other decisions or considerations for programs. Twenty-six percent of RPDs indicated interest in having additional Step 2 CS subscores available to use in residency screening, such as Clinical Reasoning, Professionalism, and Physical Exam. Those aware of the petition were significantly more likely (48%) to express an interest in additional subscores than those who were not aware (23%).
Samples of verbatim responses to open-ended questions across the four content areas are shown in Table 1.
A total of 205 RPDs were surveyed for insight into attitudes towards clinical skills and the USMLE Step 2 CS examination. Some themes gleaned from the survey results were that RPDs: (1) viewed clinical skills as very important, especially "ability to behave professionally"; (2) perceived a relative lack of readiness of entering residents in communication skills/professionalism and in the ability to translate skills and knowledge into practice; (3) utilized Step 2 CS to screen residency applicants; (4) showed awareness of the Step 2 CS petition (this was sometimes a factor in other survey responses, although only a small number of RPDs were aware of the petition); and (5) expressed a desire for more specific or detailed information from exam reports.
This work adds to existing research about the importance of clinical skills to RPDs. New intern communication skills are highly valued and RPDs believe trainees must be competent by August of their intern year in tasks such as patient education and communication, breaking bad news, history taking, and physical exam skills.15,16 In 2014, the Association of American Medical Colleges (AAMC) published the set of Core Entrustable Professional Activities (EPAs) for Entering Residency that further delineates the need for these competencies.17 There is also precedent that interpersonal and communication skills are among the characteristics most desired when interviewing and ranking residency candidates.18
Although RPDs reported being satisfied with the available tools for evaluating interpersonal and communication skills, more than 60% listed “difficulty comparing information across different medical schools” and “lack of reliable information about personal characteristics” in their top three “pain points” in the residency selection process.
In these study results, "behaving professionally" was identified by RPDs as the most important clinical skill, followed by the ability to "perform clinical reasoning tasks," "gather information from patients," and "communicate findings to patients." All of these are currently assessed in Step 2 CS, although subscore performance in these areas is not currently provided.
The most important reported clinical skills gaps related to clinical decision making/clinical reasoning, ability to perform physical exam, and communication skills. These skills align with the AAMC's EPAs, as well as with the ICE and CIS components of USMLE Step 2 CS. In the qualitative responses, RPDs expressed concerns that students had more “book smarts” than full understanding of knowledge domains and cited various challenges with communication, including professionalism. Communication challenges existed when factoring in multicultural, international backgrounds and perspectives. Respondents stated there is no current way to effectively measure these facets, which represents a potential opportunity for innovation in assessment on the part of the USMLE program.
Currently, 75% of RPDs use Step 2 CS results to help screen applicants. However, RPDs rarely come across students who have failed the exam, which limits the utility of the results in this context. More than 70% of RPDs found the additional measurements of clinical skills within their residency programs (e.g., direct observation, multisource feedback, simulation labs, OSCEs) sufficient to assess clinical skills of their residents, but there was interest in additional Step 2 CS score information, especially for communication and interpersonal skills. Although RPDs reported that they tend not to use the Step 2 CS outcomes for reasons beyond initial recruitment decisions, if subscores were provided, it is conceivable these would allow RPDs to provide formative feedback on deficiencies.
Although awareness of the student petition was low (12%), only one-third of those RPDs who were aware of it stated they did not agree with it, which seems counterintuitive given the high value survey respondents placed on clinical skills. RPDs who agree with the petition may not find the Step 2 CS assessment of these skills useful if they do not receive useful reports of student performance. Additionally, RPDs may lack confidence in the assessment because of the low failure rate of U.S. graduates. This suggests areas for outreach and information exchange within the medical education community.
There are limitations in these data. First, the number of programs surveyed, compared to the total number of U.S. residency programs, was relatively small, and the results should be interpreted accordingly. While the distribution of specialties within the respondents of the survey tracked fairly closely with the percentages of allopathic residency programs across the country, it was not perfectly aligned; for example, the views of Internal Medicine RPDs could be under-represented as compared to those of Pathology and Neurology RPDs. Data analyses were not conducted at the specialty level, and it is possible that a subset of specialty programs may hold different views about the value of specific clinical skills or the utility of Step 2 CS.
Residency Program Directors view clinical skills as important in medical education and noted deficiencies in incoming residents. The majority of RPDs use Step 2 CS as an important factor in recruitment decisions; some desire more specific information about examinees' clinical skills. Despite valuing the clinical skills assessed by the Step 2 CS examination, a relatively low number of RPDs aware of the Step 2 CS petition definitively disagree with it.
It may be useful to evaluate the perspectives of other stakeholders, such as state medical board members. The desire for additional Step 2 CS score information may merit additional analysis, both for residency programs and for examinees. There may be value in replicating these results with a larger sample to investigate differences at a more granular level (e.g., between specialties). It may also be of interest to do a complementary study with RPDs to assess the utility of additional specific sources of information, such as the MSPEs, to gain knowledge of the types of information that are or could be used to assess readiness in the area of clinical skills.
Acknowledgment and Disclosures
The authors would like to acknowledge the assistance of Andrea Ciccone, JD. The authors are employed by the National Board of Medical Examiners (Miguel Paniagua, Kimberly Swygert, Michael Barone) and the Educational Commission for Foreign Medical Graduates (Jessica Salt). The authors have no other potential or actual conflicts of interest to report. Ethical approval was not sought and not applicable to this marketing research data.
All survey participants took part anonymously and voluntarily.
* The respondents were not aware that the NBME commissioned the survey.
** Other specialties included anesthesiology, critical care medicine, dermatology, gastroenterology, obstetrics, occupational medicine, radiation oncology, and radiology.
About the Authors
Miguel Paniagua, MD, is Medical Advisor, Test Development Services at the National Board of Medical Examiners, and Adjunct Associate Professor, The Perelman School of Medicine, University of Pennsylvania.
Jessica Salt, MD, MBE, is Assistant Vice President, Clinical Skills Evaluation Collaboration (CSEC) at the Educational Commission for Foreign Medical Graduates.
Kimberly Swygert, PhD, is Director of Research and Development, Test Development Services at the National Board of Medical Examiners.
Michael A. Barone, MD, MPH, is Vice President, Licensure Programs at the National Board of Medical Examiners.