Abstract
Some have commented that the limited number of underrepresented minorities (URMs) in residency programs in the United States is due to a lack of qualified candidates. At the University of Michigan, an objective structured clinical examination is administered to incoming residents at the beginning of training to determine baseline competence. In this study we sought to determine whether baseline competence differed between URM and non-URM residents.
The postgraduate orientation assessment, a 10-station examination, was developed to focus specifically on the knowledge and skills needed in the first 6 to 18 weeks of training. Stations assessed competence in informed consent, aseptic technique, evidence-based medicine, diagnostic images, critical laboratory values, cross-cultural communication, and Joint Commission requirements such as surgical fire safety and pain assessment and management. We used various assessment measures, including standardized patients, computer-based testing, and multiple-choice questions.
Our study found no significant differences in overall mean scores between URM residents and all other residents for the 5 years during which we administered the examination, except for 2002. This stands in contrast to the consistently worse performance of URM students on USMLE Step 1 and Step 2 Clinical Knowledge. URM residents also performed neither better nor worse than their non-URM colleagues on standardized patient stations over the 5 years the examination was administered.
The postgraduate orientation assessment provides residency program directors with a standard format to measure initial clinical skills. When compared to incoming non-URM residents from a variety of medical schools, URM residents perform as well as other trainees. Our results may aid efforts to recruit URM medical students into academic residency programs such as those at the University of Michigan.
Introduction
Underrepresented minority (URM) students (before 2003 this group included African American, Mexican American, Native American, and mainland Puerto Rican students) encounter numerous barriers as they travel the long road to becoming a physician.1 Barriers for URM students begin to emerge before entry into medical school and include lower performance on standardized tests (eg, ACT, SAT, and MCAT) when compared to that of their Non-Hispanic White and Asian American counterparts.2–4 Seemingly, these performance and outcome trends persist throughout medical training on measures such as basic science examinations and United States Medical Licensing Examination (USMLE) Step 1 and Step 2.4–6 As students advance to the resident application phase, more subjective measures (eg, letters of recommendation, personal statements, and interviews) are considered as well and, to a small degree, may counterbalance the lower academic credentials of some URM applicants.7,8 Previous studies4,9,10 have demonstrated the value of standardized tests in predicting future success, specifically performance on board certification examinations. These findings are often used to support the consideration of standardized tests when making critical decisions for medical school admissions, residency program placement, and subsequent practice locations or faculty appointments.
As standardized measures typically assess knowledge-based competencies, almost all medical schools have also incorporated objective structured clinical examinations (OSCEs) as a means of assessing students' clinical and communication skills.11–13 Likewise, in our attempt to capture baseline clinical competencies, we launched a 10-station postgraduate orientation assessment (POA) OSCE.13,14 This is administered during orientation; interns are informed of the assessment, but they are not provided with information on what is included. The POA is a formative assessment that focuses on knowledge and skills that residents would likely be required to use during the first 6 to 18 weeks of their residency and in situations without supervision. This content is different from that offered by USMLE Step 2 Clinical Skills. During the past 5 years of administering the POA, we have been able to assess initial proficiency levels on the 6 key competency domains (ie, medical knowledge, patient care, professionalism, communication skills, practice-based learning and improvement, and systems-based practice) mandated by the Accreditation Council for Graduate Medical Education.15 The overall POA score and performance on several stations (aseptic technique, informed consent, and system compliance/surgical fire safety) have been shown to be moderately predictive of the percentage-correct score on board certification examinations among University of Michigan pediatric residency graduates (M. L. Lypson, MD, J. A. Purkiss, PhD, H. M. Haftel, MD, MHPE, unpublished data, 2009).
Standardized tests place countless URM students at a disadvantage when they apply to medical school and may impede many from entering the gateway to a career in medicine. Given the recent challenges in race-conscious versus race-blind admissions, maintaining the minority physician pipeline has become increasingly difficult for programs. In this study, we sought to determine whether differences in performance between our underrepresented minority and non-URM residents on standardized measures also appeared on an OSCE. Moreover, since the state of Michigan has recently legislated that our institution can no longer consider race as a factor in admissions decisions, we endeavored to perform a retrospective analysis of the initial performance of our incoming interns. To that end, we also sought to determine if past affirmative action recruitment efforts at the University of Michigan had resulted in less capable physicians as measured by our initial intern examination.16
Methods
The POA was developed by the Graduate Medical Education Committee at the University of Michigan (U of M) in 2002 and focuses on providing assessment and education on the knowledge and skills needed in the unsupervised settings of the first 6 to 18 weeks of training. The examination includes the following stations: informed consent, aseptic technique, evidence-based medicine, images (x-rays), critical laboratory values, cross-cultural communication, geriatric functional assessment/pediatric-proxy history taking (specialty specific), and several Joint Commission requirements, such as surgical fire safety and pain assessment and management. These stations were designed to cover a wide range of technical, clinical, and communication skills that closely parallel early postgraduate experiences.17
Interns receive a mean score (percentage correct) for each POA station. Stations involving a standardized patient (geriatric functional assessment, pediatric proxy history taking, informed consent–blood transfusion, aseptic technique, cross-cultural communication) are scored on the basis of the standardized patient's assessment of performance, using a 3-point scoring scale (done, needs improvement, not done). Computerized stations (reading of radiographic images, evidence-based medicine, critical values, pain assessment, and surgical fire safety) use either multiple-choice or structured written response formats. The individual station scores are then averaged to derive the POA overall score. Residency program directors receive means and standard deviations of scores for all administered years in addition to those of the current class cohort. These reports provide them with formative feedback regarding the performance of individual interns and allow them to make within- and across-department comparisons.
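The scoring procedure above can be sketched in a few lines of code. This is an illustrative sketch only: the partial credit assigned to a "needs improvement" rating is an assumption, not the published rubric, and the station data are invented.

```python
# Illustrative sketch of POA scoring as described in the text.
# Assumption: "needs improvement" earns half credit; this is a
# hypothetical rubric, not the actual scoring key.
RATING_CREDIT = {"done": 1.0, "needs improvement": 0.5, "not done": 0.0}

def station_score(ratings):
    """Percentage-correct score for one standardized patient station's checklist."""
    return 100.0 * sum(RATING_CREDIT[r] for r in ratings) / len(ratings)

def overall_poa_score(station_scores):
    """Overall POA score: unweighted mean of the individual station scores."""
    return sum(station_scores) / len(station_scores)

# Hypothetical intern: two standardized patient stations plus one
# computer-based station that already reports percentage correct.
consent = station_score(["done", "done", "needs improvement", "not done"])  # 62.5
aseptic = station_score(["done", "done", "done", "done"])                   # 100.0
ebm = 80.0
print(round(overall_poa_score([consent, aseptic, ebm]), 1))  # prints 80.8
```

The unweighted mean mirrors the text's statement that individual station scores are simply averaged to derive the overall score.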
The primary data used in this study were a compilation of USMLE Step 1 and Step 2 Clinical Knowledge scores and POA scores from each of the 10 stations. Comparisons were made only when the assessment had not changed, or when the change was cosmetic rather than content related (eg, question order, test distribution). Information on the type of medical school attended (private, public, or international) was also collected, as we anticipated this additional information would provide some insight on resident performance.
Results
Our data set comprised 736 first-year residents in 14 different specialties, representing all new residents who entered our residency programs during 2002–2006. These 736 residents were categorized as follows by self-identification: 318 (43%) were female and 417 (57%) male, and 51 (7.2%) met the 2003 Association of American Medical Colleges classification of underrepresented minority (URM) (Table 1). All other students (eg, Non-Hispanic White, Asian American, and multiracial) were categorized as non-URMs. In comparing basic demographic characteristics of the national population of US medical school graduates to those of the University of Michigan population, our sample is slightly underrepresented in terms of women (46% nationally versus 43% in the U of M population) and URMs (14% nationally versus 7.2% in the U of M population). In our data set of 51 URM residents, 27 (53%) were from medical schools located in states with antiaffirmative action legislation (eg, California, Texas, Florida, Michigan, Georgia, and Washington).16 We compared residents who attended schools in antiaffirmative action states to those who did not in terms of USMLE Step 1 and Step 2 Clinical Knowledge scores, in addition to POA overall scores; there was no difference in performance between the 2 groups.
Table 2 provides a detailed overview of the USMLE Step 1 and Step 2 scores for our incoming residents for the years 2002–2006. These results indicate that URM residents scored significantly lower than non-URM residents in all years. At the U of M, all interns must pass USMLE Step 2 Clinical Skills before starting clinical work, and thus this pass/fail information was not used in our data set.
Table 3 demonstrates that URM and non-URM interns perform similarly on individual POA stations; there was limited evidence for a difference in performance between the URM residents and the other residents. The exceptions occurred on 3 occasions in the study period (testing in 2002, 2003, and 2006), when URM interns performed significantly worse on the pain assessment and surgical fire safety stations (after adjustment for the use of multiple independent t tests). Similarly, URM and non-URM interns had similar overall scores on the POA for all years except 2002 (when α is adjusted for multiple t tests, not shown).
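The station-by-station comparison described above can be sketched as a series of independent two-sample t tests with an alpha adjusted for multiple comparisons. This is an illustrative sketch under stated assumptions: the scores are invented, the adjustment shown is a simple Bonferroni correction (the paper does not specify its method), and the t distribution is approximated with a normal, which is reasonable only for large samples.

```python
# Sketch of URM vs non-URM station comparisons with a Bonferroni-adjusted
# alpha. Scores are made-up illustrative data; the normal approximation
# to the t distribution is an assumption.
from statistics import NormalDist, mean, variance

def welch_t(a, b):
    """Welch t statistic for two independent samples with unequal variances."""
    se2 = variance(a) / len(a) + variance(b) / len(b)
    return (mean(a) - mean(b)) / se2 ** 0.5

def two_sided_p(t):
    """Two-sided p value, approximating the t distribution with a normal."""
    return 2.0 * (1.0 - NormalDist().cdf(abs(t)))

n_stations = 10
alpha = 0.05 / n_stations  # Bonferroni adjustment for 10 station comparisons

urm = [78.0, 82.5, 75.0, 80.0, 84.0]      # hypothetical station scores
non_urm = [81.0, 79.5, 83.0, 77.0, 85.0]
p = two_sided_p(welch_t(urm, non_urm))
print(f"p = {p:.3f}, significant at adjusted alpha: {p < alpha}")
```

Testing 10 stations across 5 cohorts at an unadjusted α of .05 would be expected to produce spurious "significant" differences, which is why the adjusted threshold matters for interpreting the few exceptions noted above.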
An exception to this is in the system compliance station, where the non-URM residents performed significantly better in 1 of the 5 years of administration. When taken together, the results from all other individual stations show no clear differences in performance between the 2 groups, suggesting that the URM residents performed no differently than the non-URM residents on the POA, despite scoring significantly lower on the USMLE Steps 1 and 2.
In particular, there were no discernible differences in performance on stations involving a standardized patient. We had hypothesized that URM students would perform better on the cultural communication and informed consent stations, but this did not prove to be the case.
Discussion
Despite poorer performance on standardized tests and unequal academic credentials, our first-year URM interns performed similarly to their Non-Hispanic White and Asian American counterparts on a single examination of clinical performance. These findings support previous research that suggests that the value of the Medical College Admission Test (MCAT) and undergraduate grade point averages lies in the prediction of preclinical knowledge among medical students, and these tests may not necessarily predict successful performance in residents or practicing physicians.3,18 These findings are especially critical in light of the recent abandonment of many affirmative action policies that had historically increased the minority physician pipeline.19
The POA is an orientation-based OSCE, which may at times uncover serious deficiencies in knowledge or skills, while providing residency program directors with a reliable tool to measure initial clinical skills. Furthermore, in order to limit the impact of these deficits, all participants are provided with remediation materials immediately after the assessment. This ensures immediate access to the correct policies, procedures, and clinical information essential to safe and effective medical practice, as participants will soon find themselves in very similar settings.
We speculate that the lower overall scores of URM residents on the POA in 2002 may be the result of the timing of the examination. The POA was first introduced in 2002, 1 year before the implementation of USMLE Step 2 Clinical Skills. At this time, only a limited number of schools had a final comprehensive clinical skills assessment required for graduation.20 Although it remains unclear whether our URM residents hailed from such schools, after Step 2 Clinical Skills was implemented, URM residents' overall POA scores reached levels similar to those of the non-URM group. As previously stated, we developed the POA to address clinical issues seen in the first several weeks to months as a new physician. It is a test of practical clinical skill that may translate more directly to actual bedside behaviors. The clinical skills assessed during the USMLE clinical skills examination and the POA may measure different aspects of clinical acumen than those assessed in traditional standardized tests (eg, Step 1 and Step 2 Clinical Knowledge). It is also possible that our URM resident selection process improved in the years subsequent to 2002. Additionally, the consequences of stereotype threat among undergraduate and medical students may in fact contribute to poorer performance on standardized tests such as Step 1 and Step 2 Clinical Knowledge. URM resident performance on the POA may not be affected by this, owing to the lack of any preconceived notions on performance for this test.21,22
We originally hypothesized that the URM residents would perform better on some of the standardized patient stations, most notably the cross-cultural communication and informed consent stations. This did not bear out, and in fact the findings support the argument that all physicians need training in the area of cultural competency and health disparities.
We must note the limitations of this study. First, the sample includes a relatively small number of URM residents at 1 institution; however, on average, we recruit from 60 different medical schools annually. We fully recognize that the small number of URM interns limits the statistical power of the study and the generalizability of its findings to larger populations. Also, the residency programs at the University of Michigan consider themselves to be highly selective, which may place them at an advantage when selecting the most qualified URM students in the country; however, this would be true of all our residents. These factors, taken either alone or together, may also limit the generalizability of our findings. One could argue that the POA has limited a priori data regarding construct validation, which may increase the likelihood of nonsignificant group differences. Finally, the combined educational and assessment stations (eg, fire and pain assessment) could present a new modality for some incoming interns who typically may have had difficulty with a time-limited, traditional, multiple-choice framework. This testing format may reinforce difficulties URM students have in navigating single best-answer test items.23
Conclusion
When compared to non-URM residents from a variety of medical schools, incoming URM residents perform as well as other trainees. These results highlight the multidimensional skill set involved in obtaining sufficient medical skills (eg, knowledge, clinical skills, communication) and will hopefully encourage more schools to broaden their view of URM applicants.
References
Author notes
Monica L. Lypson, MD, is Assistant Dean for Graduate Medical Education and Associate Professor, Department of Internal Medicine, at University of Michigan Medical School; Paula T. Ross, MA, is Research Area Specialist, Department of Health Behavior and Health Education, at University of Michigan; Stanley J. Hamstra, PhD, is Acting Assistant Dean, Academy for Innovation in Medical Education, and University of Ottawa Faculty of Medicine Research Director at University of Ottawa Skills and Simulation Centre, Ottawa, Ontario, Canada; Hilary M. Haftel, MD, MHPE, is Associate Professor of Pediatrics, Internal Medicine, and Medical Education, Associate Chair for Pediatric Education, and Pediatric Residency Program Director at University of Michigan Medical School; Larry D. Gruppen, PhD, is Josiah Macy Jr Professor of Medical Education and Chair, Department of Medical Education, at University of Michigan Medical School; and Lisa M. Colletti, MD, is Associate Dean for Graduate Medical Education and C. Gardner Child Professor of Surgery at University of Michigan Medical School, Ann Arbor, Michigan.