ABSTRACT

Background

The transition from American Osteopathic Association (AOA) and Accreditation Council for Graduate Medical Education (ACGME) residency matches to a single graduate medical education accreditation system culminated in a single match in 2020. Without AOA-accredited residency programs, which were open only to osteopathic medical (DO) graduates, it is not clear how desirable DO candidates will be in the unified match. To avoid increased costs and inefficiencies from overapplying to programs, DO applicants could benefit from knowing which specialties and ACGME-accredited programs have historically trained DO graduates.

Objective

This study explores the characteristics of residency programs that report accepting DO students.

Methods

Data from the American Medical Association's Fellowship and Residency Electronic Interactive Database Access were analyzed for the percentage of DO residents in each program. Descriptive statistics were calculated, and a generalized linear model with a logit link and a gamma distribution was fitted.

Results

Characteristics associated with graduate medical education programs reporting a lower percentage of DO graduates as residents were surgical subspecialty training, longer training length, and residents' US Medical Licensing Examination Step 1 scores above the specialty average. Characteristics associated with a higher percentage of DO graduates included interviewing more candidates for first-year positions and reporting a higher percentage of female residents.

Conclusions

Wide variation exists in the percentage of DO graduates accepted as residents among specialties and programs. This study provides valuable information for DO graduates and their advisers about the single Match and outlines educational opportunities for the osteopathic profession in the specialties with low percentages of DO students as residents.

What was known and gap

With the creation of a single graduate medical education accreditation system and the use of only 1 residency Match in 2020 for allopathic and osteopathic medical (DO) graduates, it is unclear which specialties and residency programs are likely to train osteopathic graduates.

What is new

A descriptive analysis that examines which Accreditation Council for Graduate Medical Education (ACGME)–accredited programs have accepted DO students for residency training and the characteristics of these programs.

Limitations

Survey response bias may have led to the authors misestimating the proportion of programs that include DO graduates. Data from the American Medical Association's Fellowship and Residency Electronic Interactive Database Access may not include programs newly accredited by the ACGME.

Bottom line

Wide variation exists in the percentage of DO graduates accepted as residents among specialties and programs.

Introduction

As of the 2020 Match, there is only 1 accrediting body for graduate medical education (GME), the Accreditation Council for Graduate Medical Education (ACGME), and all medical school graduates, from both allopathic medical (MD) and osteopathic medical (DO) schools, participate in a single match every March. Prior to the 2020 Match, there were 2 separate GME accreditation bodies: the American Osteopathic Association (AOA), whose GME positions were available only to students trained in colleges of osteopathic medicine, and the ACGME, whose positions were available to osteopathic and allopathic medical students from the United States as well as international medical graduates. These 2 accrediting agencies held separate annual matches, with the AOA Match historically occurring in February, followed by the National Resident Matching Program Main Residency Match in March.

These AOA residency programs had the choice to become accredited as an ACGME program or as an ACGME program with osteopathic recognition, which signifies that the program is committed to teaching and assessing osteopathic principles and practices, as conferred by the ACGME Osteopathic Principles Committee. AOA-accredited residency programs began applying for ACGME accreditation on July 1, 2015, and 92% had achieved it as of July 1, 2019.1  Of the 697 programs achieving ACGME accreditation, 190 have also achieved osteopathic recognition.1  The number of osteopathic graduates entering ACGME-accredited programs more than doubled, from 2930 in 2014–2015 to 6370 in 2018–2019.1  The total number of osteopathic graduates in ACGME-accredited programs also doubled, from 10 999 in 2014–2015 to 22 069 in 2018–2019.1  During the transition from AOA accreditation to ACGME accreditation, 707 programs comprising 6905 positions achieved some level of ACGME accreditation.1  An additional 14 programs have applied for, but have not yet received, accreditation as of February 2020. Eight programs have not yet completed their applications but may still do so, and 2 programs are unsure whether they will apply.1  At the same time, 85 programs have closed and an additional 45 are closing, while the number of DO and MD graduates continues to rise.1  The increase in osteopathic applicants for ACGME positions and the loss of AOA-accredited positions indicate that all medical school graduates and their advisers are participating in an increasingly competitive residency match environment.

As more osteopathic graduates enter ACGME-accredited programs, many are applying to more competitive specialties than those to which DO graduates have traditionally applied, resulting in additional uncertainty about the prospects for DO graduates to match to their preferred specialties.2–4  A 2011 analysis showed that of the 14 789 DO graduates in GME, 46.2% were training in primary care specialties, including obstetrics and gynecology (OB-GYN), with 17.8% of all DO residents in a family medicine program.2  However, much of the growth in the number of DO graduates in ACGME-accredited residency programs has occurred outside of primary care specialties, which have traditionally had more unfilled positions than non–primary care programs. For instance, the number of osteopathic graduates applying for positions in ACGME-accredited emergency medicine programs increased 34% from 2008 to 2009 alone.3  The increasing number of MD and DO graduates in the Match, the slower growth of ACGME-accredited positions, and the movement of DO graduates into more competitive specialties mean that graduating medical students must use all of the data and information available to them to make good decisions about where to apply to residency.

In recent years, the number of applications per applicant has increased dramatically. Data from the Electronic Residency Application Service show that US MD graduates applying to specialties such as orthopedic surgery, urology, and neurological surgery apply to more than 65 residency programs on average.5  From 2015 to 2019, the number of applications per applicant rose by over 50% in emergency medicine and internal medicine, while OB-GYN and family medicine saw increases of more than 80%.5  Knowing whether a specialty or program has previously taken DO applicants may help graduates decide whether applying to a particular program or specialty is likely to result in an interview. This may reduce the number of applications a program director receives, which has been identified as a problem.5  Initiatives like the Association of American Medical Colleges' Residency Explorer6  and Texas STAR7  have given graduating medical students information about the types of applicants programs tend to interview and rank, average board scores, and research experiences. However, to date, little research has examined the characteristics of residency programs associated with selecting a DO applicant for GME. The only study specifically addressing this area is an analysis of the University of Arizona's family medicine residency program, which indicated that a decrease in the number of allopathic applicants led to an increase in acceptance of DO graduates.8

This study aims to examine which ACGME-accredited programs have accepted DO students for residency training and the characteristics of these programs. Discovering these characteristics and the overall presence of DO graduates in a specialty can inform future matches for osteopathic medical students.

Methods

In April 2019, the month following the last dual match year and a year before the first single match, a data set was assembled from the American Medical Association Fellowship and Residency Electronic Interactive Database Access (AMA-FREIDA) website.9  These data included all information available at the time for all residency programs in the database. According to the AMA, “Program data on FREIDA come directly from ACGME-accredited programs themselves via the GME Track/National GME Census, an annual online survey jointly conducted by the American Medical Association and the Association of American Medical Colleges.”9  Data from the survey were collected from programs several times through the end of 2018, with a final upload in February 2019.9  Data were collected using Python version 3.7.3 (Python Software Foundation, Beaverton, OR), and descriptive statistics were then performed at the program level, examining the reported percentage of residents in each program who were DO graduates. Programs were grouped by specialty, and null values were assessed at this level and overall. Averages for each variable were calculated for each specialty group.
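As an illustration only (this is not the authors' code), the program-level descriptive step can be sketched in Python with pandas; the column names and toy records below are hypothetical:

```python
# Hypothetical sketch of the descriptive step: group FREIDA-style program
# records by specialty, assess null (nonresponding) values per specialty
# and overall, and average the reported percentage of DO residents.
import pandas as pd

programs = pd.DataFrame({
    "specialty": ["Family Medicine", "Family Medicine", "Urology", "Urology"],
    "pct_do_residents": [40.0, 25.0, 0.0, None],  # None = no survey response
})

# Null values assessed at the specialty level and overall
null_by_specialty = programs.groupby("specialty")["pct_do_residents"].apply(
    lambda s: s.isna().mean()
)
overall_null_rate = programs["pct_do_residents"].isna().mean()

# Specialty-level averages over responding programs (NaN rows are skipped)
mean_by_specialty = programs.groupby("specialty")["pct_do_residents"].mean()
```

A grouped mean with pandas silently drops missing responses, which is why the null rates are tallied separately before averaging.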

Following the descriptive analysis, a generalized linear model with a logit link was developed for the gamma-distributed data. This model predicted the percentage of DO graduates in an ACGME-accredited residency program from all quantitative variables in the data set reported for all programs, as follows:

  1. Number of students interviewed for first-year positions.

  2. Average United States Medical Licensing Examination (USMLE) score of current residents. (Average Comprehensive Osteopathic Medical Licensing Examination [COMLEX] scores were not available for more than 70% of programs in the data set.)

  3. Whether the program was a surgical specialty.

  4. Program length.

  5. Percentage of women in the program.

While present in the data set, US government affiliation, visa sponsorship, and use of video interviews were not considered for analysis because of the difficulties in interpreting a model that includes binary variables. Additionally, these variables were viewed as unlikely to meaningfully change the analysis.
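For readers unfamiliar with this model class, the fitting procedure can be sketched as iteratively reweighted least squares (IRLS) for a gamma-family GLM with a logit link. The following is an illustrative reimplementation on simulated data, not the authors' code; the covariates and coefficient values are invented for demonstration:

```python
# Sketch (not the authors' code) of a gamma-family GLM with a logit link,
# fitted by iteratively reweighted least squares on simulated data.
import numpy as np

def fit_logit_gamma_glm(X, y, n_iter=100, tol=1e-8):
    """Return coefficients (intercept first) for a logit-link gamma GLM."""
    X = np.column_stack([np.ones(len(y)), X])   # prepend intercept column
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta
        mu = 1.0 / (1.0 + np.exp(-eta))         # inverse logit link
        dmu = mu * (1.0 - mu)                   # d(mu)/d(eta)
        var = mu ** 2                           # gamma variance function V(mu)
        w = dmu ** 2 / var                      # IRLS working weights
        z = eta + (y - mu) / dmu                # working response
        WX = X * w[:, None]
        beta_new = np.linalg.solve(X.T @ WX, WX.T @ z)
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

# Simulated example: 2 hypothetical standardized predictors (e.g.,
# interviews per position, percentage of female residents)
rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 2))
mu_true = 1.0 / (1.0 + np.exp(-(-1.0 + 0.8 * X[:, 0] + 0.5 * X[:, 1])))
y = rng.gamma(shape=50.0, scale=mu_true / 50.0)  # positive, mean mu_true
beta_hat = fit_logit_gamma_glm(X, y)
```

In each iteration, the working weights combine the squared derivative of the inverse link with the gamma variance function V(mu) = mu^2; this variance assumption is what distinguishes the model from a logistic regression despite the shared link.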

Results

An exploratory analysis revealed participation rates for the survey and average percentage of DO graduates per program (Table). Eight types of programs had no residencies respond to the National GME Census (Box), and the percentage of programs that reported having DO graduates ranged from 100% (pediatrics/physical medicine and rehabilitation and physical medicine and rehabilitation) to 0% (family medicine/preventive medicine, internal medicine/anesthesiology, internal medicine/dermatology, and internal medicine/medical genetics and genomics).

Table

Programs Reporting DO Graduates and Mean Percentages of DO Graduates Per Program

Box: Specialties With 0% of Programs Responding to AMA-FREIDA Survey
  • Diagnostic radiology/nuclear medicine

  • Emergency medicine/anesthesiology

  • Family medicine/osteopathic neuromuscular medicine

  • Osteopathic neuromuscular medicine

  • Internal medicine/family medicine

  • Interventional radiology–integrated

  • Maternal-fetal medicine/medical genetics and genomics

  • Reproductive endocrinology and infertility/medical genetics and genomics

Abbreviation: AMA-FREIDA, American Medical Association Fellowship and Residency Electronic Interactive Database Access.

Results of the unweighted generalized linear model were a statistically better fit than the intercept-only model (likelihood ratio test = 1714.37, P = .001), and all predictors were statistically significant. Programs less likely to report DO students were as follows: (1) general surgery or surgery subspecialty programs (WT = 4.292, P = .038); (2) programs accredited for longer periods of training (WT = 45.19, P < .001); and (3) programs with USMLE Step 1 scores that were higher than the average score of a postgraduate year 1 resident in that specialty (WT = 48.17, P < .001). All averages referenced in this section were calculated by the authors from FREIDA data. Interviewing more candidates per residency position than the average for the specialty (WT = 955.08, P < .001) and reporting a higher percentage of female residents than the average program in the specialty (WT = 1880.08, P < .001) were associated with reporting a higher percentage of DO residents. The specialties of otolaryngology–head and neck surgery, plastic surgery, urology, and neurological surgery resulted in the highest percentages of programs that reported having no DO graduates. The specialties with the highest percentages of DO graduates in GME were physical medicine and rehabilitation and pediatrics–physical medicine and rehabilitation, both with 100% of responding programs reporting at least 1 DO graduate, as well as family medicine, anesthesiology, and pediatrics with over 80% of responding programs reporting at least 1 DO graduate.

Data also revealed the specialties and programs most and least likely to report their data to the AMA-FREIDA program. Taking into consideration only specialties with 10 or more programs, among those least likely to report data were interventional radiology–integrated and osteopathic neuromusculoskeletal medicine, with 100% of programs not reporting data, as well as preventive medicine, medical genetics and genomics, nuclear medicine, ophthalmology, and dermatology, with over 70% of programs not reporting data. The specialties most likely to report data were internal medicine–pediatrics, OB-GYN, pediatrics, radiology-diagnostic, internal medicine/emergency medicine, thoracic surgery–integrated, and pathology-anatomic and clinical, all with over 60% of programs reporting. These specialties also had a higher percentage of programs reporting COMLEX scores (an average of 31% of programs in these specialties versus 12% of programs in other specialties).

Discussion

From these data, DO students and their advisers should note that subspecialty surgery programs and more competitive programs and specialties do not have a history of taking DO graduates as residents. Jolly et al2  similarly concluded that most DO graduates were training in family medicine, internal medicine, pediatrics, and emergency medicine. Students can use this information to prepare themselves should they desire to apply to a more competitive specialty. For example, students applying to otolaryngology may want to consider additional specialties of interest when formulating plans for the match and their careers. Likewise, DO graduates may want to focus their efforts on programs that have already taken DO graduates. If no historical data exist on a program in FREIDA, the authors suggest looking at programs that currently conduct more interviews than average for the specialty or have a higher percentage of female residents, because such programs are more likely than others to consider DO graduates.

The results of this analysis could serve as a road map for educational efforts that may result in more specialties or programs being open to accepting DO graduates. Similarly, these results could alert program directors whose specialty colleagues have taken DO graduates, but whose own programs have not, to the prevalence of osteopathic physicians in their specialty. The results may also motivate specialties with fewer DO graduates to consider them, given how widely osteopathic students have permeated not just primary care but also many specialty programs.

Limitations to this study include survey response rates and the fact that many programs previously accredited by the AOA are newly accredited by the ACGME; therefore, their data may not be included in this study, because they did not have the opportunity to respond to the National GME Census in 2018. Outside of medical education, studies of nonresponse bias have generally shown that respondents report more socially acceptable behaviors (such as better health outcomes10  or higher satisfaction11). If that is the case in the responses to this survey, it would lead to the conclusion that inclusion of DO graduates is overestimated in this study. A census of osteopathic graduates and the programs in which they are training, using ACGME data, is likely to reveal less biased results.

This research provides empirical data for DO students and their advisers in determining the uptake of DO graduates by ACGME residency programs. More quantitative and qualitative research is needed to assess causes of this variation. Research to determine if geography, knowledge of the DO graduate's individual college of osteopathic medicine, utilization of the COMLEX-USA, or previous experience with DO graduates are associated with variation in the percentage of DO graduates in a program is needed.

This analysis will benefit DO students and graduates, as well as their advisers, planning for the match by serving as a road map to specialties and programs that are more likely to accept DO graduates. These findings may also be helpful to program directors in assessing whether DO graduates, successfully included in so many specialties and programs, should be a part of their cohort of trainees.

Conclusions

Inclusion of osteopathic graduates in residency programs varied widely by both specialty and program within each specialty, with physical medicine and rehabilitation and family medicine reporting the most DO graduates as residents.

References

1. American Association of Colleges of Osteopathic Medicine; Pence L. Single GME Update: Transition of AOA-approved Programs After June 30, 2020. 2020.

2. Jolly P, Lischka T, Sondheimer H. Numbers of MD and DO graduates in graduate medical education programs accredited by the Accreditation Council for Graduate Medical Education and the American Osteopathic Association. Acad Med. 2015;90(7):970–974.

3. Hansen E, Pilarski A, Plasner S, Cheaito MA, Epter M, Kazzi A. The osteopathic applicant. J Emerg Med. 2019;56(4):e65–e69.

4. National Resident Matching Program. Results of the 2018 NRMP Program Director Survey. 2020.

5. Association of American Medical Colleges. Preliminary Data (ERAS 2019). 2020.

6. Residency Explorer. 2020.

7. UT Southwestern Medical Center. Texas STAR: Seeking Transparency in Application to Residency. 2020.

8. Lebensohn P, Campos-Outcalt D, Senf J, Pugno PA. Experience with an optional 4-year residency: the University of Arizona family medicine residency. Fam Med. 2007;39(7):488–494.

9. American Medical Association. FREIDA, the AMA Residency & Fellowship Database FAQs & Glossary. 2020.

10. Cheung KL, Ten Klooster PM, Smit C, de Vries H, Pieterse ME. The impact of non-response bias due to sampling in public health studies: a comparison of voluntary versus mandatory recruitment in a Dutch national survey on adolescent health. BMC Public Health. 2017;17(1):276.

11. Mazor KM, Clauser BE, Field T, Yood RA, Gurwitz JH. A demonstration of the impact of response bias on the results of patient satisfaction surveys. Health Serv Res. 2002;37(5):1403–1417.

Author notes

Funding: The authors report no external funding source for this study.

Competing Interests

Conflict of interest: The authors declare they have no competing interests.