ABSTRACT
Background
The COVID-19 pandemic has affected every facet of American health care, including graduate medical education (GME). Prior studies show that COVID-19 resulted in reduced opportunities for elective surgeries, lower patient volumes, altered clinical rotations, increased reliance on telemedicine, and dependence on virtual didactic conferences. These studies, however, focused on individual specialties. Because the Accreditation Council for Graduate Medical Education (ACGME) routinely collects information from all programs, it has an obligation to use these data to inform the profession about important trends affecting GME.
Objective
To describe how the pandemic influenced resident training across all specialty programs in the areas of clinical experiences, telemedicine, and extended training.
Methods
The ACGME validated a questionnaire to supplement the Annual Update reporting requirements of all accredited programs. The questionnaire was tested to ensure that respondents could easily interpret the instructions, question wording, and response options, and to assess respondent burden. The questionnaire was administered through the Accreditation Data System, a password-protected online environment for communication between the ACGME and ACGME-accredited programs.
Results
The response rate was 99.6% (11 250 of 11 290 eligible programs). Emergency medicine, family medicine, internal medicine, and obstetrics and gynecology programs experienced the most significant impact. Most programs reported reduced opportunities for in-person didactics and ambulatory continuity rotations. Hospital-based programs on the “frontline” of COVID-19 care relied least on telemedicine. Family medicine and internal medicine programs accounted for the greatest number of training extensions.
Conclusions
COVID-19 has affected GME training, but its consequences are unevenly distributed across program types and regions of the country.
Introduction
The Accreditation Council for Graduate Medical Education (ACGME) is committed to the ongoing monitoring of the state of affairs in graduate medical education (GME) training programs as part of its regulatory role and to assure the public that residents and fellows are receiving the necessary training experiences that will enable them to provide safe, effective health care.1 While quality assurance at the level of individual training programs is at the heart of the ACGME's work, its strategic plan also articulates a duty to advance a “system of graduate medical education” through continuous quality improvement of accreditation processes. To understand the systemic effects of COVID-19, the ACGME has undertaken a survey of all the GME programs it accredits to identify the type and extent of the pandemic's effects on GME on a national scale.
The COVID-19 pandemic has affected every facet of the American health care system since the disease was first detected in the United States in early 2020. The effects include financial stresses on hospitals and clinics, especially in the early months of the disease; high volumes of patients in emergency departments, inpatient units, and intensive care units (ICUs); and changes in surgery schedules. For the medical staff it has meant demanding workloads and redeployments. For patients it has meant deferred and foregone care.2-5 Each of these effects has implications for residency training, which depends on learners participating in patient care in a variety of clinical settings under appropriate supervision. Studies have assessed the repercussions of the pandemic on GME, finding reduced opportunities for participation in elective procedures, reduced case volumes,6,7 altered clinical rotations,8 increased reliance on telemedicine, and virtual didactic lectures and conferences.9 Some studies have reported shifting floor duties and dependence on advanced practice nurses to conduct hospital rounds.7,10 One multispecialty literature review encapsulated the effects on surgical specialties: elective procedures were reduced, trainees and attending physicians were “redeployed,” and residents experienced fewer clinical rotations and fewer opportunities to treat trauma cases.11
Prior studies have typically focused on specific specialties. Because of its role, the ACGME routinely collects information from all programs. As steward of these data, the ACGME has an opportunity, and an obligation, to use them to describe how the pandemic has influenced opportunities for adequate resident training and thus inform the larger profession about trends in medical education. While the ACGME's data collection activities serve the accreditation role first and foremost, during the pandemic the ACGME has supplemented its Annual Update reporting requirements with a questionnaire focused on the effects of COVID-19. This has allowed the ACGME to monitor and understand the consequences of the COVID-19 pandemic for the whole of the GME community.
Methods
Questionnaire Design
In April 2020, a taskforce of experienced GME leaders on the ACGME staff drafted a questionnaire to capture supplemental information regarding the effects of the pandemic on residents and faculty. Details of that survey's design methods are described by Byrne et al.1 In early spring 2021 we modified that questionnaire to include items about the training experiences of residents and about assessment practices. We also modified questions about the use of telemedicine. A draft of the 2021 questionnaire was reviewed for clarity and comprehensiveness by 3 GME physician administrators not affiliated with the ACGME. A convenience sample of 2 program directors and 1 designated institutional official was then recruited for cognitive interviews to assess the instructions, uncover how respondents might interpret question wording, and determine how item flow and response options affected their answers. Adjustments to the instructions, question wording, item ordering, and response options were made based on this input.
Content
The final instrument included instructions informing respondents that the results would provide contextual information that their review committees could consider when reviewing their annual reports. We asked programs about the extent to which COVID-19 affected clinical experiences, such as clinic visits, surgical procedures, ICU coverage, and inpatient admissions. This series also included items about educational programming, such as didactics, conference attendance, elective rotations, direct observation, and Clinical Competency Committee and Program Evaluation Committee meetings. Questions about these experiences used a 6-point Likert-type scale with verbal anchors of “cancelled,” “significantly decreased,” “moderately decreased,” “no effect,” “moderately increased,” and “significantly increased.” Most items allowed for “not applicable.”
We also asked programs to estimate the number of months during which they were “significantly” impacted by COVID-19 during the academic year (July 1, 2020 to June 30, 2021), as well as the peak usage of telemedicine technology during that year. Peak usage was measured as the proportion of patient encounters: “none,” “1%-24%,” “25%-49%,” “50%-74%,” or “75% or more.”
Administration
The questionnaire was administered in the ACGME's Accreditation Data System (ADS). The ADS is a password-protected online environment where program directors or their designees communicate with the ACGME. Each July, programs are required to submit an annual update, which summarizes important initiatives, announces changes in personnel and training sites, responds to critiques of programming, and provides other information pertinent to maintenance of accreditation. The questionnaire was completed as a supplement to the 2021 Annual Update.
Data Management
In addition to the contents of the questionnaire, the ACGME used existing administrative data to sort programs by specialty and location so that programs could be arranged by US Census region, given the variable nature of the pandemic surges during the 2020-2021 academic year. Using the specialty designation, we clustered programs into frontline and other specialties. Responses to questions about the extent of effects on clinical experiences and educational programming were clustered as “cancelled,” “increased,” “decreased,” or “not applicable.” For the purposes of this article, programs that responded with “not applicable” were excluded only from analysis of the items for which they gave that response.
For this analysis, specialties were organized as “frontline” or “not frontline.” Frontline specialties were further organized into medical, hospital-based, or surgical specialties using the classification reported in the ACGME's annual Data Resource Book.12 The ACGME's staff physicians advised on which programs were most likely to be systematically affected by the pandemic, and we used those expert opinions to sort specialties into “frontline” and “not frontline” categories (see online supplementary data). The only surgical specialty classified as “frontline” is surgical critical care medicine (surgical CCM).
Results
Response Rate
There were 12 420 ACGME-accredited programs in the United States when the 2021 Annual Update was completed. Of these, 1130 programs had no residents enrolled and were excluded from analysis, leaving 11 290 programs eligible. Of the eligible programs, 40 did not complete the questionnaire, resulting in a response rate of 99.6% (11 250 of 11 290).
Significant Impact
Table 1 indicates that 535 programs reported experiencing 7 or more months of significant impact. Those programs were concentrated in family medicine (n=92, 13.6%), internal medicine (n=37, 6.8%), emergency medicine (n=36, 14.0%), and obstetrics and gynecology (n=20, 7.0%). More than half (53.2%, 5983 of 11 250) of programs reported no “significant impact” during any time of the year; 29.9% (3360) reported between 1 and 3 months of significant impact; 12.2% (1372) reported 4 to 6 months; 2.7% (305) reported 7 to 9 months; and 2.0% (230) reported 10 to 12 months.
A large majority (71.8%, 94 of 131) of surgical CCM programs reported enduring no significant impact through the year. Similarly, majorities of non-frontline specialty programs (56.6%, 4824 of 8517) and hospital-based frontline programs (50.9%, 474 of 931) reported no significant impacts. In contrast, a majority of medical frontline programs (55.4%, 926 of 1671) reported significant impacts lasting up to 6 months.
In all regions but the West, more than half of programs reported experiencing no significant impact. The West was the most significantly impacted region: more than half (52.2%, 947 of 1815) of programs in Western states reported at least some period of significant impact, and 7% (126 of 1815) endured significant impact for 7 or more months. By contrast, majorities of programs in US Territories (56.2%, 41 of 73) and the Midwest (54.7%, 1471 of 2690) reported no significant impact.
Effect on Training Opportunities
Table 2 depicts the proportion of programs that reported a change in a particular training experience. The most significantly affected experience was ambulatory continuity clinic visits, followed by didactic conferences; for the other experiences, the preponderance of programs reported no change. More than half (50.6%) of specialty programs that include ambulatory continuity visits in their curricula reported a decline in those training opportunities. Among specialties that include ICU training in the curriculum, most programs (60.3%) reported no effect, but one-third (33.9%) reported an increase. Forty percent of all programs reported a decline in elective rotations, with 5% reporting an outright cancellation of elective rotations. Didactic conference attendance shifted as in-person participation declined and virtual attendance increased. For other training activities, a plurality of programs reported no changes overall, although there are some distinguishing features for certain specialties and regions of the country.
Effect on Training for Frontline Specialties
Table 3 depicts the effect of COVID-19 on learning experiences for residents and fellows, reporting only those experiences for which a plurality of programs reported either an increase or a decrease; experiences for which the plurality reported no change are omitted. A slight majority of all programs (50.6%) reported a decline in ambulatory continuity clinic visits (Table 2); about two-thirds (68.1%) of medical frontline programs reported a decrease (Table 3). Non-frontline specialty programs (47.3%) and hospital frontline programs (48.3%) also tended to report decreased opportunities for continuity clinic visits. Overall, nearly half (49.5%) of all programs reported decreases in non-continuity clinical rotations (Table 2), and two-thirds (68.3%) of medical frontline specialty programs reported decreases (Table 3).
As might be expected, across-the-board decreases for in-person didactic experiences were more than offset with increased virtual ones. Most programs of all types continued to provide residents with opportunities for scholarly activity. A majority (59.0%) of medical frontline specialty programs reduced elective rotations, while most other program types were able to continue offering electives. Large majorities of programs reported no change in opportunity for one-on-one advising with trainees.
Regional Effects on Training
Table 4 indicates which training experiences were affected by COVID-19 in each region of the United States. Overall, about half of programs reported decreased ambulatory continuity clinic visits. Non-continuity clinical rotations increased in 51.7% of programs in the West and decreased for a plurality (45.6%) of programs based in a US Territory. Declines in in-person conferences were offset by increases in virtual conferences. Large majorities in each region reported no effect on direct observation. Pluralities of programs in the Northeast (49.7%), US Territories (47.2%), and West (49.4%) reported reduced elective rotations.
Telemedicine
Table 5 indicates that more than one-quarter (27.3%, 3074 of 11 250) of programs did not use telehealth to provide clinical care to patients. Forty percent (4501) of programs used it for up to 24% of their patient encounters, 14.1% (1586) used telemedicine for 25% to 49% of patient encounters, 9% (1017) for 50% to 74%, and 9.5% (1072) for 75% or more. Telemedicine was used most heavily in the non-frontline and medical frontline specialties: 19.5% of non-frontline programs and 19.6% of frontline medical programs reported using telemedicine for at least half of patient encounters. More than half (53.6%, 499 of 931) of hospital frontline programs and 82.4% (108 of 131) of surgical CCM programs reported not using telemedicine during the pandemic's peak effect on their programs.
Extended Training
We asked programs whether residents needed to extend training due to COVID-19. While only 1.1% of non-frontline programs reported needing to extend training, 2.3% of hospital frontline and 3.4% of medical frontline programs required some extended training. In total, 171 programs reported needing to extend training, 152 of them for only 1 resident. Eighteen programs needed to extend training for 2 to 4 residents, and 1 program reported needing to extend training for 9 residents. The specialties with the largest number of programs reporting extensions were internal medicine (26 programs) and family medicine (25 programs). No surgical CCM program reported extending training due to COVID-19. Almost 50 residents had training extended due to personal illness attributable to COVID-19. Most extensions occurred in academic medical centers (72 programs) or general teaching hospitals (65 programs).
Discussion
As with previous studies, we find that the COVID-19 pandemic created substantial and variable training disruptions across specialty training programs during the 2020-2021 academic year. The most significant pandemic surge occurred during the winter months, secondary to the Alpha variant. Consistent with other studies, we find the most significant disruptions in ambulatory clinical rotations, ICU experiences, admissions, and use of telemedicine. We add to this knowledge by observing how clusters of specialties and geographic regions differed in their experiences.
Nearly 1 in 3 programs (29.9%) reported between 1 and 3 months of significant impact due to COVID-19, 12.2% reported 4 to 6 months, and more than half (53.2%) of programs reported no “significant impact” at any time of the year. Large majorities of frontline medical programs, which bear the largest burden in caring for COVID-19 patients, and hospital programs experienced these effects in reduced ambulatory clinical rotations and increased in-patient admissions and ICU coverage. Across the board, programs experienced an increase of virtual didactic events and conference attendance, while in-person learning was reduced.
Two-thirds of programs (67%) used telemedicine for less than 25% of their patient visits. About 10% of non-frontline and medical frontline programs used telemedicine technology for 75% or more of patient encounters. Telemedicine appears to have been used most frequently in the Western states, where 28% of programs used it for more than half of patient encounters. Telemedicine is a rapidly emerging area: in 2020, the Association of American Medical Colleges announced that telemedicine would be an entrustable professional activity that students must learn, and the American Association of Colleges of Osteopathic Medicine advised member colleges to prepare graduates to use this technology, despite uncertainty about the optimal learning environment for this training. Medical educators are advised to develop curricula that address the technological, ethical, and health care needs of patients.13,14 That means establishing the learner's comfort with technology, addressing proficiency, and developing examination skills.15
Conclusions
The COVID-19 pandemic has affected GME training, but its consequences are unevenly distributed across program types and regions of the country. The findings of this survey will assist in understanding the nature of those effects and will help guide future efforts to improve GME in the long term and to prepare for the next pandemic. While these data are not being used to determine the accreditation status of any program, the findings will help the ACGME better serve the GME community by fostering greater awareness of local contexts and of the longer-term impacts of the pandemic on the professional development of residents and fellows.
References
Author notes
Editor's Note: The online version of this article contains details of the organization of programs into “frontline” or “not frontline.” The ACGME News and Views section of JGME includes data reports, updates, and perspectives from the ACGME and its Review Committees. The decision to publish the article is made by the ACGME.