ABSTRACT

Background

The Accreditation Council for Graduate Medical Education (ACGME) requires sponsoring institutions to demonstrate effective oversight through an annual institutional review (AIR). The ACGME requires only 3 elements to be reported, leaving it to the discretion of the designated institutional official (DIO) whether other supporting information should be included. This leads to uncertainty and inconsistency as DIOs decide what to report.

Objective

We surveyed DIOs to provide national data on key performance indicators and other relevant components of the AIR process.

Methods

In July 2019, we conducted a national survey of 847 DIOs. The survey had 16 questions that explored basic institutional demographics, timelines, and processes for the AIR and key performance indicators. Written answers were grouped by similar responses, and we performed descriptive statistics on all variables to assess distributions of responses. We also explored associations between variables using cross-tabulation and chi-square statistics.

Results

A total of 267 DIOs responded to the survey (32% response rate). Seven institutional performance measures achieved over 50% consensus. The majority of DIOs (62%, 167 of 267) required 5 to 20 hours to complete the review. Fewer than one-third of sponsoring institutions reported diversity data. The majority of DIOs (68%, 182 of 267) felt the AIR process added substantial value.

Conclusions

This survey reports key performance measures and processes included by DIOs in the AIR. Our results show a wide range of institutional responses, though consensus was achieved on 7 key performance measures.

Introduction

The Accreditation Council for Graduate Medical Education (ACGME) requires sponsoring institutions (SIs) to demonstrate oversight of educational programs through an annual institutional review (AIR).1 The ACGME identifies only 3 components for the AIR: (1) the most recent institutional letter of notification (LON); (2) results of surveys of residents/fellows and core faculty members; and (3) programs' accreditation statuses and citations. While these minimal requirements serve as a useful guide, designated institutional officials (DIOs) must choose from a wide range of performance indicators to best evaluate their SI and educational programs. Without more specific guidance, this decision can be challenging, and the AIR process can become overly time-consuming if too much information is included.

With the introduction of the Next Accreditation System in 2012, the ACGME transitioned from regularly scheduled program accreditation site visits to a process requiring institutional monitoring and oversight, primarily through the Graduate Medical Education Committee (GMEC).2 In turn, the GMEC demonstrates effective oversight of the SI through the AIR. Despite the pivotal role of this annual review, there is a paucity of published literature guiding DIOs on which key institutional performance measures to include in the AIR. Amedee and Piazza outlined a process at a single institution that relied on an annual evaluation and improvement review of each program by graduate medical education (GME) leadership.3 These authors produced a summative report, which was presented to the GMEC and served as the basis for the AIR. Reports from others showed a reliance on specific report cards or dashboards to improve performance, but no formal assessment of the AIR has been undertaken.4-6 This lack of data leaves DIOs uncertain about what to include in the AIR. Our objective was to survey a national sample of DIOs so that they could see what peer institutions report and use that information to guide what to include in their own AIR.

Methods

We conducted a national survey of 847 DIOs in July 2019. The list of DIOs and their contact information was obtained from an online database provided by the ACGME. The survey consisted of 16 questions, which assessed institutional size, timelines, and processes for the AIR and key performance indicators. The list of questions was developed by the authors based on prior experience completing the AIR and reviewed by a cohort of GME leaders for completeness and clarity. The survey was conducted through an anonymous web-based survey instrument (SurveyMonkey). Written answers were grouped into similar general responses, and we performed descriptive statistics on all variables to assess distributions of responses. We also explored associations between key institutional variables and variables of interest using cross-tabulation and chi-square statistics. All statistics were performed using SPSS Statistics for Windows Version 24.0 (IBM Corp, Armonk, NY). This research activity was determined to be exempt or excluded from Institutional Review Board oversight in accordance with current regulations and institutional (HCA) policy.
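
To illustrate the analysis described above (which was performed in SPSS), the following minimal Python sketch reproduces an equivalent descriptive and chi-square workflow. The file name and column names (air_survey_responses.csv, num_programs, reports_diversity) are hypothetical placeholders, not artifacts from the study.

# Illustrative sketch only; the study's analysis was performed in SPSS.
# Assumes a hypothetical CSV of survey responses with made-up column names.
import pandas as pd
from scipy.stats import chi2_contingency

df = pd.read_csv("air_survey_responses.csv")  # hypothetical file

# Descriptive statistics: distribution of responses for one variable.
print(df["num_programs"].value_counts(normalize=True))

# Cross-tabulation of institutional size against diversity reporting.
table = pd.crosstab(df["num_programs"], df["reports_diversity"])

# Chi-square test of independence on the contingency table.
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.3f}, P = {p:.3f}, df = {dof}")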

Results

A total of 267 of 847 DIOs responded to the survey (32% response rate), the majority of whom represented SIs with fewer than 10 programs (< 10 [60%, n = 160]; 10–25 [17%, n = 46]; 25–50 [8%, n = 22]; > 50 [15%, n = 39]). Respondents identified the following as the most important institutional performance measures (> 50% consensus): (1) Clinical Learning Environment Review (CLER; 66%, n = 176); (2) ACGME web data, site visit reports, and LONs (65%, n = 173); (3) faculty and resident scholarly activity (64%, n = 171); (4) institutional action plans and progress (63%, n = 169); (5) special program reviews (57%, n = 151); (6) in-training examination and board pass rates (54%, n = 143); and (7) faculty development activities (50%, n = 134; Figure).

Figure

Key Performance Indicators and Data Elements Deemed Essential From Survey


The majority of DIOs reported that they devoted either between 5 and 10 hours (33%, n = 88) or between 10 and 20 hours (30%, n = 79) to complete the AIR. A greater number of sponsored programs was associated with more hours preparing the AIR by the DIO (χ2 = 25.985, P = .011) and the GME office (χ2 = 34.89, P < .001). Fewer than one-third of SIs reported diversity data; however, providing diversity data was significantly associated with the number of GME programs (χ2 = 17.784, P < .001). The majority of DIOs (68%, n = 162) felt the AIR added substantial value to the SI. The Table shows DIOs' written comments on whether the AIR added value; comments were analyzed by the authors and grouped into categories of similar responses.

Table

Participants Endorsing Each Category of Value by Annual Institutional Review


Discussion

In addition to the core ACGME requirements for the AIR, our survey of DIOs (provided as online supplementary data) identified 7 additional key performance indicators that achieved greater than 50% consensus. Of note, DIOs identified the CLER written report and annual assessment as the most important institutional performance measure, with a higher percentage of responses than individual program performance or resident outcomes such as board pass rates. Surprisingly, DIOs also included faculty development activities and results from special program reviews among the most important performance indicators.

In the sole peer-reviewed publication describing a single institution's AIR process and development, Amedee and Piazza discussed 8 performance measures that were reviewed for each program and presented to the GMEC.3 Of their measures, board certification was the only additional one that also emerged as a key indicator in our survey results. The authors also noted that they relied on their CLER site visit and written report as a component of "excellence in accreditation." This finding is consistent with our survey results, which identified CLER as an important institutional performance measure.

The ACGME requires individual programs to evaluate the diversity within their training programs in an effort to promote faculty and trainees who mirror the community in which they practice.7 Despite the increased attention to diversity within the GME community, only a minority (28%) of SIs included diversity data in their AIR. For those SIs that reported diversity data, DIOs relied on a variety of sources to collect the data, including self-report, the GME office, human resources, the Electronic Residency Application Service (ERAS), program reports, an office of diversity and inclusion, or an electronic residency database. Interestingly, providing diversity data in the AIR was directly linked with the size of the SI. In fact, 54% of SIs with more than 50 sponsored programs reported diversity data, while only 21% of SIs with fewer than 10 sponsored programs did so. This result suggests that larger institutions may have more resources to accomplish this task, though other possibilities include geographic variability, with larger institutions in more diverse metropolitan areas, or mandated reporting requirements at larger private or public institutions.

Our national survey of DIOs was limited by the specific questions included, and other aspects of the AIR may not have been adequately captured. Several survey questions restricted responses to prespecified lists or ranges, though many questions did allow for text entry. Additionally, the preponderance (60%) of respondents were from small SIs (< 10 programs). While DIO views from medium and large institutions may be underrepresented, this breakdown in institutional size is consistent with data reported by the ACGME.8 Finally, due to limitations of the web-based survey instrument, an individual DIO could potentially have completed the survey more than once by accessing it from a device with a different internet protocol address.

Conclusions

This survey provides meaningful insight into key performance measures and processes that DIOs deem important in completing the AIR. Our results show a wide range of responses, though consensus was achieved on 7 key performance measures.

References

1. Accreditation Council for Graduate Medical Education. ACGME Institutional Requirements. 2020.
2. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system—rationale and benefits. N Engl J Med. 2012;366(11):1051-1056.
3. Amedee RG, Piazza JC. Institutional oversight of the graduate medical education enterprise: development of an annual institutional review. Ochsner J. 2016;16(1):85-89.
4. Andrada J, Teo J, Neo J, Yeo H, Leng LB. Putting time in perspective: an integrated graduate medical education institutional dashboard and report card. J Grad Med Educ. 2019;11(4 suppl):169-176.
5. Rose SH, Long TR. Accreditation Council for Graduate Medical Education (ACGME) annual anesthesiology residency and fellowship program review: a "report card" model for continuous improvement. BMC Med Educ. 2010;10:13.
6. Phitayakorn R, Levitan N, Shuck JM. Program report cards: evaluation across multiple residency programs at one institution. Acad Med. 2007;82(6):608-615.
7. Accreditation Council for Graduate Medical Education. ACGME Common Program Requirements (Residency).
8. Accreditation Council for Graduate Medical Education. Data Resource Book Academic Year 2019-2020. 2020.

Author notes

Editor's Note: The online version of this article contains the survey used in the study.

Shayla Amos, BS, is a Medical Student, Mercer University School of Medicine; Jean B. Wiggins, BS, is Research Coordinator, Memorial Health University Medical Center; Eric K. Shaw, PhD, is Professor of Community Medicine, Mercer University School of Medicine; and William N. Hannah Jr, MD, FACP, is Associate Professor of Medicine and Assistant Dean, Mercer University School of Medicine, and Designated Institutional Official and Executive Director of Medical Education, Memorial Health University Medical Center.

Funding: This research was supported (in whole or in part) by HCA Healthcare and/or an HCA Healthcare affiliated entity.

Conflict of interest: The authors declare they have no competing interests.

This work was previously presented at the AAMC Group on Resident Affairs and Organization of Resident Representatives Spring Meeting, April 26–28, 2020.

The views expressed in this publication represent those of the author(s) and do not necessarily represent the official views of HCA Healthcare or any of its affiliated entities.
