Background
The Accreditation Council for Graduate Medical Education (ACGME) requires programs to engage annually in program evaluation and improvement.

Objective
We assessed the value of creating educational competency committees (ECCs) that use successful elements of 2 established processes—institutional special reviews and institutional oversight of annual program evaluations.

Methods
The ECCs used a template to review programs' annual program evaluations. Results were aggregated into an institutional dashboard. We calculated the costs, sensitivity, specificity, and predictive value by comparing programs required to have a special review with those that had ACGME citations, requests for a progress report, or a data-prompted site visit. We assessed the value for professional development through a participant survey.

Results
Thirty-two ECCs involving more than 100 individuals reviewed 237 annual program evaluations over a 3-year period. The ECCs required less time than internal reviews. The ECCs rated 2 to 8 programs (2.4%–9.8%) as “noncompliant.” One to 13 programs (1.2%–14.6%) had opportunities for improvement identified. Institutional improvements were recognized using the dashboard. Zero to 13 programs (0%–16%) were required to have special reviews. The sensitivity of the decision to have a special review was 83% to 100%; specificity was 89% to 93%; and negative predictive value was 99% to 100%. The total cost was $280 per program. Of the ECC members, 86% to 95% reported their participation enhanced their professional development, and 60% to 95% believed the ECC benefited their program.

Conclusions
Educational competency committees facilitated the identification of institution-wide needs, highlighted innovation and best practices, and enhanced professional development. The cost, sensitivity, specificity, and predictive value indicated good value.

What was known and gap

The Accreditation Council for Graduate Medical Education requires programs to engage annually in program evaluation and improvement; the requirements for institutional oversight of this process are still evolving.

What is new

Use of educational competency committees for institutional oversight, adapting elements of the internal review.

Limitations

Single institution study; survey instrument lacks established validity evidence.

Bottom line

Educational competency committees offer good value by helping identify underperforming programs and institution-wide improvement needs and enhancing professional development.

The Accreditation Council for Graduate Medical Education (ACGME) requires programs to engage in annual program evaluation and improvement by a designated team1 and, in some cases, stipulates follow-up.2 While there are no requirements for sponsoring institutions to review the annual program evaluations, several programs3–8 and institutions9–11 reported benefits from internal reviews, which the ACGME previously required for institutional oversight. Beginning in 2012, Duke University graduate medical education (GME) leadership revised an annual program evaluation template that it required programs to complete12 and adapted beneficial elements from its prior internal review process.13 We reconceptualized our previous internal review teams as educational competency committees (ECCs), which we modeled after clinical competency committees (CCCs). The ECCs are multi-person teams charged to judge program performance, just as CCCs regularly review and judge residents' performance.

This study seeks to answer 2 primary questions: (1) Does this new process add value? (2) Are the costs of the process acceptable? We believed 1 important metric of success was to preemptively identify programs before the ACGME raised concerns, regardless of whether those concerns resulted in citations, requests for progress reports, or data-prompted site visits. Such concerns result in measurable additional time and effort for programs and institutions.

Duke University Hospital is the sponsoring institution for more than 80 ACGME-accredited and 70 non–ACGME accredited programs. Beginning in 2005, Duke's GME Committee required programs to submit their annual program evaluations for review by the section head of its Program Oversight Committee, who reports to the designated institutional official. In May 2012, in preparation for the next accreditation system, we streamlined our template to focus on the 5 required components of the annual program evaluation. We used the GME management system (MedHub) for submission and review. In 2014, ACGME resident and faculty surveys and institutional duty hour data were provided to ECCs (table 1).

table 1

Items Used by Educational Competency Committees in Their Reviews


Educational competency committees used the color codes from the hospital's clinical scorecards to judge each component: not compliant (red), minimum compliance (yellow), substantial compliance (green), and exceeds compliance (blue). The ECC composition mimics our previous internal review team structure and includes a program director, a program coordinator, a resident, and a member of Duke's Office of GME. The GME office position rotates among a senior administrator, a doctorally trained educator, and a former program director.

Like CCCs, ECCs use the wisdom of a team to reach consensus on performance. Each ECC member received 5 to 9 annual program evaluations electronically to review in advance of the meeting, using a checklist of performance criteria. At the meeting, the ECC developed a consensus rating that was entered directly into the GME management system. The Program Oversight Committee section head reviewed the results, applied the special review (SR) criteria, and identified programs required to have an SR. Programs received their results through the GME management system. Deidentified annual program evaluation summaries were aggregated into an institutional dashboard and trended by year (provided as online supplemental material). Columns indicate successes and opportunities for improvement at the program level, and rows indicate successes and opportunities for improvement at the institutional level.
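To make the dashboard construction concrete, the following is a minimal sketch of how color-coded ECC consensus ratings could be aggregated into such a grid; the rating scale mirrors the four levels described above, but the item names, program names, and use of pandas are illustrative assumptions, not our actual template or tooling.

```python
# Minimal sketch of aggregating ECC consensus ratings into an institutional dashboard.
# The four-level scale mirrors the color codes described above; item and program
# names are illustrative placeholders, not the actual review template.
import pandas as pd

SCALE = {
    1: "red (not compliant)",
    2: "yellow (minimum compliance)",
    3: "green (substantial compliance)",
    4: "blue (exceeds compliance)",
}

# One record per ECC consensus rating: program x review item.
ratings = pd.DataFrame([
    {"program": "Program A", "item": "Board pass rate", "rating": 3},
    {"program": "Program A", "item": "Resident survey", "rating": 2},
    {"program": "Program B", "item": "Board pass rate", "rating": 4},
    {"program": "Program B", "item": "Resident survey", "rating": 1},
])

# Columns = programs (scan down a column for program-level strengths and gaps);
# rows = review items (scan across a row for institution-wide patterns and trends).
dashboard = ratings.pivot(index="item", columns="program", values="rating")
print(dashboard.replace(SCALE))
```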

Educational competency committee members were surveyed annually, and their feedback was used to improve the subsequent year's process. The survey asked about the amount of time spent on pre-review and whether the experience enhanced the participant's professional development and benefited his or her own program. The survey questions were developed by the authors (K.M.A., A.N.) and were not formally tested.

The study was determined exempt by the Duke Medicine Institutional Review Board.

Calculating Sensitivity, Specificity, and Negative Predictive Value of Decision to Hold a Special Review

We considered the decision to require an SR as a “screening test.” We analyzed whether programs that underwent SRs were more likely to have ACGME citations, required progress reports, or data-prompted site visits. We constructed 2 × 2 tables of SR+ (SR required) versus SR− (SR not required) and cross-tabulated these with ACGME accreditation decisions (table 2).
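As a minimal sketch of this 2 × 2 calculation, the three metrics follow directly from the four cell counts. The counts below are illustrative placeholders chosen only to land near the 2014 figures reported later; the actual cells are summarized in table 4.

```python
# Sketch of the "screening test" framing: SR decision versus subsequent ACGME concern.
# The counts used here are illustrative, not the study's reported 2 x 2 cells.
def screening_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """tp: SR+ with a later ACGME concern; fp: SR+ without one;
    fn: SR- with a later concern; tn: SR- without one."""
    return {
        "sensitivity": tp / (tp + fn),                 # concerns preceded by an SR
        "specificity": tn / (tn + fp),                 # untroubled programs spared an SR
        "negative_predictive_value": tn / (tn + fn),   # SR- programs truly without concern
    }

print(screening_metrics(tp=5, fp=8, fn=1, tn=68))
# -> roughly 0.83 sensitivity, 0.89 specificity, 0.99 negative predictive value
```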

table 2

Results of Annual Program Evaluation Reviews


Cost of the Review

We calculated review costs using the educational relative value units our medical school assigns to faculty educational effort and to coordinators' administrative effort. The section head received a small stipend for overseeing this process, as previously provided for overseeing internal reviews. Resident time was valued using the reimbursement model for internal moonlighting.

Table 2 summarizes our experience from 2012 through 2014, the 3-year period of the next accreditation system's implementation. A total of 237 annual program evaluations were reviewed over this period, and more than 100 program directors, coordinators, and residents participated. The time ECC members spent reviewing materials before the meeting increased from 3 to 5 hours, while the meeting time decreased from 4 to 3 hours.

Educational competency committee members reported via the annual survey that their participation enhanced their professional development (95% in 2012, 86% in 2013, and 94% in 2014), and the majority reported that participation benefited their program (95% in 2012, 60% in 2013, and 88% in 2014).

We created specific thresholds for noncompliance, for example, a difference of 0.1 on the resident or faculty survey, board performance below the national rate, or failure to evaluate progress on the prior year's action plan. Based on their annual program evaluations, 6 programs (8%) in 2012, 2 (2%) in 2013, and 8 (10%) in 2014 received an overall rating of noncompliant. In 2012, we did not conduct special reviews because the criteria were still in development. A total of 8 SRs (10%) were conducted in 2013 and 13 (16%) in 2014. Thirteen programs (17%) were judged to have exceeded compliance in 2012, compared with 12 (15%) in 2013 and 1 (1%) in 2014.
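As a rough illustration of how such thresholds can be applied mechanically, consider the sketch below; the field names, the direction of the survey comparison, and the example values are our assumptions, not the ECC checklist itself.

```python
# Sketch of rule-based noncompliance screening. Field names and example values are
# illustrative assumptions; the survey rule assumes "0.1 or more below the comparison mean."
def noncompliance_flags(program: dict) -> list:
    flags = []
    if program["survey_score"] <= program["survey_mean"] - 0.1:
        flags.append("resident/faculty survey 0.1 or more below the mean")
    if program["board_pass_rate"] < program["national_pass_rate"]:
        flags.append("board performance below the national rate")
    if not program["prior_action_plan_evaluated"]:
        flags.append("no evaluation of progress on the prior action plan")
    return flags

example = {
    "survey_score": 3.9,
    "survey_mean": 4.1,
    "board_pass_rate": 0.85,
    "national_pass_rate": 0.90,
    "prior_action_plan_evaluated": True,
}
print(noncompliance_flags(example))
# -> flags the survey deviation and the board performance, but not the action plan
```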

The total cost of the review was $23,081.33 in 2013 ($285 per review) and $22,624.90 in 2014 ($276 per review; table 3), or approximately $280 per program over the 2-year period. The only additional funding required was $95 per program to support resident participation.
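For orientation, the per-review figures are simply the annual totals divided by the number of reviews; the counts of roughly 81 and 82 reviews used below are our back-calculation from the reported numbers, not figures stated in the text.

```python
# Back-of-envelope check of the reported per-review costs. The review counts
# (approximately 81 in 2013 and 82 in 2014) are inferred, not reported directly.
for year, total_cost, reviews in [(2013, 23081.33, 81), (2014, 22624.90, 82)]:
    print(year, round(total_cost / reviews))  # -> 2013 285, 2014 276
```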

table 3

Costs of Annual Program Evaluation Reviewsa


In 2013 and 2014, the sensitivity of the special review was 50% and 83%, the specificity was 92% and 89%, and the negative predictive value was 97% and 99% (table 4). The aggregated scorecard from 2012–2014 is provided as online supplemental material.

table 4

Sensitivity, Specificity, and Negative Predictive Value of Annual Program Evaluation Reviews


We successfully merged the institution's prior internal review process with its oversight of annual program evaluations to create the ECCs. Educational competency committee members perceived value from participating and predicted future benefit to their programs. The activity itself served as professional development. One participant noted that “it provided . . . an opportunity to see behind the curtain.”14

We believed we captured the essential components of ACGME's published program evaluation template.15 Although there is no validity evidence for our ECC assessments, the use of 9 to 12 assessments is known to increase validity in evaluating residents' performance.16 Our ECCs use at least 9 different assessments to evaluate program performance.

The ACGME has posted a resource for aggregating and monitoring action items from successive annual program evaluations.15 Our color-coded scorecards were valuable and familiar in our setting, as similar scorecards are used widely to report our health system's clinical performance. They facilitate rapid visual identification of program and institutional successes, opportunities for improvement, and trends.

Experts in medical education suggest that suboptimally performing residents fail to accurately assess their own performance, and that this inability to perceive their performance accurately impedes their ability to improve it.17 This also seemed true of our programs: those that reflected conscientiously on their performance submitted more robust action plans.

The number of SRs increased from 2012 through 2014, largely due to the inclusion of the resident survey and our selection of a threshold of concern of any deviation of ≥ 0.10 from the mean. We may have set the bar too high, in part because specialty-specific resident survey averages were not available. We continue to refine our calibration, to minimize the administrative burden on programs, and to identify those programs that benefit from the deeper dive of an SR.

We found it reassuring that our ECCs require 1 to 2 hours less than the 10 to 12 hours required for the prior internal reviews, and that programs spend far less time in preparation. We believe we can further optimize efficiency by assigning the GME associate director to review all programs' ACGME Accreditation Data System information and duty hour documentation and to report only concerns to the ECCs.

We attempted to determine the “educational sensitivity and specificity” of our decision to require an SR. The educational sensitivity of the SR increased after we added the resident and faculty surveys to the review. The specificity remained approximately 90%, and the negative predictive value was ≥ 97%. In 2014, the ACGME conducted an early site visit of 1 program that we did not identify as needing an SR. Other programs that subsequently underwent an ACGME site visit, or were asked to submit a progress report, benefited from the preparation facilitated through the SR. The positive predictive value rose slightly but remained low: we required 8 programs to have SRs that subsequently had no issues identified by the ACGME, but we prefer to hold our programs to a somewhat higher standard.

There are limitations to our study. Our experience reflects only a single institution. Costs will vary across institutions based on different faculty and administrative compensation models. Our survey lacked validity evidence, and ECC members may not have interpreted the questions as we intended.

In the coming year we will incorporate core faculty into ECCs, individually “debrief” ECC findings with all program directors as a mentoring tool, and add questions about program activities related to the Clinical Learning Environment Review to our template.

We merged elements of our prior internal review and annual program evaluation processes to create ECCs, which provided professional development through peer review and identified innovations and best practices. Sensitivity, specificity, and predictive value were acceptable.

References

1. Accreditation Council for Graduate Medical Education. Common Program Requirements. 2015:12–14. Accessed 2016.
2. Accreditation Council for Graduate Medical Education. Program Requirements for Graduate Medical Education in Pediatrics. Accessed 2016.
3. Dehghani P, Wood DA, Sharieff W, Basit N, Cheema AN. Interventional cardiology fellowship training in Canada: a report card using standardized criteria. Catheter Cardiovasc Interv. 2011;78(2):179–186.
4. Nadeau MT, Tysinger JW. The annual program review of effectiveness: a process improvement approach. Fam Med. 2012;44(1):32–38.
5. Reed DA. Nimble approaches to curriculum evaluation in graduate medical education. J Grad Med Educ. 2011;3(2):264–266.
6. Rose SH, Long TR. Accreditation Council for Graduate Medical Education (ACGME) annual anesthesiology residency and fellowship program review: a “report card” model for continuous improvement. BMC Med Educ. 2010;10:13.
7. Scolaro JA, Namdari S, Levin LS. The value of an annual educational retreat in the orthopedic residency training program. J Surg Educ. 2013;70(1):164–167.
8. Simpson D, Lypson M. The year is over, now what: the annual program evaluation. J Grad Med Educ. 2011;3(3):435–437.
9. Murray PM, Valdivia JH, Berquist MR. A metric to evaluate the comparative performance of an institution's graduate medical education program. Acad Med. 2009;84(2):212–219.
10. Heard JK, O'Sullivan P, Smith CE, Harper RA, Schexnayder SM. An institutional system to monitor and improve the quality of residency education. Acad Med. 2004;79(9):858–864.
11. Afrin LB, Arana GW, Medio FJ, Ybarra AF, Clarke HS Jr. Improving oversight of the graduate medical education enterprise: one institution's strategies and tools. Acad Med. 2006;81(5):419–425.
12. Andolsek KM, Nagler A, Weinerth JL. Use of an institutional template for annual program evaluation and improvement: benefits for program participation and performance. J Grad Med Educ. 2010;2(2):160–164.
13. Andolsek KM, Nagler A, Dodd L, Weinerth JL. Internal reviews benefit programs of the review team members and the program under review. J Grad Med Educ. 2010;2(4):604–609.
14. Personal communication via the computer. Durham, NC: Duke University Hospital; 2014.
15. Accreditation Council for Graduate Medical Education. Eight steps for conducting the ACGME program self-study. Accessed 2016.
16. Williams RG, Verhulst S, Colliver JA, Dunnington GL. Assuring the reliability of resident performance appraisals: more items or more observations? Surgery. 2005;137(2):141–147.
17. Hodges B, Regehr G, Martin D. Difficulties in recognizing one's own incompetence: novice physicians who are unskilled and unaware of it. Acad Med. 2001;76(suppl 10):87–89.

Author notes

Funding: The authors report no external funding source for this study.

Competing Interests

Conflict of interest: The authors declare they have no competing interests.

Editor's Note: The online version of this article includes an institutional dashboard of deidentified annual program evaluation summaries.

Supplementary data