Abstract
The Accreditation Council for Graduate Medical Education (ACGME) expects programs to engage in ongoing, meaningful improvement, facilitated in part through an annual process of program evaluation and improvement. The Duke University Hospital Office of Graduate Medical Education (OGME) applied an institutional practice-based learning and improvement strategy to strengthen the annual evaluation and improvement of its programs.
The OGME implemented several strategies, including the development and dissemination of a report template, program director and coordinator development, a reminder and tracking system, incorporation of the document into internal reviews, and use of incentives to promote program adherence.
In the first year of implementation (summer 2005), 27 programs (37%) submitted documentation of their annual program evaluation and improvement to the OGME; this increased to 100% of programs by 2009. A growing number of programs elected to use the template in lieu of written minutes. The number of citations related to required program review and improvement decreased from 12 in a single academic year to 3 over the last 5 years.
Duke University Hospital's institutional initiative to incorporate practice-based learning and improvement resulted in increased documentation, greater use of a standardized template, fewer ACGME-related citations, and enhanced consistency in preparing for ACGME site visits.
Editor's Note: The online version of this article features the most recent version of Duke University Hospital's Office of Graduate Medical Education template for their Annual Program Evaluation and Improvement Plan.
Introduction
The Accreditation Council for Graduate Medical Education (ACGME) requires programs to engage in an annual process of program evaluation and improvement.1 As outlined in Section V.C. of the Common Program Requirements, programs “must document formal, systematic evaluation of the curriculum at least annually,” by monitoring and tracking “resident performance; faculty development; graduate performance; and program quality.” Residents and faculty must have “the opportunity to evaluate the program confidentially and in writing at least annually.” When deficiencies are found, ACGME expects a “written plan of action to document initiatives to improve performance” that “should be reviewed and approved by the teaching faculty and documented in meeting minutes.”1
The Duke Office of Graduate Medical Education (OGME) believes that the practice-based learning and improvement approach can be extended to institutional sponsors of graduate medical education, and that this helps programs exceed ACGME minimum standards. Although the ACGME requires a written plan of action only when deficiencies are found, the OGME expects that high-performing programs will also benefit from a systematic opportunity to identify enhancements and commit to a specific action plan.
To date, few articles in the literature explore graduate medical education (GME) program evaluation and improvement. An Internet search for the term “residency annual program improvement” returned 88 500 pages or sites. Most of these results were descriptions of the program evaluation process, and nearly all discussed evaluation at the program level, not from an institutional perspective. Prior research has described the benefits of institutional guidelines for program evaluation on program improvement. In 2006, Musick2 recommended that “[a] practical task oriented approach will assist program directors in ensuring compliance with program evaluation standards.” He identified 5 necessary steps regarding the evaluation: identifying (1) its need, (2) its focus, and (3) its methods; (4) determining how and when to present documentation results; and (5) actually documenting the results. He described how the requirement for annual program evaluation and improvement can be met through steps 4 and 5: the meeting can be the “stage” for how and when to present the documentation results, and the written action plan can be its documentation.
The University of New South Wales instituted a comprehensive, multicomponent, program-wide evaluation and improvement system.3 Although focused on undergraduate medical education, the approach evaluates 4 program components and can be adapted to GME. The 4 components addressed in the approach are curriculum and resources, staff and teaching, student experience, and student and graduate outcomes. For each component, the study identified a few key indicators of quality. For example, student satisfaction and student perceptions of the teaching materials' quality were proxies for learning and teaching. The description of the assessment approach notes that “The key principles of the adopted approach include the views that both student and staff experiences provide valuable information; that measurement of students and graduate outcomes are needed; that an emphasis on action after evaluation is critical (closing the loop); [and] that the strategies and processes need to be continual rather than episodic….”3
Descriptions of institutional approaches to improving GME program performance include those of Heard et al,4 who used an annual resident survey fielded at the institutional level. Substandard programs were required to submit action plans to the Graduate Medical Education Committee (GMEC). In the baseline year, programs met standards for 55.2% of the items surveyed. One year later, after submission and implementation of action plans, programs met standards for 80.6% of the items. Programs sustained these improvements and were rewarded with subsequent positive accreditation outcomes. Heard et al4 concluded that an institutional approach to monitoring could improve educational quality as assessed by residents and accreditation success.
In another study,5 the Mayo Clinic in Jacksonville, Florida, developed a scorecard based on 4 broad areas: research, teaching, patient service, and a general category. This scorecard has been used since 2004 and is reported to be a helpful tool for program evaluation. The authors concluded that “The overall mean score of the GME programs increased, [suggesting] a positive trend.”
The literature suggests that individual programs benefit when institutions provide resources, tools, and data support for educational improvement. The aim of our effort was to use the annual program evaluation and improvement process required by the ACGME to derive a similar benefit for our programs. The practical, task-oriented approach of Musick2 could be extended to facilitate an inventory that would allow programs to assess which ACGME-required components were already in place, identify gaps, and prioritize enhancements.
Objective
Duke University Hospital is the sponsoring institution for more than 900 residents training in 73 ACGME-accredited programs, 1 nonaccredited combined program, and more than 50 non–ACGME-accredited programs. In the 2004–2005 academic year, Duke University Hospital discovered an opportunity for improvement when 12 programs received citations from various ACGME Residency Review Committees pertaining to inadequacies in the required annual review. These included citations for a lack of meaningful review, failure to include key stakeholders (usually at least 1 resident), and inadequate, incomplete, or nonexistent documentation that the reviews occurred.
Methods
In response, OGME developed a series of recommendations and brought them to the institution's GMEC for approval. The recommendations included:
development of a template for the written annual evaluation and improvement report;
education for program directors and coordinators;
implementation of a reminder and tracking system;
incorporation of the annual review report into the internal review process; and
incentives for compliance.
The OGME developed an institutional template and required all programs to send documentation of their annual program evaluation and improvement to the OGME each year, recommending that programs use the template because it included a checklist of key elements, such as the individuals who needed to participate, the items that needed to be reviewed, and the aggregate data that programs should be tracking. The template prompts programs to identify 1 or more opportunities for improvement and to develop action plans.
The GMEC endorsed the recommendations and required programs to submit their documentation annually. However, the GMEC opted to let programs choose whether to use the OGME template or substitute their meeting minutes, as long as the minutes included the key elements outlined in the preceding paragraph. The OGME initiated ongoing faculty and program coordinator development. In addition, OGME staff contact program directors and coordinators by e-mail each summer to remind them to submit their document by the specified deadline. One or more reminders are sent, and program compliance is tracked.
As an incentive, compliance with the requirement to submit an annual program evaluation and improvement document is 1 of 4 dashboard metrics in a “GME Dean's Report Card,” which the OGME compiles annually and provides to the Duke GME Oversight Committee. This committee makes recommendations for allocating funds to resident stipends and to program director and program coordinator financial support. Releasing these data by department encourages program directors to submit the documentation.
The template has been revised in response to feedback from the programs and to incorporate changes in institutional and/or common program requirements, as well as areas emphasized in the ACGME Program Director's Guide to the Common Program Requirements, especially Section V.C.6
Results
Use of the Template in Program Evaluation and Improvement
As a result of the enhancements to the program evaluation process described previously, the number of programs submitting a completed annual program evaluation and improvement document to OGME has increased and the number of ACGME Residency Review Committee citations in this category has declined (table).
Duke University Hospital's Implementation of an Annual Program Evaluation and Improvement Template

The figure shows that since the development of the template and the expectation that it (or comparable documentation) be submitted annually to the OGME, compliance with the required annual review has increased and ACGME citations pertaining to program review have markedly decreased. By 2008, 68 of 74 programs (92%) had submitted their written plan to the OGME, and programs have increasingly elected to use the OGME template in lieu of written minutes.
Total Number (N = 74) of Annual Program Evaluation and Improvement Plans Received by the Office of Graduate Medical Education, 2005–2009
Additional Uses of the Template to Promote Program Evaluation and Improvement
Incorporating this Analysis into the Internal Review Process
The OGME has incorporated the program evaluation document into the midcycle internal review, and the data elements it contains significantly inform and enhance the internal reviews. The review team can use the template to assess how effectively the program has engaged in self-reflection, identified actionable items, implemented them, and, most importantly, closed the loop by assessing the outcome and impact. Based on this analysis, some programs have added new rotations; other programs have deleted rotations or made them electives. Several programs improved their process for monitoring resident care of certain types of patients or within certain settings, such as nursing homes, and others have benchmarked care outcomes against national norms. Programs were able to assess the impact of these initiatives the subsequent year.

Based on this experience, we strongly believe that findings from the internal review should be followed up in the annual program evaluation to allow an assessment of progress on addressing citations or any other action items identified by the ACGME and the GMEC. Our internal reviews are of high quality, are comprehensive, and commonly include several recommendations for improvement. For example, the annual report template prompted a major change to an inpatient rotation recommended by an internal review; the annual evaluation then allowed the program and institution to monitor this change and determine whether it had achieved the desired outcome. Similarly, a decline in standardized test scores led to 2 different targeted interventions to improve medical knowledge, monitored through test scores in the subsequent year. Accordingly, a component of the OGME template requires that programs list the date of and action plan items from the previous internal review and describe the progress to date.
Preparing for the ACGME Site Visit and the Program Information Form
The use of the template also helps with preparation for the ACGME accreditation site visit, as it offers answers to questions in the Program Information Form (PIF) that each program must complete. For example, the template specifically asks programs to document their (1) graduate board pass rates, (2) program improvement efforts based on program evaluation, and (3) program improvement efforts related to items in the ACGME Resident Survey.7 The OGME program review template has increased programs' awareness of the information that should be captured longitudinally and facilitates completion of the PIF. Programs that had not previously determined the board pass rates of their graduates have found it useful to begin collecting these data for the most recent 2 to 3 cohorts. The data captured in the template are then relatively easy to add to the program's database each year, although the effort varies depending on whether programs receive this information automatically, must request it from their respective boards, or must contact graduates directly.
The OGME template prompts programs to identify areas for improvement and to describe progress on the improvement plan(s) from the prior year. Programs are encouraged to use resident performance data and ACGME Resident Survey data to identify program enhancement opportunities. Programs should be able to identify at least 1 or 2 areas for improvement each year. When completing the PIF, programs can then select among 5 to 10 improvements (for a program with a 5-year cycle length) to present at the accreditation site visit. Without recording this information annually, programs may find it challenging to recall these improvements retrospectively while preparing for the site visit.
The OGME's annual request for each program's annual program evaluation and improvement document may also remind the program to schedule its annual program review. Ensuring that the OGME has a copy of the review is useful when there are educational and administrative changes, program director or program coordinator turnover, or a limited paper trail. A new program director can frequently obtain a concise history of the program by requesting copies of the past several years' documents from the OGME.
Faculty and Coordinator Development
The processes and requirements put into place by the OGME to meet the ACGME annual program evaluation and improvement requirement have also served as tools for faculty and coordinator development. The template allows faculty to “practice what we preach” by implementing a process of continually using data and feedback to identify ways to improve; after all, this is the same practice-based learning and improvement competency that residents must learn. In addition, the template and program review process help ensure that faculty and residents are involved in program review and improvement beyond the once-yearly confidential program evaluation.
Enhancing Resident Education
Finally, annual program evaluation and improvement is a key strategy for enhancing the education of residents. Soliciting confidential written resident evaluations as part of the annual program evaluation and improvement engages residents and is an opportune time to clarify for them how their feedback is essential to continuous educational quality improvement. Specifically, it is their “opportunity to confidentially evaluate the overall program yearly” (question 8 in the ACGME Resident Survey)7 and their “opportunity to assess the program for purposes of Program Improvement” (question 15 in the ACGME Resident Survey).7 We have found that this is not always clear to residents, and it has been helpful to specifically identify this activity as an opportunity for their assessment.
Preparing the Designated Institutional Official's Annual Report to the Governing Body
The template may facilitate an organized approach to presenting a snapshot of individual programs in the ACGME-required annual report that the designated institutional official prepares for the governing body. It provides a standardized approach to collecting data, yielding documentation that can highlight best practices and identify opportunities for institutional and/or program improvement.
Discussion
The OGME initiated an institutional response to frequent citations of our programs by developing and implementing a comprehensive template for written documentation of annual program evaluation and improvement. The broader initiative informs program directors and coordinators about the elements of the annual evaluation process, provides a reminder and tracking system, and offers incentives that hold programs accountable for the reviews. In addition, the template provides an easy checklist that helps ensure programs include the required components in their annual program evaluation and improvement process.
The major drawback of the template was the early concern that it was “just one more thing” being added to the requirements for our busy programs. When it became clear that programs would retain the choice of how to provide the OGME with the documentation (previous format or new template), most of these concerns were allayed.
Program directors were also anxious about reporting data on compliance with the annual program review process as part of their chair's report card. When program directors learned that only compliance with the reporting requirement, not the “quality” of the reports, was being judged and that, in truth, this was “an easy A,” most felt reassured.
PIF preparation was facilitated by having 3 to 5 annual program evaluation documents already prepared by the time an upcoming site visit was announced. Some program directors have reported an advantage in having the institution prompt them to convene their meeting(s) should scheduling have been inadvertently overlooked. Several program directors who still opt for traditional minutes state that they nonetheless use the template as an outline for their meeting. Finally, when we have had infrequent but abrupt program director turnover due to illness or death, the newly appointed program director benefited from having relatively contemporary documentation of program quality and direction.
Future plans include reviewing the content of submitted documents more critically to identify and disseminate best practices among our programs. We would also like to assess more systematically whether institutional resources deployed for commonly identified challenges have truly addressed the need.
Conclusion
Duke University Hospital's initiative to improve its programs' process of program evaluation and improvement, which mandates annual submission of either the template or meeting minutes to the OGME, has resulted in 100% of our programs having evidence of this activity, increased use of a standardized template, fewer ACGME citations, and easier documentation for subsequent accreditation site visits. Requiring submission of the template or meeting minutes has also allowed the OGME to collect and analyze institution-wide data that are useful for institutional and/or program improvement. The OGME reviews the documents to identify best practices for dissemination, as well as program challenges. Many challenges are common across programs, and the OGME has used them as a “needs assessment” to develop strategies that provide added support. Examples of this support include focused program director and coordinator workshops, a variety of specific assessment tools, and assistance with collecting aggregate data across multiple programs, such as postmatch and graduate surveys.
References
Author notes
Kathryn M. Andolsek, MD, MPH, is Professor, Community and Family Medicine, Duke University School of Medicine, and Associate Director, Duke Graduate Medical Education Office, Duke University Hospital; Alisa Nagler, JD, EdD, is Assistant Professor of Medical Education, Office of Graduate Medical Education, Duke University School of Medicine; and John L. Weinerth, MD, FACS, is Designated Institutional Official and Director, Graduate Medical Education, Duke University Hospital, and Professor, Department of Surgery, and Associate Dean, Graduate Medical Education, Duke University School of Medicine.
The authors would like to thank Ms. Tammy Tuck, OGME Chief of Staff, and Ms. Leslie Johnson, GMEC Specialist, for their work in implementing the internal review template.