The June 2015 issue of the Journal of Graduate Medical Education introduced the self-study as a key component of a new accreditation system, emphasizing the focus on ongoing improvement that goes beyond compliance with the accreditation standards.1 The article highlighted the existing annual program evaluation as the basic building block of program improvement, with a more detailed self-study every 10 years, followed by a full accreditation site visit.
At the heart of this new approach is having programs focus on program-level aspirational aims that take into consideration the needs and career plans of their trainees and of the patients and communities they serve.1 Programs are also asked to assess the external environment in which they operate.1 Accreditation Council for Graduate Medical Education (ACGME) data show that most programs on continued accreditation have very few or no citations,2 making an improvement approach based on citations and areas for improvement identified by the Review Committees (RCs) ineffective. In contrast, having programs identify and prioritize areas for improvement through what is essentially an abbreviated, program-focused strategic planning exercise can facilitate improvement in the new system. This process is relevant to all programs on continued accreditation, including those whose self-study is not scheduled until the next decade.
The concluding step of this ongoing effort of program assessment and improvement is a more formal self-study every 10 years, with a full accreditation site visit 12 to 18 months later. This article discusses the link between the annual program evaluation, the self-study, and the 10-year accreditation site visit—how these elements collectively will promote improvement in programs and how the ACGME plans to provide feedback on this process with the intent of accelerating improvement.
Connecting the Annual Program Evaluation and the Self-Study
All programs are encouraged to begin this improvement process at the time of their next annual program evaluation, with a focus on program aims and context. In this way, programs whose self-study is still years away can benefit from the new process and will be able to demonstrate ongoing program evaluation and improvement efforts at the time of their 10-year accreditation site visit. The ideal approach is to create a “track record” of improvement, a simple data set showing action plans and the improvements achieved. This information has uses well beyond the accreditation process. For example, it can be shared with applicants and residents, as well as faculty candidates, who often do not have a sense of what improvements a program has made to its learning experience or its clinical environment. Furthermore, keeping a record does not need to be onerous, and the ACGME3 and a number of sponsoring institutions have developed simple forms that programs can use to track improvements.
This information on prior improvements, along with areas that still need to be addressed, is the foundation for the discussion of strengths and areas for improvement during the program self-study. A related element of the self-study asks the self-study group to look back and compile a 5-year account of changes in the program; this is paired with a request that the group look 5 years into the future and answer the question, “What will take this program to the next level?” These sections of the self-study are intended to foster discussion among program and department leaders, faculty, trainees, and other pertinent stakeholders about which improvements would be a good fit for the program's aims and current context.
Areas for improvement identified in the self-study should ideally include both short-term and long-term objectives, and should identify and take into account the perspectives of the various stakeholders. The environmental assessment is intended to enhance a program's focus on factors that may affect future performance, rather than a focus solely on past drivers of, and barriers to, program success. During the self-study, the self-study committee should also record and celebrate key strengths. Information on program strengths is reported to the ACGME in the self-study summary. In contrast, information on areas for improvement is treated as confidential quality improvement information and is not shared with the ACGME. At the conclusion of the self-study, program leaders are asked to upload a summary of their self-study through the Accreditation Data System. This succinct document covers all dimensions of the self-study, with the exception of areas for improvement.
Collecting and Showcasing Improvements Made as a Result of the Self-Study
A program's 10-year site visit is scheduled 12 to 18 months after the program has uploaded its self-study summary. The ACGME added this interval to allow programs to make improvements before the site visit. Priorities set by program leadership and stakeholders should guide the selection of the areas for improvement, which should link to the program aims and context explored during the self-study. The rare exception will be a program with an active citation that still needs to be resolved, or one that has identified areas for improvement in which program leaders deem the program is not currently meeting the accreditation standards.
For the 10-year site visit, program leadership will prepare a summary of achievements, designed to record and present improvements in the areas identified during the self-study. For some longer-term objectives, demonstrating improvement will benefit from the self-study group's efforts to identify leading indicators that can be assessed in the 12- to 18-month interval between the self-study and the 10-year site visit. An example is the finding that improvement in in-training examination scores is an early (leading) indicator of improvement in board examination performance (the lagging indicator).4 The ACGME will not ask programs to provide information on areas identified during the self-study that have not yet resulted in improvements. The ACGME expects that many performance deficiencies in areas with an accreditation standard would already have been identified through the annual data review in the most recent period prior to the 10-year site visit.
Feedback on Improvement as a Process
During the 10-year visit, both feedback from site visitors at the conclusion of the site visit and subsequent RC feedback in the letter of notification will focus on the improvement process, not the areas for improvement the given program has chosen. In addition, RC feedback will be solely formative for at least 5 to 7 years to allow the RCs, the ACGME, and the graduate medical education (GME) community to gain new knowledge about robust ways to make improvements. The ACGME plans to collect information on best practices that are transferable to other programs, with attention to both general practices applicable to all programs and activities suited to subgroups, such as programs in a given specialty, very large core programs, clinically focused 1-year subspecialty programs, and other relevant groups.
A Simple Tool to Assess the Robustness of Programs' Improvement Processes
To provide a more robust approach to giving feedback on the program improvement processes resulting from the annual program evaluation and the self-study, ACGME staff involved in self-study pilot visits for more than 300 programs designed a simple evaluation tool. The Program Improvement Assessment Tool (PIAT) consists of 4 dimensions relevant to program improvement (box).
Box: The 4 Dimensions of the Program Improvement Assessment Tool (PIAT)
1. Linking improvements to program aims and environmental context
2. Executing the plan-do-study-act (PDSA) cycle
3. Management and tracking of improvement data
4. Stakeholder involvement and engagement in improvement activities
The 4 dimensions reflect both core knowledge in improvement science and existing ACGME program requirements, such as the requirement to track action plans resulting from the annual program evaluation. The first dimension, linking improvements to program aims and context, addresses a core concept of the self-study: fostering improvement in areas that are most relevant to an intentional design of trainees' learning environment and learning experience. The second dimension, executing the PDSA cycle, is a core expectation in program improvement, codified in the ACGME Common Program Requirements,5 with data from the assessment of programs' current annual program evaluations showing that early or inadequate efforts often are characterized by improvement cycles arrested at the “plan” phase. The third dimension, management and tracking of improvement data, is critical to producing a track record of improvements; it allows program leadership and stakeholders to understand what improvements have been made and documented over time, whether further refinements are needed, and what data may constitute early indications of future improvement. The ultimate goal is a record spanning multiple years of improvement. Finally, the fourth dimension relates to stakeholder involvement in the improvement process, including stakeholder input in prioritizing areas for improvement, with the intent of increasing the robustness, utility, and relevance of a given program's improvement activities.
Ultimately, each of the 4 dimensions will have 5 levels that indicate the maturity of the improvement process. Feedback using the PIAT is intended to get a program's improvement effort “to the next level,” and at each level there are specific, actionable elements for consideration by the program. This aspect of the tool is currently undergoing validation, with 2 phases of study planned: the first entails establishing content validity using review feedback from quality improvement experts; the second entails field testing with the remaining programs in the ACGME's self-study pilot. It is anticipated that the final tool will be available in early summer 2017.
The PIAT is intended for 3 types of use: (1) self-assessment by programs of the maturity of their improvement efforts; (2) a shared mental model of improvement that ACGME field staff can use to provide actionable feedback at the conclusion of the 10-year site visit; and (3) consideration by RCs in providing formative feedback on improvement in programs' letters of notification.
Three Anticipated Benefits of the Program Self-Study
The ACGME's approach to the self-study, along with the 10-year site visit, is expected to have 3 important benefits for the GME community. The first is that it will foster improvement and excellence in the GME enterprise at the national level, representing a significant gain over a goal of mere compliance with minimum standards. The second benefit is that by setting aims as part of their program evaluation and self-study, and by defining activities that will further these aims, programs will engage in more intentional design, with the competencies and skills needed and desired by future graduates and the health care needs of patients and populations serving as key considerations. This will contribute to a GME system that meets the expectations of patients, health care systems, and the public, as well as trainees' expectations and considerations in selecting a program. The third benefit is that, over time, the GME community and the ACGME will glean and disseminate information on the aims and improvement priorities of programs within and across accredited specialties and subspecialties. This information has not existed until now, and it is envisioned to be of high value in further focusing and accelerating improvements in GME.
The ACGME plans to continue to disseminate information about the self-study and the 10-year accreditation site visit through its website, meetings and webinars, and the Journal of Graduate Medical Education, including associated learning and best practices for adoption or adaptation.
References
Author notes
Ingrid Philibert, PhD, MBA, is Senior Vice President, Field Activities, Accreditation Council for Graduate Medical Education, and Executive Managing Editor, Journal of Graduate Medical Education.