Abstract

Background

Systematically engaging residents in quality improvement (QI) is challenging in large programs.

Objective

To coordinate a shared QI project in a large residency program using an online tool.

Methods

A web-based QI tool guided residents through a 2-phase evaluation of their performance of foot examinations in patients with diabetes. In phase 1, residents completed health record reviews with online data entry. Residents were then presented with personal performance data relative to peers and were prompted to develop improvement plans. In phase 2, residents again reviewed personal performance. Rates of performance were compared at the program and clinic levels for each phase, with data presented to residents. Acceptability was measured by the number of residents completing each phase. Feasibility was measured by estimated faculty, programmer, and administrator time and costs.

Results

Seventy-nine of 86 eligible residents (92%) completed improvement plans and reviewed 1471 patients in phase 1, whereas 68 residents (79%) reviewed 1054 patient charts in phase 2. Rates of performance of examination increased significantly between phases (from 52% to 73% for complete examination, P < .001). Development of the tool required 130 hours of programmer time. Project analysis and management required 6 hours of administrator and faculty time monthly.

Conclusions

An online tool developed and implemented for program-wide QI initiatives successfully engaged residents to participate in QI activities. Residents using this tool demonstrated improvement in a selected quality target. This tool could be adapted by other graduate medical education programs or for faculty development.

What was known

Lack of a feasible platform to manage and track improvement projects is a challenge in teaching residents practice-based learning and improvement.

What is new

An online tool for program-wide quality improvement initiatives resulted in improvement in a selected quality target.

Limitations

Data were self-reported; clinical improvements may be due to other elements of a larger educational intervention.

Bottom line

The online quality improvement tool was feasible and well-accepted by residents and could be adapted for other improvement targets.

Editor's Note: The online version of this article contains examples of performance improvement plans from residents.

Introduction

Resident involvement in quality improvement (QI) fulfills practice-based learning and improvement requirements and develops skills for future practice analysis.1 Several articles have described programs that involved residents in QI projects.2–5 Frequently cited challenges include limited faculty time, training, or funding; multiple competing educational and clinical demands; and voluntary participation by only a subset of residents.6,7 Limitations of electronic health records in aggregating data and providing performance reports may also reduce effectiveness.

Few large residency programs (those with more than 40 residents) have published results of QI efforts engaging all residents. Programs that have involved large numbers of residents in QI have relied on shared small group QI projects.8–11 Small groups encourage resident ownership; however, they may not engage trainees universally and may raise sustainability concerns given the number of projects generated. Multiple concurrent projects can also make it difficult to track and document resident involvement for objective improvement and accreditation visits.

We describe the development and implementation of a residency-wide project shared for 1 year across a large internal medicine program. A single project was undertaken to illustrate QI principles and to engage residents in a shared format using a novel, online tracking system.

Methods

Setting and Participants

The Duke Medicine Residency program includes 41 categorical and 9 preliminary interns and 86 categorical residents.

QI Project Development and Online Management

Categorical residents participate in 1 of 3 continuity clinics (A, B, and C). During postgraduate year (PGY) 1, all trainees participated in an online curriculum teaching basic QI vocabulary and processes. We developed an online, interactive experience using Microsoft SharePoint, an online collaboration tool, to guide PGY-2 and PGY-3 residents through an audit and feedback evaluation of performance on a quality metric chosen by program and resident leadership.

The online project occurred in 2 phases relative to the resident's creation of an "aims" statement. In phase 1, residents reviewed a web-based educational module describing the online tool, the current project and metric, information on creating aims statements, and use of the Plan-Do-Study-Act cycle.12 Residents then completed chart audits with online data entry. The database underlying the QI module is housed on a password-protected, internal Duke server. All data entered into the online tool were deidentified and did not include any protected health information.

During phase 1, residents reviewed the components of the complete foot examination of patients with diabetes and the underlying evidence as a slideshow embedded in the module. Residents then completed retrospective health record reviews for a minimum of 15 different patients with diabetes seen recently in their continuity clinic for nonacute visits and entered data into the QI interface on the foot examination components that had been completed and documented for each patient. Residents then were shown graphs of personal examination completion rates, as well as the aggregate performance of their peers at other clinics and the overall program for comparison (figure). To complete phase 1, residents developed an online individual performance improvement plan, including an aims statement and “next steps.”

In phase 2, which occurred in the second half of the academic year, residents reviewed their performance online, as in phase 1. Rather than create an improvement plan, residents commented on project successes, barriers to improvement, and next steps. Phase 2 was scheduled a minimum of 3 months from completion of phase 1 to allow residents time to implement improvement plans. Residents were provided 1 protected half-day during ambulatory blocks to complete each project phase. Phases could be completed at any location using virtual servers with access to electronic health records.

For the 2012–2013 academic year, the chosen QI measure was completion of foot examinations for patients with diabetes, including skin integrity, vascular, and monofilament examinations, as well as a “bundled” foot examination that incorporated all 3 component examinations. The completion rate of appropriate foot examinations was chosen because this measure had been tracked during academic year 2011–2012 at the suggestion of clinic faculty, providing baseline data for comparison. Throughout the academic year, faculty leaders presented clinic and program-level data to residents. Those data were distributed monthly to residents through lectures and program newsletters as well as online announcements.
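The article does not publish the tool's source code; the following minimal sketch (in Python with pandas, with invented column and variable names) illustrates how the bundled examination flag and the resident-, clinic-, and program-level completion rates described above could be derived from deidentified audit entries.

import pandas as pd

# Hypothetical audit records: one row per reviewed patient, with 0/1
# flags for each documented component of the diabetic foot examination.
audits = pd.DataFrame({
    "resident":     ["R1", "R1", "R2", "R2"],
    "clinic":       ["A",  "A",  "B",  "B"],
    "skin":         [1, 1, 0, 1],
    "vascular":     [1, 0, 1, 1],
    "monofilament": [1, 1, 0, 1],
})

# The "bundled" examination counts only when all 3 components were
# documented for the patient.
audits["bundle"] = (
    audits[["skin", "vascular", "monofilament"]].all(axis=1).astype(int)
)

# Resident-level completion rates (the personal graphs shown to trainees)
resident_rates = audits.groupby("resident")["bundle"].mean()

# Clinic- and program-level aggregates (the peer-comparison graphs)
clinic_rates = audits.groupby("clinic")["bundle"].mean()
program_rate = audits["bundle"].mean()

print(resident_rates, clinic_rates, program_rate, sep="\n")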

This study was approved by the Duke University Health System Institutional Review Board as exempt research given its use of deidentified data (Pro00046609).

Outcome Measures and Analysis

Acceptability of the SharePoint QI interface was measured by the number of residents completing each phase, the number of residents who completed both phases, and the total and average numbers of patients reviewed. The primary outcome was the change in the overall rate of bundled foot examination completion and documentation at the resident level between phases. Secondary outcomes included rates of component examinations at the resident level and bundled examinations at the clinic level. Feasibility was documented with estimates of programmer, administrative, and faculty costs. The rates of performance and documentation of the foot, vascular, skin integrity, and monofilament examinations, all 3 component examinations in aggregate, and the number of patients examined by each resident and clinic were summarized with descriptive statistics.

Resident-specific rates of all examinations were studied by phase, with subgroup analysis by clinic. To evaluate the effect of the QI intervention, the resident-specific change in examination rates during the intervention period (phase 2 minus phase 1) was calculated, with sensitivity analysis at the clinic level. Wilcoxon signed rank tests for nonparametric, paired samples were used to compare mean examination rates by phase in aggregate and within each clinic. In addition, χ2 tests were used to compare the distribution of participation by clinic and phase. Analysis of variance was used to compare the change in resident examination rates by clinic, with post hoc (Tukey) pairwise comparisons conducted as appropriate.

A 2-sided significance level of 0.05 was used for all statistical tests. Statistical analyses were conducted using SAS version 9.3 software (SAS Institute Inc).
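As an illustration only (the authors' analyses used SAS, and all values below are fabricated placeholders), the paired Wilcoxon comparison, the χ2 test of participation, and the analysis of variance with Tukey post hoc comparisons might look like this in Python.

import numpy as np
from scipy.stats import wilcoxon, chi2_contingency, f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical paired, resident-specific bundled-examination rates
phase1 = np.array([0.50, 0.45, 0.60, 0.55, 0.40, 0.65])
phase2 = np.array([0.70, 0.72, 0.75, 0.68, 0.66, 0.80])

# Wilcoxon signed rank test for nonparametric, paired samples
w_stat, p_wilcoxon = wilcoxon(phase1, phase2)

# Chi-square test comparing participation counts by clinic (rows A-C)
# across the 2 phases (columns)
participation = np.array([[30, 28], [25, 22], [24, 18]])
chi2, p_chi, dof, expected = chi2_contingency(participation)

# ANOVA on the per-resident change (phase 2 minus phase 1) across
# clinics, followed by Tukey post hoc pairwise comparisons
delta = phase2 - phase1
clinic = np.array(["A", "A", "B", "B", "C", "C"])
f_stat, p_anova = f_oneway(*(delta[clinic == c] for c in ("A", "B", "C")))

print(f"Wilcoxon P={p_wilcoxon:.3f}, chi-square P={p_chi:.2f}, ANOVA P={p_anova:.2f}")
print(pairwise_tukeyhsd(delta, clinic, alpha=0.05))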

Results

Acceptability

In phase 1 (July 1, 2012, through December 31, 2012), 79 of 86 eligible residents (92%) reviewed 1471 medical records (average, 18.6 records/resident). During phase 2 (January 1, 2013, through June 30, 2013), 68 of 86 residents (79%) reviewed 1054 medical records (average, 15.5 records/resident), and 65 of 86 residents (76%) completed both phases. Examples of individual improvement plans are shown in the box.

FIGURE

Sample Phase 1 Online Project Work Flow Box

BOX Example of a Resident Performance Improvement Plan

“Aim” Statement: I will increase my diabetic foot compliance (which includes all 3 components) by 10% in the next 3 months.

PLAN

I will flag all patients in my daily panel (excluding acute-care clinics) with diabetes and assess when they last received a FULL diabetic foot examination (including visual, monofilament, and vascular) and whether 1 or any component was missing in the past year. I will perform each [examination] at that visit [if missing] and document this in the [electronic health record].

TASKS

  1. Check patient panel for diabetes mellitus, and flag those who are diabetic.

  2. Review chart for all 3 components of examination: visual, monofilament, and vascular.

  3. Perform examination if all 3 have not been completed.

  4. Document in note.

  5. Document in health care maintenance tab to make it easy to find for future physicians.

PREDICTION

I will increase my rate of compliance with diabetic foot examination by 10% and have better documentation of each one.

Foot Examination Performance

Residents' completion rates of the foot examination bundle improved significantly after the intervention. At baseline, residents documented a complete foot examination in 52% (761 of 1471) of patients with diabetes, which increased to 73% (766 of 1054) after the intervention, a gain of 21 percentage points (a 40% relative improvement; P < .001). Aggregate resident foot examination rates are shown in the table. Participation did not differ significantly across clinics between phases.

TABLE

Participant Characteristics and Diabetic Foot Examination Rates by Type and Phase

All examination performance rates increased between phases. Completion rates of each component examination and of the bundled examination increased significantly between phases for both the A and B clinics, which are the continuity clinics for 88% (76 of 86) of upper-level residents (data not shown). For clinic C, only the rate of performing all 3 examinations together differed significantly between phases (data not shown). The improvements in performance rates of all 3 examinations (phase 2 minus phase 1) did not differ significantly across clinic sites (22% versus 20% versus 16%, P = .76).

Development and Administrative Costs

The costs of designing and implementing the tool and integrating the educational material were primarily related to programming, totaling 130 hours at an estimated cost of $100/h (approximately $13,000). Office staff tracked completion and sent reminders to residents, requiring 1 h/wk, which was done as part of regular work assignments. One faculty member (J.B.) spent 1 to 2 h/mo compiling and distributing data to residents, program leadership, and clinic preceptors in lectures and online formats.

Discussion

Overall, our audit and feedback module successfully engaged a large group of residents in an individual QI activity and enabled program leadership to measure participation and performance. To our knowledge, our interface is the only described tool that facilitates a single, shared project in a large program while tracking resident participation and providing real-time feedback.8–11 Because the costs and support required decrease after initial development of the interface and the website housing it, we believe a similar system could be adopted by other programs for distribution of performance data.

At the individual level, the interface allows residents to track personal performance and view peer comparisons for a selected quality target. We anticipate that feedback of resident performance data will be central to future accreditation processes, which will require programs to ascertain resident performance and share it meaningfully. This tool provides residents' data in near real-time, increasing the project's educational and QI impact. Furthermore, the interface could be queried during real-time, workplace-based assessments for the competencies of patient care and practice-based learning and improvement.

We anticipate our QI tool will be translatable to other projects. This interface is currently being adapted for a continuity clinic laboratory result follow-up project chosen by a group of residents and faculty. This project employs a similar audit and feedback format that could easily be adapted for additional projects or by other graduate medical education programs. Another potential use would be for clinical departments to track faculty projects for maintenance of certification or performance.

Clinically, the rates of performing visual inspection, monofilament examinations, and vascular examinations separately, and all 3 examinations together, increased after residents developed their individual performance improvement plans. This finding suggests that implementing this type of online QI module may be an effective method for improving residents' foot examinations of patients with diabetes. Unlike other resident QI projects involving diabetes metrics, this project was targeted to a large group and allowed for electronic recording and presentation of chosen goals.13–16 Application of the tool to other QI measures needs further exploration, as does repeat auditing to assess whether gains are sustained. Future projects could also include faculty mentoring to guide residents in developing and implementing QI plans, which may enhance effectiveness.

Our study has several limitations. First, data were self-reported, and we did not have the resources to validate the accuracy of resident-level entries. Second, creation of the interface required programming time and costs, which may not be feasible in all training settings. Third, the project focused on individual performance rather than a larger system. Finally, although we were able to review resident performance improvement plans, we were unable to provide direct mentorship on those plans, and it is possible that heightened awareness of foot examinations in patients with diabetes, rather than the performance improvement plans themselves, drove the increases shown.

Conclusion

A novel, online, educational audit and feedback tool successfully engaged residents in a program-wide improvement effort, documented their participation, and facilitated improvement in the documentation of foot examination components for patients with diabetes in a large residency program. The tool was acceptable to most residents and is translatable to future projects.

References

1. Gould BE, Grey MR, Huntington CG, Gruman C, Rosen JH, Storey E, et al. Improving patient care outcomes by teaching quality improvement to medical students in community-based practices. Acad Med. 2002;77(10):1011–1018.

2. Ogrinc G, Headrick LA, Mutha S, Coleman MT, O'Donnell J, Miles PV. A framework for teaching medical students and residents about practice-based learning and improvement, synthesized from a literature review. Acad Med. 2003;78(7):748–756.

3. Boonyasai RT, Windish DM, Chakraborti C, Feldman LS, Rubin HR, Bass EB. Effectiveness of teaching quality improvement to clinicians: a systematic review. JAMA. 2007;298(9):1023–1037.

4. Patow CA, Karpovich K, Riesenberg LA, Jaeger J, Rosenfeld JC, Wittenbreer M, et al. Residents' engagement in quality improvement: a systematic review of the literature. Acad Med. 2009;84(12):1757–1764.

5. Wong BM, Etchells EE, Kuper A, Levinson W, Shojania KG. Teaching quality improvement and patient safety to trainees: a systematic review. Acad Med. 2010;85(9):1425–1439.

6. Mosser G, Frisch KK, Skarda PK, Gertner E. Addressing the challenges in teaching quality improvement. Am J Med. 2009;122(5):487–491.

7. Varkey P, Karlapudi S, Rose S, Nelson R, Warner M. A systems approach for implementing practice-based learning and improvement and systems-based practice in graduate medical education. Acad Med. 2009;84(3):335–339.

8. Oyler J, Vinci L, Arora V, Johnson J. Teaching internal medicine residents quality improvement techniques using the ABIM's practice improvement modules. J Gen Intern Med. 2008;23(7):927–930.

9. Oyler J, Vinci L, Johnson JK, Arora VM. Teaching internal medicine residents to sustain their improvement through the quality assessment and improvement curriculum. J Gen Intern Med. 2011;26(2):221–225.

10. Voss JD, May NB, Schorling JB, Lyman JA, Schectman JM, Wolf AM, et al. Changing conversations: teaching safety and quality in residency training. Acad Med. 2008;83(11):1080–1087.

11. Weigel C, Suen W, Gupte G. Using lean methodology to teach quality improvement to internal medicine residents at a safety net hospital. Am J Med Qual. 2013;28(5):392–399.

12. Langley GL, Nolan KM, Nolan TW, Norman CL, Provost LP. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance. 2nd ed. San Francisco, CA: Jossey-Bass Publishers; 2009.

13. Fox CH, Mahoney MC. Improving diabetes preventive care in a family practice residency program: a case study in continuous quality improvement. Fam Med. 1998;30(6):441–445.

14. Coleman MT, Nasraty S, Ostapchuk M, Wheeler S, Looney S, Rhodes S. Introducing practice-based learning and improvement ACGME core competencies into a family medicine residency curriculum. Jt Comm J Qual Saf. 2003;29(5):238–247.

15. Holmboe ES, Prince L, Green M. Teaching and improving quality of care in a primary care internal medicine residency clinic. Acad Med. 2005;80(6):571–577.

16. Halverson LW, Sontheimer D, Duvall S. A residency clinic chronic condition management quality improvement project. Fam Med. 2007;39(2):103–111.

Author notes

All authors are in the Department of Medicine, Duke University Health System. Joel C. Boggan, MD, MPH, is Chief Resident for Quality and Safety, Durham Veterans Affairs Medical Center; George Cheely, MD, MBA, is Medical Director for Care Redesign, Duke Hospital Medicine; Bimal R. Shah, MD, MBA, is Director for Quality, Duke Heart Center; Randy Heffelfinger, MBA, is Administrator, Medicine Residency Program; Deanna Springall, MA, is Programming Consultant, Duke Health Technology Solutions; Samantha M. Thomas, MB, is Biostatistician, Duke Office of Clinical Research; Aimee Zaas, MD, MHS, is Program Director, Internal Medicine Residency Program, Division of Infectious Diseases; and Jonathan Bae, MD, is Assistant Program Director for Quality Improvement, Duke Hospital Medicine.

Funding: Statistical support for this project was supported by Agency for Healthcare Research and Quality grant number K12 HS 019479.

Conflict of Interest: The authors declare they have no competing interests.

The authors would like to thank the second- and third-year residents in the internal medicine residency program at Duke University for their participation. Ambulatory leadership, including the 3 clinic medical directors and the Ambulatory Care Leadership Track, were also instrumental in development and implementation. The medicine residency program office, including Lauren Dincher and Shawna Alkon, are also recognized for their contributions to this project's success. Lastly, we extend gratitude toward the Department of Medicine leadership, including Dr Mary Klotman and Joe Doty, for supporting the development of the tool.
