Background 

There is an unmet need for formal curricula to deliver practice feedback training to residents.

Objective 

We developed a curriculum to help residents receive and interpret individual practice feedback data and to engage them in quality improvement efforts.

Methods 

We created a framework based on resident attribution, effective metric selection, faculty coaching, peer and site comparisons, and resident-driven goals. The curriculum used electronic health record–generated resident-level data and disease-specific ambulatory didactics to help motivate quality improvement efforts. It was rolled out to 144 internal medicine residents practicing at 1 of 4 primary care clinic sites from July 2016 to June 2017. Resident attitudes and behaviors were tracked with presurveys and postsurveys, completed by 126 (88%) and 85 (59%) residents, respectively. Data log-ins and completion of educational activities were monitored. Group-level performance data were tracked using run charts.

Results 

Survey results demonstrated significant improvements on a 5-point Likert scale in residents' self-reported ability to receive (from a mean of 2.0 to 3.3, P < .001) and to interpret and understand (from a mean of 2.4 to 3.2, P < .001) their practice performance data. Residents were also more likely to report that their practice had seen improvements in patient care (13% versus 35%, P < .001). Run charts demonstrated no change in patient outcome metrics.

Conclusions 

A learner-centered longitudinal curriculum on ambulatory patient panels can help residents develop competency in receiving, interpreting, and effectively applying individualized practice performance data.

What was known and gap

Physicians are expected to review and analyze performance data to execute practice-based improvement for their patients, but few residency programs have published the frameworks used to design and implement curricula addressing practice feedback.

What is new

A curriculum to help residents receive and interpret individual practice feedback data and to engage them in quality improvement efforts.

Limitations

Surveys lacked validity evidence, and the curriculum was implemented in a single residency program, which may limit generalizability.

Bottom line

The curriculum helped residents develop competency in receiving, interpreting, and effectively applying individualized practice performance data.

Physicians are expected to review and analyze performance data to execute practice-based improvement for their patients.1  As national policies such as the Medicare Access and CHIP Reauthorization Act of 2015 and practice recognition programs such as the National Committee for Quality Assurance Patient-Centered Medical Home have created incentives to increase ambulatory care quality across the country, the Accreditation Council for Graduate Medical Education (ACGME) has similarly aligned its objectives.2,3  Within the Practice-Based Learning and Improvement (PBLI) core competency, the ACGME has identified the subcompetency of improving via performance audit to train the next generation of clinicians to deliver high-quality, efficient health care.4 

Most resident feedback studies have focused on inpatient performance metrics; few have utilized ambulatory population health metrics. Interventions that provided residents with practice feedback in conjunction with educational sessions, self-reflection, and involvement in quality improvement have had the most success in improving both process and clinical outcome measures,5–9  while those that provided residents with their data in isolation have been less successful.10,11  However, few programs have published the frameworks they used to design and implement a longitudinal and multimodal curriculum addressing practice feedback.5,9 

Implementing a practice-based improvement curriculum requires accurate, resident-specific performance outcomes for patients. Previously published PBLI efforts have often relied on manual chart review because of difficulties accessing and automatically compiling personalized resident-level data from the electronic health record (EHR).5,12–15  With increasing EHR experience and usability, health care systems have an opportunity to provide more detailed, extensive, and frequent data to physicians.16 

To our knowledge, this is the first described longitudinal residency curriculum to use a structured framework and individualized EHR-level data to guide how residents receive practice feedback. We aimed to design a curriculum that would help residents receive and interpret data on their patient panels, engage them in quality improvement efforts, and prepare them for the practice feedback they will likely receive throughout their careers.

Methods

Setting and Participants

The initial year of the program was conducted with 144 internal medicine residents (both categorical and primary care residents) from July 2016 to June 2017. The continuity clinic sites included 2 hospital-based clinics, a community-based practice, and a Veterans Affairs (VA) clinic.

Study Design

The curriculum incorporated opportunities for residents to engage in the 5 elements of PBLI: responsibility for a panel of patients, auditing that panel based on evidence-based criteria, comparing the audit to benchmarks to explore potential deficiencies (and successes), identifying areas for change, and engaging in a quality improvement intervention.4 

Key curricular design features included the following:

  • Longitudinal feedback provided at multiple points in time

  • A learner-centered approach that includes built-in self-reflection, individual goal setting and quality improvement activities, and individualized faculty coaching

  • Multimodal activities ranging from large group discussions to one-on-one coaching

  • Curriculum complementary to existing outpatient didactic curriculum and clinical practice

All participants were provided study information sheets, and anonymous survey participation was optional. Surveys were developed by the authors without further testing.

Framework

Our framework for designing the curriculum included 5 key elements:

Resident Attribution

Accurate identification of a resident's panel of patients is necessary to create a sense of ownership and responsibility for that panel. To capture as many as possible of the patients our residents were caring for, our only requirement for attribution was that the resident be listed in the primary care physician field in the EHR.
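For illustration only, the sketch below expresses this attribution rule as a filter over a hypothetical EHR extract (the extract layout, field names, and `attribute_panels` helper are ours, not the actual EHR's): a patient belongs to a resident's panel exactly when that resident appears in the primary care physician field.

```python
# Hypothetical sketch of the attribution rule: a patient is on a resident's
# panel when the resident appears in the EHR's primary care physician (PCP)
# field. The extract layout and field names are invented for illustration.
import pandas as pd

def attribute_panels(patients: pd.DataFrame, residents: list) -> pd.DataFrame:
    """Return the rows of `patients` attributed to any resident in `residents`."""
    return patients[patients["pcp_name"].isin(residents)]

# Example extract with one row per patient
patients = pd.DataFrame({
    "patient_id": [101, 102, 103, 104],
    "pcp_name": ["Resident A", "Attending B", "Resident C", "Resident A"],
})

print(attribute_panels(patients, ["Resident A"]))  # patients 101 and 104
```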

Metrics Selection

We chose metrics that (1) residents feel they have the power to impact, (2) have a large enough denominator even in small resident panels, (3) offer the opportunity for disease-based teaching, and (4) align with institutional quality improvement goals to allow residents to coordinate with larger-scale improvement efforts. Our initial metrics were blood pressure control in patients with hypertension and colorectal cancer screening for indicated patients. We utilized the practice feedback intervention suggestions outlined by Brehaut and colleagues17  as guidance for metrics delivery, including highlighting specific goals, providing individual data with comparators, addressing the credibility of the information, and preventing defensive reactions.

Faculty Coaching

Faculty coaching was the backbone of the curriculum. Faculty initially helped residents address the accuracy of their results and understand what the results implied for their practice patterns and behaviors. They later worked one-on-one with residents to identify potential opportunities for change and specific steps for utilizing their clinic team to help optimize care for their panel. Prior to the sessions, all faculty mentors received in-person training on the data delivery system and educational goals. They also received reference materials, residents' completed self-assessments, and examples of individualized coaching.

Peer and Cross-Site Comparisons

Peer comparisons allowed residents to reflect on when outlying performance may be appropriate and when it represents a learning opportunity: for example, when lower rates of colorectal cancer screening reflect a patient panel with more barriers to screening compared with another clinic, or when a resident's outcomes are sharply different from a peer's despite similar clinics and populations. In small group clinic sessions, high-performing residents shared strategies they used in real time. Large group discussions allowed residents to discuss differences among sites and review clinic processes to replicate success. Posters were created for each clinic workroom to increase data visibility, display clinic-level trends, and recognize top performers.

Quality Improvement Focus

Residents used a self-assessment worksheet to reflect on their performance, set personal goals, and identify individual-level and systems-level interventions to help improve their performance. We encouraged residents to use techniques taught in didactics to prioritize potential improvements, including strategic prioritization and the Impact vs. Effort Matrix.18  This served to counteract residents' tendency to focus on individual interventions and encouraged them to consider team- and system-level interventions instead.

The Curriculum

We implemented this curriculum over the course of an academic year and have replicated it in subsequent years. The formal education components are outlined in the table.

Table

Key Curricular Activities With Relevant Education Components Linked to Goals Within the Educational Framework

Analysis

We sent electronic presurveys and postsurveys to residents, asking them to self-report how frequently they engaged in practice feedback and panel management activities and whether they thought reviewing data was useful in improving practice patterns and quality of care. We analyzed survey data using summary statistics, chi-square tests, and paired t tests, as appropriate. In the postsurveys, we also solicited written feedback on the curriculum. As initial educational process outcomes, we tracked how frequently residents logged in to view their data, the percentage of residents who attended educational activities, and self-assessment completion rates. We also tracked time spent by residents, faculty, and coordinators. Group-level performance on the initial patient metrics was readily available for 3 of 4 clinics and was tracked using run charts.
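To make the named tests concrete, here is an illustrative sketch (not the authors' analysis code) of the two kinds of comparisons reported: a chi-square test on pre/post agreement counts and a paired t test on 5-point Likert self-ratings. The Likert scores below are invented; only the 16 of 126 and 30 of 85 agreement counts are taken from the Results.

```python
# Illustrative analyses of presurvey/postsurvey data; see lead-in for
# which numbers are real and which are invented.
import numpy as np
from scipy import stats

# Chi-square test on agree/strongly-agree counts, presurvey vs postsurvey
# (counts from the Results: 16 of 126 pre, 30 of 85 post).
table = np.array([
    [16, 126 - 16],  # presurvey: agreed, did not agree
    [30, 85 - 30],   # postsurvey: agreed, did not agree
])
chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, P = {p:.4f}")

# Paired t test on 5-point Likert self-ratings (hypothetical paired scores).
pre = np.array([2, 2, 1, 3, 2, 2, 3, 1])
post = np.array([3, 4, 3, 3, 3, 4, 4, 2])
t, p = stats.ttest_rel(pre, post)
print(f"paired t = {t:.2f}, P = {p:.4f}")
```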

Our institutional review board determined that this project was a quality improvement effort that did not require full review.

Results

More than 90% of residents participated in each of the outlined curricular activities, and 100% (144 of 144) completed the self-assessment that asked them to access their personal data at least once.

Survey Data

A total of 88% (126 of 144) of residents completed the presurvey, and 59% (85 of 144) completed the postsurvey. Presurveys and postsurveys demonstrated significant improvements on a 5-point Likert scale in residents' self-reported ability to receive (from a mean of 2.0 to 3.3, P < .001) and to interpret and understand (from a mean of 2.4 to 3.2, P < .001) their practice performance data. Residents also reported significant improvement in receiving coaching on how to improve their practice performance (from a mean of 2.4 to 3.2, P < .001).

Self-reported application of these skills in clinical practice also increased. Although residents most often reported "never" for all 3 behaviors in presurveys, figures 1 through 3 show the increased frequency with which residents reported the following: (1) looking up practice performance data (percentage responding "sometimes" or "frequently" increasing from 16% [20 of 126] to 64% [54 of 85], P < .001); (2) using those data to identify opportunities for change (15% [19 of 126] to 60% [51 of 85], P < .001); and (3) adjusting their workflow or clinic processes to help improve practice performance (26% [33 of 126] to 64% [54 of 85], P < .001).

Figure 1

Resident Self-Reported Frequencies of Looking Up Practice Performance Data Within Patient Panels (Presurvey Versus Postsurvey Data)

Figure 2

Resident Self-Reported Frequencies of Using Practice Performance Data to Identify Clinical Behaviors That They Needed to Change (Presurvey Versus Postsurvey Data)

Figure 3

Resident Self-Reported Frequencies of Adjusting Their Work Flows or Clinic Processes to Help Improve Practice Performance (Presurvey Versus Postsurvey Data)

Resident perceptions of the utility and impact of reviewing practice performance data also changed. The proportion of residents who agreed or strongly agreed that reviewing practice performance data is useful for improving practice patterns did not change significantly (72% [91 of 126] to 82% [70 of 85], P = .09). The proportion who agreed or strongly agreed that their practice had seen improvements in patient care from reviewing practice performance data increased from 13% (16 of 126) to 35% (30 of 85; P < .001).

Outcomes

Resident log-ins could be tracked at 3 of our 4 clinic sites (58%, 84 of 144 residents) and increased in parallel with curricular activities throughout the year (figure 4). Group-level performance on the initial 2 metrics was readily available at the same 3 sites. Run charts demonstrated overall stability in colorectal cancer screening rates and hypertension control over the course of the intervention, as well as nonrandom variation in the form of a shift toward higher colorectal cancer screening rates later in the year and in the first few months of postintervention follow-up (figure 5a and b).
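The "shift" referenced above follows standard run chart interpretation. Below is a minimal sketch under the common convention that six or more consecutive points on one side of the median constitute nonrandom variation (the data, function name, and threshold are ours for illustration, not the study's).

```python
# Flag run chart "shifts": runs of >= min_run consecutive points strictly
# above or strictly below the median. Points exactly on the median are
# skipped, per the usual run chart convention. Data are hypothetical.
from statistics import median

def find_shifts(values, min_run=6):
    """Return (start, end) index pairs for qualifying runs."""
    m = median(values)
    shifts, run, side = [], [], 0
    for i, v in enumerate(values):
        s = (v > m) - (v < m)  # +1 above, -1 below, 0 on the median
        if s == 0:
            continue  # on-median points neither extend nor break a run
        if s != side and run:
            if len(run) >= min_run:
                shifts.append((run[0], run[-1]))
            run = []
        side = s
        run.append(i)
    if len(run) >= min_run:
        shifts.append((run[0], run[-1]))
    return shifts

# Hypothetical monthly colorectal cancer screening rates (%)
rates = [52, 51, 53, 52, 50, 54, 56, 57, 57, 58, 59, 58]
print(find_shifts(rates))  # [(0, 5), (6, 11)]: an early low run, then a late high run
```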

Figure 4

Resident Log-Ins to Electronic System Containing Personal Performance Data

a Log-ins were tracked for 3 of our 4 clinic sites in 5-week blocks through the academic year. Log-ins increased with the initial introduction of the data, peaked when there was a required self-assessment to complete, and remained higher than baseline throughout the remainder of the year.

Figure 5a and b

Aggregate Clinic-Level Data

a Data are from 3 of our 4 clinic sites: the percentage of patients with hypertension whose blood pressure was controlled (most recent reading < 140/90) and the percentage of patients up to date on colorectal cancer screening, tracked monthly.

Acceptability

Both resident and faculty acceptability were high, with enthusiasm about the availability of data and tools to help with data interpretation. Residents suggested a variety of additional metrics for which they wanted future feedback and expressed interest in using their data to drive quality improvement projects. Most frustrations centered on technical problems with data accessibility or accuracy of panel identification. Faculty were supportive of the framework and willing to devote curricular time to coach the residents. Residents and faculty thought further faculty training with additional resources and experience could be helpful.

Feasibility

The support of residency leadership and clinic site directors, as well as curricular flexibility in the 4+1 scheduling model, made the curriculum feasible. Relatively little curricular time was used (1 hour of ambulatory didactic lecture time and two 30-minute sessions of preclinic educational time). Residents were expected to do a small amount of practice feedback work (such as completing the self-assessment, which took 15 minutes on average) during their half-day of administrative time. Faculty development included a 45-minute meeting with ambulatory associate program directors and clinic site directors. Each of the 10 faculty clinic champions also had a 20-minute one-on-one session with a chief resident to review curricular goals and resident data. A nurse clinical quality specialist devoted approximately 10 hours to data management and analysis over the 1-year period.

Discussion

Using a structured framework, we implemented a longitudinal curriculum centered on residents' ambulatory patient panels. The curriculum was feasible to add to our residency program without significant additional learner or instructor time and achieved high levels of resident and faculty acceptability. Residents reported significant improvements in their ability to receive, interpret, and understand practice feedback. They logged in to access their data more frequently and had high levels of participation in curricular activities. Patient outcomes for the chosen metrics did not change among our resident patient panels.

To our knowledge, this is the first described longitudinal residency curriculum to use a structured framework and individualized EHR-level data to guide how residents receive practice feedback. Prior studies of resident practice feedback interventions have relied on manual chart review, which can provide meaningful feedback but is more time-intensive and less replicable for a larger number of quality measures over time.5,11–15  This framework was designed specifically to frame messaging for residents around acting on clinically meaningful, valid metrics to help improve the quality of care they deliver and to overcome some of the typical challenges to practice feedback. Such challenges include those generalizable to all physicians (adequate time, data accuracy, and systems support to help physicians use data to effect change) and those unique to residents (small panel sizes, varied clinic settings, and competing educational objectives).

We used strategies highlighted by 2 reviews, which indicated that practice feedback is most effective at improving practice when provided multiple times, combined with other interventions (eg, education, guidelines, reminders), and tied to specific goals and action plans.19,20  Prior studies of resident practice feedback interventions have reached largely similar conclusions: practice feedback data delivered in isolation are less effective at improving quality outcomes10–12  than multifaceted interventions.5–9,21 

Although we modeled these proven strategies, patient outcomes remained largely stable, consistent with prior published research demonstrating inconsistent effects of practice feedback on outcomes.19  However, run charts did demonstrate a nonrandom trend toward improved outcomes when the first few months of postintervention follow-up were included. Prior resident studies with improved clinical outcomes have largely seen those improvements over the course of 2 or more years.15,21  More time is likely needed to determine whether improved educational outcomes also translate into improved patient outcomes.

The curriculum was implemented in a single residency program and may not be generalizable, although it was successful within a large academic program with multiple ambulatory clinics and 2 EHR systems. The surveys lacked validity evidence; thus, respondents may have interpreted questions differently than intended. We also did not have long-term data on how residents view practice performance over the course of their residency or, more importantly, their careers.

In the future we hope to combine the practice feedback framework with an ambulatory quality improvement curriculum to help motivate data-driven individual and group efforts to improve patient outcomes. Further research is needed to see if similar success can be obtained in other programs, including those in different specialties that also have ambulatory patient panels. Most importantly, long-term research is needed to see if these efforts successfully prepare residents to receive and effectively use practice feedback data throughout their careers.

Conclusions

A longitudinal practice feedback curriculum that used EHR-generated provider-level data complemented an ambulatory didactic curriculum to help residents develop PBLI competencies and identify both individual and large-scale opportunities for quality improvement.

References

1. American Board of Medical Specialties. Board certification. A trusted credential. Based on core competencies. Accessed 2019.

3. National Committee for Quality Assurance. Patient-centered medical home (PCMH). Accessed 2019.

4. Accreditation Council for Graduate Medical Education; American Board of Internal Medicine. The Internal Medicine Milestone Project. July 2015. Accessed 2019.

5. Holmboe ES, Prine L, Green M. Teaching and improving quality of care in a primary care internal medicine residency clinic. Acad Med. 2005;80(6):571-577.

6. Thomas KG, Thomas MR, Stroebel RJ, McDonald FS, Hanson GJ, Naessens JM, et al. Use of a registry-generated audit, feedback, and patient reminder intervention in an internal medicine resident clinic—a randomized trial. J Gen Intern Med. 2007;22(12):1740-1744.

7. Boggan JC, Swaminathan A, Thomas S, Simel DL, Zaad AK, Bae JG. Communication of results in ambulatory clinics utilizing a web-based audit and feedback module. J Grad Med Educ. 2017;9(2):195-200.

8. Rand CM, Schaffer SJ, Dhepyasuwan N, Blumkin A, Albertin C, Serwint JR, et al. Provider communication, prompts, and feedback to improve HPV vaccination rates in resident clinics. Pediatrics. 2018;141(4):e20170498.

9. Flattery D, Berry L, Subramanian I, Gregory B, Nelson N, Wofsy J. Engaging residents in population-based care through a panel management curriculum. Acad Intern Med Insight. 2016;14(1):7-9.

10. Simon SR, Soumerai SB. Failure of internet-based audit and feedback to improve quality of care delivered by primary care residents. Int J Qual Health Care. 2005;17(5):427-431.

11. Kogan JR, Reynolds EE, Shea JA. Effectiveness of report cards based on chart audits of residents' adherence to practice guidelines on practice performance: a randomized controlled trial. Teach Learn Med. 2003;15(1):25-30.

12. Holmboe E, Scranton R, Sumption K, Hawkins R. Effect of medical record audit and feedback on residents' compliance with preventive health care guidelines. Acad Med. 1998;73(8):901-903.

13. Carney AP, Eiff MP, Green LA, Carraccio C, Smith DG, Pugno PA, et al. Transforming primary care residency training: a collaborative faculty development initiative among family medicine, internal medicine, and pediatric residencies. Acad Med. 2015;90(8):1054-1060.

14. Kern DE, Harris WL, Boekeloo BO, Barker LR, Hogeland P. Use of an outpatient medical record audit to achieve educational objectives: changes in residents' performances over six years. J Gen Intern Med. 1990;5(3):218-224.

15. Hildebrand C, Trowbridge E, Roach MA, Sullivan AG, Broman AT, Vogelman B. Resident self-assessment and self-reflection: University of Wisconsin-Madison's five-year study. J Gen Intern Med. 2009;24(3):361-365.

16. Clarke R, Hackbarth AS, Saigal C, Skootsky SA. Building the infrastructure for value at UCLA: engaging clinicians and developing patient-centric measurement. Acad Med. 2015;90(10):1368-1372.

17. Brehaut JC, Colquhoun HL, Eva KW, Carroll K, Sales A, Michie S, et al. Practice feedback interventions: 15 suggestions for optimizing effectiveness. Ann Intern Med. 2016;164(6):435-441.

18. American Society for Quality. Quality in Healthcare: Impact Effort Matrix. Accessed 2019.

19. Ivers N, Jamtvedt G, Flottorp S, Young JM, Odgaard-Jensen J, French SD, et al. Audit and feedback: effects on professional practice and patient outcomes. Cochrane Database Syst Rev. 2012;(6):CD000259.

20. Veloski J, Boex JR, Grasberger MJ, Evans A, Wolfson DB. Systematic review of the literature on assessment, feedback and physicians' clinical performance: BEME Guide No. 7. Med Teach. 2006;28(2):117-128.

21. Houston TK, Wall T, Allison JJ, Palonen K, Willett LL, Kiefe CI, et al. Implementing achievable benchmarks in preventive health: a controlled trial in residency education. Acad Med. 2006;81(7):608-616.

Author notes

Funding: The authors report no external funding source for this study.

Competing Interests

Conflict of interest: Dr Gupta is the Director of Evaluation and Outreach at Costs of Care Inc.

The authors would like to thank Drs Jodi Friedman, Lisa Skinner, Christina Harris, Mina Ma, Peter LeFevre, Anna Chirra, and Allison Diamant for their support, suggestions, and time in implementing this curriculum. The authors would also like to thank Meghan Nechrebecki, Sean Furlong, and Vilay Khandewal for their instrumental assistance in data acquisition and troubleshooting.
