Communication errors have been identified as a contributing factor to preventable patient adverse events.18 Structured patient handoffs have been proposed as a method to reduce both communication errors and adverse events during transitions of care between health care providers. Systematic reviews have noted heterogeneity among studies of handoff methods and have called for higher-quality studies examining the individual, systems, and other factors associated with improvement in patient outcomes.1,2

Two recent studies have examined the impact of the I-PASS (illness severity, patient summary, action list, situation awareness and contingency plans, and synthesis by receiver) handoff bundle on adverse events in academic pediatric inpatient populations. A study at a single teaching institution showed that implementation of the bundle was associated with a decrease in medical errors from 33.8 to 18.3 per 100 admissions and a decrease in preventable adverse events from 3.3 to 1.5 per 100 admissions.3 A follow-up implementation study across 9 teaching hospitals found that the medical error rate decreased from 24.5 to 18.8 per 100 admissions and that the rate of preventable adverse events decreased from 4.7 to 3.3 events per 100 admissions; however, significant changes occurred at only 6 of the 9 intervention sites.4

Qualitative research methods are helpful in elucidating factors associated with successful (or unsuccessful) implementation endeavors and in identifying characteristics that enable sustainability and spread of improvement efforts. A study in this issue of the Journal of Graduate Medical Education by Coffey and colleagues5 describes the experiences of residents at 8 institutions that implemented the I-PASS bundle. While many residents (junior residents in particular) were supportive of I-PASS, the study identified several implementation challenges. Components of the bundle were differentially implemented. Fidelity to the bundle was higher when residents were observed by faculty, when patients were more complex, during evening handoffs, and in unfamiliar teams.

The authors used normalization process theory6 to frame resident experiences with I-PASS implementation. Diffusion of innovation frameworks can offer similar insight into the implementation experience. Rogers7 has described 5 characteristics of an innovation that facilitate sustainability and spread: trialability (the ease of testing a change before full implementation); relative advantage of a change compared with current methods in use; compatibility with the current system; observability of the potential impact of a change before general adoption; and simplicity. Evidence that a proposed change is effective supports successful uptake and spread,8 and is related to relative advantage and observability (when evidence is shared during implementation of a change). In the book Edgeware: Insights From Complexity Science for Health Care Leaders,9 Zimmerman and colleagues noted the importance of adaptability: identifying the few key facets of an intervention that must remain intact across settings for the intervention to work, while allowing those in different settings to modify other factors ("around the edges") to adapt the innovation to their setting. Sponsorship from organizational leaders10 is also important, as organizational leaders supply the incentives and resources necessary for initiating and sustaining change. Without support from key front-line opinion leaders, however, relying solely on "top down" mandates for change may lead individuals to act in "compliance mode" without being committed to a change over the longer term. According to Rogers7 (along with other diffusion and complexity science theorists, and as popularized by Malcolm Gladwell in The Tipping Point11), engagement in the change process increases when peers who are dissatisfied with the status quo, and who may be "attracted" to doing things differently, are asked to make the change. These "early adopters" help bring other individuals along in change implementation and dissemination. Eventually, a change gains enough support and momentum ("the tipping point") that it is rapidly adopted by most others in the system.

One can imagine that many of the implementation issues identified by Coffey et al5 would be germane to implementation of I-PASS or other handoff bundles by practicing physicians in non–graduate medical education settings. What might these other frameworks add to the conclusions offered by Coffey et al,5 either for future implementation in a graduate medical education or a practice context? Delays in implementing I-PASS at some of the sites might have been eased by observability (teams from one setting viewing implementation at other settings) and by promoting more resident involvement at the outset as early adopter champions. Early adopter opinion leaders might have helped ease resistance to, or resentment of, the I-PASS bundle by some senior residents, many of whom did not see the relative advantage of I-PASS compared with their usual methods of patient handoffs. Involvement of early adopter opinion leaders should also be considered to help lessen potential skepticism among practicing physicians who might be asked to use the I-PASS bundle. In addition, given the difficulties of gestalt self-assessment,12 it is not clear whether senior residents were objectively performing handoffs as well as they thought. More complete adherence to I-PASS during faculty observation leads one to wonder whether residents were acting in "compliance mode" rather than being committed to long-term implementation, especially given comments that some residents felt completeness was overemphasized compared with conciseness.

Auto-importation of data into the I-PASS tool facilitated simplicity, but printing the longer document, with its multiple mandatory subfields and daily manual updates, was viewed as adding complexity to the implementation. This was not entirely compatible with current workflows, and the bundle was deprioritized when residents were presented with competing demands.

Strict adherence to the full I-PASS bundle might not be needed in all circumstances. A recent study in surgical settings found no difference in outcomes between a rigorous formal handoff approach and a shorter, more focused handoff process.13 Could "synthesis by receiver" be eliminated in some circumstances (as was often done) without adversely affecting patient outcomes? Allowing teams to determine the optimal circumstances for use of the full I-PASS tool (for more complex or unfamiliar patients, junior learners, and unfamiliar teams) speaks to trialability, relative advantage, and adaptability, as long as real-time feedback and evidence (observability) about the differential effects and outcomes of tailored compared with full implementation of the tool are gathered and presented to the teams on a regular basis. Such involvement could lead to improved compatibility of the tool with the practice environment and foster commitment that could facilitate longer-term sustainability of I-PASS use.

Where is the "sweet spot" between adherence to the bundle and achieving optimal reduction in adverse events? Coffey and colleagues5 offer a valuable "look under the hood" of I-PASS bundle implementation. Future, larger, mixed methods studies stratifying patients by complexity could help determine whether the added value of the full tool is greater for more complex patients. Additional studies could examine the value of I-PASS for different specialties, with practicing physicians, and in different practice contexts, settings, and circumstances. Such studies will help further elucidate characteristics that facilitate adoption and help determine, in the realist evaluation framework, "what works for whom under what circumstances."14

1. Riesenberg LA, Leitzsch J, Massucci JL, et al. Residents' and attending physicians' handoffs: a systematic review of the literature. Acad Med. 2009;84(12):1775-1787.
2. Foster S, Manser T. The effects of patient handoff characteristics on subsequent care: a systematic review and areas for future research. Acad Med. 2012;87(8):1105-1124.
3. Starmer AJ, Sectish TC, Simon DW, et al. Rates of medical errors and preventable adverse events among hospitalized children following implementation of a resident handoff bundle. JAMA. 2013;310(21):2262-2270.
4. Starmer AJ, Spector ND, Srivastava R, et al. Changes in medical errors after implementation of a handoff program. N Engl J Med. 2014;371(19):1803-1812.
5. Coffey M, Thomson K, Li SA, et al. Resident experiences with implementation of the I-PASS handoff bundle. J Grad Med Educ. 2017;9(3):313-320.
6. May CR, Mair F, Finch T, et al. Development of a theory of implementation and integration: normalization process theory. Implement Sci. 2009;4(29):1-9.
7. Rogers EM. Diffusion of Innovations. 4th ed. New York, NY: Free Press; 1995.
8. Plsek P. Spreading good ideas for better health care: a practical tool kit. Tools, perspectives, and information for health care leaders. 2000 Research Series, Volume 2. Irving, TX: VHA Inc; 2000.
9. Zimmerman B, Lindberg C, Plsek P. Edgeware: Insights From Complexity Science for Health Care Leaders. Irving, TX: VHA Inc; 1998.
10. Marsh B, Price D. Innovation in Kaiser Permanente Colorado region: where we've been, where we are going. Perm J. 2005;9(4):40-43.
11. Gladwell M. The Tipping Point: How Little Things Can Make a Big Difference. New York, NY: Little, Brown, and Company; 2000.
12. Davis DA, Mazmanian PE, Fordis M, et al. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296(9):1094-1102.
13. Clanton J, Gardner A, Subichin M, et al. Patient Hand-Off iNitiation and Evaluation (PHONE) study: a randomized trial of patient handoff methods. Am J Surg. 2017;213(2):299-306.
14. Pawson R, Tilley N. Realistic Evaluation. London, UK: SAGE Publications Ltd; 1997.