Background

The Accreditation Council for Graduate Medical Education Common Program Requirements require residents to participate in real or simulated interprofessional patient safety activities. Root cause analysis (RCA) is widely used to respond to patient safety events; however, residents may lack knowledge about the process.

Objective

To improve clinicians' knowledge of the tools used to conduct an RCA and the science behind them, and to describe the course, its outcomes, and its feasibility.

Methods

A flipped classroom approach was used. Participants completed 5 hours of pre-course work and then attended an 8.5-hour program of didactic sessions and small group, facilitator-led RCA simulations. Pre-course, post-course, and 10-month follow-up surveys assessing knowledge of and comfort with the RCA process were compared. Statistical significance was evaluated for matched pairs using a repeated measures analysis of variance.

Results

Of 162 participants trained, 59 were residents/fellows from 23 graduate medical education programs. Response rates were 96.9% (157 of 162) for the pre-course survey, 92.6% (150 of 162) for the post-course survey, and 81.5% (132 of 162) for the 10-month follow-up survey. Most participants had never participated in an RCA (57%, 89 of 157) and had no prior training (87%, 136 of 157). Following the course, participants reported improved confidence in their ability to interview and participate in an RCA (P<.001, 95% CI 4.4-4.6). This improvement persisted 10 months later (P<.001, 95% CI 4.2-4.4), most prominently among residents/fellows, who had the highest rate (38.9%, 23 of 59) of participation in real-world RCAs following the training.

Conclusions

The course led to a sustained improvement in confidence participating in RCAs, especially among residents and fellows.

Objectives

The goal of the project was to improve health care workers' knowledge of and comfort with participating in the root cause analysis (RCA) process through development of a 1-day simulation-based RCA course.

Findings

Participants' self-reported knowledge of and comfort with the RCA process improved following attendance at the course with the greatest improvement reported by residents and fellows.

Limitations

While participants' self-reported knowledge and confidence improved following course attendance, no objective evidence of competence in RCA participation was collected.

Bottom Line

Implementation of a single day, simulation-based, interprofessional RCA course is a feasible means to improve self-reported knowledge of and comfort with the RCA process, especially among residents and fellows.

Patient safety events resulting from medical errors interfere with the delivery of safe and high-quality health care, negatively affect physician well-being, and represent the third leading cause of death in the United States.1-3 Historically, few graduate medical education (GME) programs appear to have incorporated formal training on medical errors into their curricula. Root cause analysis (RCA), recognized as a powerful tool for achieving safer health care, has been increasingly used as an educational method for residents.4

In 2017, the Accreditation Council for Graduate Medical Education updated their Common Program Requirements to include that “residents must participate as team members in real and/or simulated interprofessional clinical patient safety activities, such as root cause analyses.”5 However, participating effectively in an RCA may require intentional training or prior experience. Most facilities lack standardized RCA processes and have insufficient opportunities for all residents to participate.6

Patient safety literature suggests that simulation is useful for teaching complex skills, including communicating effectively within multidisciplinary care teams,6 and for improving participants' familiarity with the RCA process.7 Some facilities have developed mock RCAs to train residents, leading to increased confidence in RCA participation.7 However, these mock RCAs have been small in scale and have involved only a single department.

Our institution developed an RCA training program using didactic lectures and small group activities to replicate an RCA using a real sentinel event case. The framework was built on RCA2 (RCA Squared) guidelines8 published by the National Patient Safety Foundation in 2016 and RCA Tools from the VHA National Center for Patient Safety.9 Our objective was to use simulation to improve RCA knowledge and confidence for a diverse interprofessional group of participants, while simultaneously increasing the pool of trained personnel, particularly within GME, to participate in RCAs. We describe the development and implementation of our course, RCA W3 (What happened? What should have happened? What are you going to do to fix it?), and discuss the outcomes and feasibility to encourage adoption of similar programs elsewhere.

Eligible participants for this voluntary course during academic years 2018-2019 (AY18) and 2019-2020 (AY19) included all residents and fellows, as well as GME faculty and non-physicians interested in patient safety. Participants were from all departments at our institution, an urban 425-bed military health care facility with 36 total GME and graduate allied health programs. The course was advertised across our health care market and during Graduate Medical Education Council meetings to encourage programs to support resident attendance. The course utilized a flipped classroom approach.

Participants attended an 8.5-hour in-person course, for which continuing medical education credit was available. Content taught by patient safety leaders included didactic sessions on fundamentals of patient safety science, RCA principles, fact gathering, identifying causal factors, and developing corrective action plans. Didactic sessions were followed by facilitator-led small groups (see online supplementary data). There were 12 (AY18) and 17 (AY19) small groups of 5 to 6 members with 1 to 2 facilitators per group. A real sentinel event from the institution was used to ensure high fidelity in the simulated environment. Course participants practiced interviewing with 10 volunteer “actors” portraying the roles of the personnel involved in the sentinel event. Each had access to the redacted medical record and personal statements of the individuals interviewed during the real-world sentinel event. Volunteer facilitators included trained patient safety specialists and clinical staff with experience leading an RCA. All facilitators attended a 1-hour train-the-trainer session prior to the course.

Each small group simulation experience was followed by a large group session in which findings were presented and discussed. Redacted RCA components from the case were released sequentially to allow participants to compare their ideas to those generated during the actual RCA. The large group sessions were facilitated by the course director (R.I.M.) per the course agenda (online supplementary data).

Prior to the course, participants completed an 11-item pre-course survey, which collected data on prior participation in or education about the RCA process (online supplementary data). All surveys used for the course were developed by the authors, and no validity evidence was collected prior to their use. Four questions used a Likert scale to explore participants' comfort interviewing for an RCA and confidence in participating in or leading an RCA. After initial survey submission, participants completed 5 hours of pre-course content focusing on foundational knowledge to maximize meaningful engagement (online supplementary data).

Following pre-course work completion, a 27-item pre-course assessment evaluated knowledge of topics covered in the pre-course material. The pre-course assignments, evaluations, and redacted charts were released iteratively, requiring participants to finish one task before gaining access to the next.

Immediately following the course, and again 9 to 10 months later, participants completed a post-course survey and a follow-up survey examining confidence with the RCA process (online supplementary data). All surveys contained questions 7 to 10 of the follow-up survey (confidence participating in an RCA, leading an RCA, interviewing people involved in sentinel events, and confidence that RCAs lead to patient safety improvements). Results of these surveys were reported as percentages and as means (confidence intervals) based on a standard 1 to 5 Likert scale (Table 1). Likert scale data were analyzed with a repeated measures analysis of variance, first comparing the initial 3 surveys collected (pre-course survey, pre-course assessment, and post-course survey) and then comparing those results with the 10-month follow-up survey. Additionally, the authors tracked subsequent participation of course attendees in real-world RCAs by reviewing RCA attendance records, maintained by hospital patient safety, at 6-month intervals following course implementation; this is an ongoing practice at our institution to determine the level of GME participation in patient safety activities. Surveys were administered with SurveyMonkey; statistical analyses were performed using JMP 13.2 (SAS Institute Inc., Cary, North Carolina).
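As an illustration only, the following minimal sketch shows how a matched-pairs repeated measures analysis of variance on the Likert scale data could be reproduced in Python with pandas and statsmodels; the study itself used JMP 13.2, and the file and column names (confidence_long.csv, participant_id, survey, confidence) are hypothetical.

    # Minimal sketch only; the study used JMP 13.2. File and column names are hypothetical.
    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    # Long-format data: one row per participant per survey time point
    # (pre-course survey, pre-course assessment, post-course survey, follow-up),
    # with the 1-5 Likert confidence rating as the dependent variable.
    df = pd.read_csv("confidence_long.csv")  # columns: participant_id, survey, confidence

    # Keep only participants who responded at every time point (matched pairs),
    # because AnovaRM requires fully balanced within-subject data.
    n_timepoints = df["survey"].nunique()
    matched = df.groupby("participant_id").filter(
        lambda g: g["survey"].nunique() == n_timepoints
    )

    # Repeated measures ANOVA across the survey time points.
    result = AnovaRM(
        matched, depvar="confidence", subject="participant_id", within=["survey"]
    ).fit()
    print(result.anova_table)  # F value, degrees of freedom, and P value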

Institutional Review Board screening determined the study to be exempt as non-research.

In AY18 and AY19, 162 course participants were trained, including 59 (36.4%) residents and fellows from 23 different GME programs. Response rates were 96.9% (157 of 162) for the pre-course survey, 92.6% (150 of 162) for the pre-course assessment, 97.5% (158 of 162) for the post-course survey, and 81.5% (132 of 162) for the follow-up survey (Table 1). The pre-course survey was used to establish participants' baseline experience with and formal education about RCAs. It demonstrated that most course attendees had never participated in an RCA and had no prior training in conducting one. Additionally, the majority had no prior exposure to basic tools used in the RCA process, including RCA2: Improving Root Cause Analyses and Actions to Prevent Harm, the VA Corrective Action Hierarchy, and Safety Assessment Codes (Table 1).

A comparison of the 4 questions (confidence participating in an RCA, confidence that an institutional RCA will lead to improved patient safety, comfort interviewing staff members involved in a sentinel event, and confidence leading an RCA) showed statistically significant improvements in all but confidence that institutional RCAs lead to improved patient safety when the pre-course survey was compared with the post-course and follow-up surveys (Table 2). The follow-up survey demonstrated a slight decline in confidence compared with immediately after the course; however, confidence remained higher than before the in-person course (statistically significant compared with the pre-course survey). Changes in confidence that RCAs lead to improved patient safety were not significant; respondents' ratings for this question remained essentially constant across surveys. The greatest improvements in confidence in participating in, interviewing for, and leading an RCA were reported by residents, followed by faculty, across all categories, and these gains were sustained on the follow-up survey. Nurses and “other” personnel reported a statistically significant improvement only in confidence leading an RCA, immediately following the pre-course assessment and at the follow-up survey.

RCA participation data collected from RCA attendance records demonstrated that 90% (9 of 10) of institutional RCAs had resident/fellow team members in AY19 and 92.3% (12 of 13) in AY20, compared with 63% (5 of 8) in the year preceding course implementation. In the 3 years following course implementation, 45 residents/fellows participated in institutional RCAs, of whom 32 were course graduates. As of April 2022, 44.0% (26 of 59) of residents/fellows who attended the course had participated in institutional RCAs, a higher proportion than any other professional group of attendees.

The RCA W3 course is a feasible model for implementation at other institutions. It requires minimal funding but does demand significant person-hours: identifying an appropriate case for RCA simulation, redacting medical records, recruiting and training course facilitators (more than 30 were needed to maintain small group ratios of 1 to 2 facilitators per 5 to 6 participants), and meeting the logistical demands of finding appropriate space. Subsequent iterations of the course required substantially fewer person-hours because the redacted case materials were already complete and the facilitator pool was established. Prior course participants who had subsequent institutional RCA experience were also recruited as facilitators.

A total of 162 interprofessional participants from 48 departments attended the RCA W3 courses. Participants improved and retained confidence in their ability to conduct an RCA. The course helped create a pool of RCA-trained residents/fellows for institutional RCAs. The resident/fellow group had the greatest improvement in confidence in participating in, interviewing for, and leading an RCA, which may be secondary to the higher rate of RCA participation in this group. The pool of residents/fellows and faculty with additional RCA training now helps ensure GME participation in future RCAs.

The course design is informed by experiential learning theory as proposed by Kolb and Kolb.10 By using small group simulation, discussion, and feedback, the course provides concrete experience followed by reflective discussion that promotes application of the new perspectives and ideas gained. These elements are critical when considering the 4 principles of andragogy (adult education) from Knowles et al.11 Additionally, while prior studies have examined the use of simulation to teach RCA competencies to trainees, those interventions were not interprofessional.12,13 The use of multidisciplinary small groups and the selection of a sentinel event case promote interprofessional collaboration, encourage emotional/experiential learning, and create an environment of higher fidelity than resident-only simulated interventions.14,15 This approach also capitalizes on the capacity of simulation for effectively practicing communication within a multidisciplinary team.6

Previous studies demonstrate that resident practice patterns reflect those of their training institution and persist years later, suggesting that professional development is highly influenced by exposures during GME training.16,17 This is of particular significance in the military health system, where we retain 100% of our GME graduates.

Compared with other RCA training platforms, this course was developed as a 1-day model to facilitate resident and faculty participation with minimal competition from occupational and program demands. Following course completion, we learned that participation in institutional RCAs was often challenging for residents because of the short lead time; this was mitigated by developing a computer-based “RCA On-Call Calendar” so participants could identify their availability in advance. There was a nonsignificant increase in the percentage of RCAs with resident/fellow participation following the course.

The study has several limitations. While the residents and fellows who participated had the greatest retention of confidence regarding RCA participation, it is unknown whether this increase was due to course participation or to their higher rate of institutional RCA participation. Data collection focused on participants' self-perceived knowledge and confidence (a relatively weak outcome measure) and did not include objective assessment of competence in performing an RCA or of retention of RCA knowledge at follow-up. Finally, the response rate for the follow-up survey was 81.5%, lower than for the other 3 surveys.

Despite these limitations, this initiative benefits from its potential for expansion in the number of courses and participants and for export to other facilities. Course development and material assembly have been accomplished, so the work involved in future iterations has decreased substantially. We hope to explore virtual formats and to increase course frequency, as the course grew notably in popularity from one year to the next and received positive feedback.

This interprofessional curriculum, designed to train participants to conduct an RCA in response to a patient safety event, demonstrated a sustained increase in confidence in and participation in institutional RCAs, especially among residents.

References

1. Kohn LT, Corrigan J, Donaldson MS. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press; 2000.

2. Makary MA, Daniel M. Medical error—the third leading cause of death in the US. BMJ. 2016;353:i2139.

3. Waterman AD, Garbutt J, Hazel E, et al. The emotional impact of medical errors on practicing physicians in the United States and Canada. Jt Comm J Qual Patient Saf. 2007;33(8):467-476.

4. Cady RF. Becoming a high reliability organization-operational advice for hospital leaders report. JONAS Healthc Law Ethics Regul. 2008;10(2):33.

5. Accreditation Council for Graduate Medical Education. Common Core Requirements. Accessed November 20, 2018.

6. Beaubien JM, Baker DP. The use of simulation for training teamwork skills in healthcare: how low can you go? BMJ Qual Saf. 2004;13:51-56.

7. Murphy M, Duff J, Whitney J, Canales B, Markham MJ, Close J. Implementation of a mock root cause analysis to provide simulated patient safety training. BMJ Open Qual. 2017;6:e000096.

8. National Patient Safety Foundation. RCA2: Improving Root Cause Analyses and Actions to Prevent Harm. Accessed April 1, 2022.

9. U.S. Department of Veterans Affairs, VHA National Center for Patient Safety. Root Cause Analysis. Accessed April 1, 2022.

10. Kolb AY, Kolb DA. Learning styles and learning spaces: enhancing experiential learning in higher education. Acad Manag Learn Educ. 2005;4(2):193-212.

11. Knowles M, Holton E, Swanson R. The Adult Learner. 5th ed. Woburn, MA: Butterworth-Heinemann; 1998.

12. Simms E, Slakey D, Garstka M, Tersigni S, Korndorffer J. Can simulation improve the traditional method of root cause analysis: a preliminary investigation. Surgery. 2012;152(3):489-497.

13. Quraishi S, Kimatian S, Murray WB, Sinz E. High-fidelity simulation as an experiential model for teaching root cause analysis. J Grad Med Educ. 2011;3(4):529-534.

14. Rudolph J, Raemer D, Simon R. Establishing a safe container for learning in simulation: the role of the pre-simulation briefing. Simul Healthc. 2014;9(6):339-349.

15. Guraya S, Barr H. The effectiveness of interprofessional education in healthcare: a systematic review and meta-analysis. Kaohsiung J Med Sci. 2018;34(3):160-165.

16. Chen C, Petterson S, Phillips R, Bazemore A, Mullan F. Spending patterns in region of residency training and subsequent expenditures for care provided by practicing physicians for Medicare beneficiaries. JAMA. 2014;312(22):2385-2393.

17. Asch DA, Nicholson S, Srinivas S, Herrin J, Epstein A. Evaluating obstetrical residency programs using patient outcomes. JAMA. 2009;302(12):1277-1283.

Author notes

Editor's Note: The online version of this article contains course content and learning objectives, course agenda, and the pre-course and follow-up surveys used in the study.

Funding: The authors report no external funding source for this study.

Competing Interests

Conflict of interest: The authors declare they have no competing interests.

Disclaimer: The view(s) expressed herein are those of the author(s) and do not reflect the official policy or position of Brooke Army Medical Center, the US Army Medical Department, the US Army Office of the Surgeon General, the Department of the Army, the Department of the Air Force and Department of Defense, or the US Government.

An earlier version of this project was previously presented as a poster at the Accreditation Council for Graduate Medical Education Annual Educational Conference, Orlando, Florida, March 7-10, 2019.