Previous research has found simulation with debriefing to be helpful in developing self-confidence, improving clinical competence, identifying knowledge deficits, and implementing knowledge into practice in the short term. However, the long-term implications of simulation curation and participation are unknown.
The purpose of this study was to evaluate the long-term effect of large-scale simulation curation and participation as part of an advanced-practice athletic training course.
Qualitative phenomenological approach.
From among 60 potential participants, 11 individuals completed a long-term follow-up interview exploring their recollections, perceptions, and subsequent experiences from curating and participating in a large-scale simulation.
Deidentified transcripts were checked for accuracy and sent for member checking. Subsequently, a 3-person data analysis team used several sequenced rounds of review, aligned with consensual qualitative research, to analyze the data. Trustworthiness was established through member checking, multianalyst triangulation, and auditing.
Three domains emerged from the data: emotional reaction, improvements to practice, and the value of debriefing. In describing their emotional reaction, learners focused on the realism of the encounter, the sense of overload, and the need to maintain composure. Within their clinical practice, improvements were made primarily in mindset, teaching, collaboration, emergency planning, and triage. Finally, learners described debriefing as humbling, a time to recognize deficiencies, and a time for emotional decompression.
Simulation-based learning in advanced-practice clinicians leads to perceived gains in intra- and interpersonal skills and emotional readiness. Learners could translate these skills into clinical practice even 2 to 3 years after the experience, demonstrating a long-term effect of simulation. Finally, debriefing is a critical component of both the learner's skill acquisition and translation of knowledge in all simulation-based experiences.
Two to 3 years after engaging in a large-scale simulation experience, advanced-practice clinicians still recalled their emotional reactions, identified areas of improvement in their practice, and spoke to the value of debriefing as part of the experience.
Large-scale simulation can stimulate real feelings including feeling overloaded and a need to maintain composure. Although in some disciplines creating this intense environment may not be warranted, a profession like athletic training, which requires competence in immediate and emergency care, needs high-intensity emergency simulations to replicate the potential experience for learners.
Participants described long-term improvements in their practice 2 to 3 years after the large-scale simulation, suggesting there is a lasting effect well beyond the immediate experience.
Debriefing is a critical component to simulation, and participants appreciated the opportunity to emotionally decompress and learn from their mistakes alongside classmates.
Participants were able to learn humility from being able to discuss their errors and plan for improvements through the debriefing process.
Throughout the years, educational theorists have contextualized learning around the creation of real-life scenarios and their application to drive the translation of knowledge and skills. Experiential learning theory, first described by Kolb in 1984,1 emphasizes transformation as a result of an experience. Within this theory, knowledge is not only gained through the transformative experience but also contextualized through reflection.1 In 1910, John Dewey first described reflective learning theory, the process of recalling the learning experience and posing questions that explore why the outcome resulted as it did and whether other actions would have yielded a different outcome.2 Reflection is a critical stage of experiential learning and allows learners to create new meaning and test new hypotheses in future similar situations. In health care education, experiential learning theory has been applied across the professions through real-life clinical education as well as through simulation.3–6 Most of the previous literature in health care education has focused on the short-term effects of experiential learning, yet health care professions rely heavily on the theoretical benefits of experiential learning to develop high-quality providers in the long term.
The use of simulation, a learning experience that mimics real life, has continued to grow in popularity in health care over the last 100 years.7 Previous research on the short-term effects of simulation has demonstrated positive outcomes and growth in learners, including knowledge, skills, self-confidence, anxiety and comfort level, and cooperation with and among health care students.7 Additionally, research has supported the concept that, in the short term, those in the role of learner and those in the role of observer in a simulation benefit equally.8 Both learners and observers have also demonstrated skill transfer into clinical education and work settings.8,9
The long-term effects of simulation are largely unknown, but 2 recent qualitative studies in athletic training, one at the professional level and another at the advanced, postcredential level, have identified benefits. Professional-level learners with 1 to 6 years of practice experience after completing their educational program described the personal and professional growth and development gained as a result of having multiple standardized patient encounters, a form of simulation in which an actor is trained to portray a patient scenario.10 Specifically, these participants described personal development in communication, patient rapport, and self-reflection.10 Additionally, they indicated that the standardized patient encounters provided them with realistic patient experiences, thereby aiding in transition to practice and enabling them to practice making clinical decisions and engage in patient-centered care.10 In a study of advanced-practice athletic trainers, learners reported translating new knowledge and ways of thinking into practice as well as enhancing self-reflective practice 60 days after a standardized patient encounter.11
What is known about the effects of simulation curation is also somewhat limited. In occupational therapy education, co-constructed simulations with instructors serving as facilitators yielded insights into self-regulated learning as a result of serving multiple roles, interacting with the instructor and peers, and multistage development.12 A previous study13 described the educational technique of student-curated, large-scale simulation in advanced athletic training practice and hypothesized its association with experiential and reflective learning theories but did not characterize its effect. Therefore, the purpose of this study was to evaluate the long-term effect of large-scale simulation curation and participation as part of an advanced-practice athletic training course.
We used a phenomenological approach to evaluate the lived experiences of advanced-practice learners who engaged in and curated a large-scale simulation, and, as is best practice, we used the Standards for Reporting Qualitative Research to guide the methodological approach to the research design.14 The study was approved by the Indiana State University Institutional Review Board and included audio-recorded semistructured interviews.
Participants were included in the study if they had graduated from the Indiana State University Doctor of Athletic Training Program and had completed the simulation experience at least 1 year before the scheduled interviews. All learners had completed the simulation curation and participation as part of the mandatory curricula for their education. The curation of the simulation learning experience as part of the program has been described in detail as a 3-part modular series13 leading up to the large-scale simulation (Figure 1). In total, 3 cohorts of students (N = 60), all with at least 2 years since the large-scale simulation experience, were eligible to participate in this study. Graduates were contacted via email (1 initial email, 2 follow-up emails in Fall 2019) to determine their interest in participating. Nine participants volunteered in the fall recruitment period, and we sought additional volunteers in Spring 2020 to ensure data saturation had occurred. A total of 11 people participated in this study (Table 1).
Members of the research team created the interview script and engaged 3 expert reviewers, with expertise in simulation curation (n = 2) and qualitative research (n = 1), to evaluate it. Minor editorial and question-sequencing changes were made as a result of the external review. One member of the research team practiced the interview 3 times before conducting all the interviews. The final semistructured interview script comprised 8 questions with follow-up questions as appropriate (Table 2). No iterative changes were made to the interview script during data collection.
Data Collection Procedures
One member of the research team (K.G.) conducted each of the audio-only interviews by Zoom meeting (Zoom Video Communications Inc) between November 2019 and March 2020. At the onset of the interview, participants provided verbal informed consent, age, years of experience, gender, and time since graduation. The interviewer then continued with the 8-question semistructured interview (Table 2). Throughout each interview, the interviewer engaged in reflexivity by jotting notes to address their own biases and assumptions,14 specifically to acknowledge that the learning activity could yield both benefits and strain for the learners. Interviews ranged from 13 to 36 minutes (average, 26 ± 8 minutes). At the completion of the interview, the audio files and transcripts were downloaded. The transcriptions were deidentified by the interviewer and saved to a secured cloud server. This ensured participant protection as well as an unbiased data analysis process, as a member of the coding team was also the instructor of record. Participants were sent their transcript to verify its accuracy.
We used the consensual qualitative research (CQR) method to analyze the interviews in this study. Three members of the research team (L.E.E., E.R.N., K.G.) completed the coding process (Figure 2). Following the CQR tradition, in phase 1 each member of the coding team reviewed 4 transcripts individually while taking notes on common core ideas throughout the interviews. The coding team then met, discussed the findings, and created the preliminary codebook. For phase 2, the research team took 2 transcripts from the first phase and 2 new transcripts to review the validity of the preliminary codebook. The team met and made minor revisions to the codebook, thus creating the consensus codebook. In phase 3a, each member of the coding team independently coded 4 unique transcripts for the domains and categories within the consensus codebook. The team then exchanged transcripts for phase 3b to review the previous coding.
A consensus meeting was then held, where all codes were confirmed with at least two-thirds agreement. We used an external auditor (Z.K.W.) to review the consensus codebook against a sample of transcripts. Categories were assigned as general if identified in 11 cases, typical if identified in 6 to 10 cases, variant if identified in 3 to 5 cases, and rare if identified in 2 or fewer cases. To address researcher reflexivity, the research team used the CQR process to minimize biases that may have emerged from their perspectives on simulation-based learning.14 The research team also executed other reflexive qualitative behaviors such as member checking and multianalyst triangulation and limited biases between interviewer and interviewee by ensuring no previous relationship (teaching, facilitating the simulation, or otherwise).14 Deidentified excerpts from the participants' interviews are presented to verify the data.
Three domains emerged from the data (Figure 3): (1) emotional reaction, (2) improvements to practice, and (3) value of debriefing. These domains were further characterized by categories detailed in Table 3.
Participants described having an immediate emotional reaction to the simulation experience (Table 4). They attributed the realism of the encounters to the effort they made to ensure the experience felt real for their classmates, an effort that was reciprocated in the simulations they engaged in as learners. On occasion, participants indicated that the simulations felt less realistic when they recognized the actors or their classmates or were assigned duties that did not align with their skillsets, although this was avoided as much as possible. The simulations were planned consecutively over the course of the day, and as such, participants described feelings of both cognitive and emotional overload. Although students rarely participated in more than 1 simulation as a learner and delivered only 1 simulation of their own, they often described an emotional energy drain at the conclusion of the learning activity. Some of this overload was described as reinforcing the connections the participants made during the debriefing experiences. During the simulations, the participants were pleased when they were able to maintain composure but acknowledged how difficult that was when there were multiple patients and complicating environmental factors that enhanced the fidelity of the simulations.
Improvements and professional development were a direct result of the simulation curation and participation. Participants noted a mindset shift. Some were able to see the immediate effect, as their classmates immediately implemented knowledge, skills, and abilities that had been identified as deficiencies in the previous simulation debriefing. In addition, participants described a direct translation of the activity as being emotionally prepared for future incidents, specifically when preparing for death and dying within their practice. From a skill perspective, participants found themselves better at triaging patients in an emergency scenario. Logistically, participants noted that the experience prepared them for emergency planning. Some noted that after the experience, they saw simulation as a way to assess deficits in their teams and plan for improvements. Interestingly, collaboration through effective communication was critical to success in these chaotic emergency simulations; however, only half of the participants identified collaboration and communication as a continued improvement in their practice. One participant also saw value in curating the simulation in teams. Specifically, the simulation activities prepared participants to use those skills to teach others, both in practical and academic environments.
Participants collectively indicated that the most effective parts of the experience occurred during each debrief. The debriefing was humbling for all participants, where they described having this critical time for reflection. Many felt the debriefing brought them closer to their classmates. The debriefing allowed the participants to be vulnerable and verbally recognize their deficiencies, while also supporting their classmates when they recognized their own errors. This process had a lasting effect on some participants as they were able to replicate this kind of experience when teaching others. Participants appreciated the debriefing experiences because they allowed them time and space to emotionally decompress and prepare for the future in a safe space.
Our findings suggest that large-scale simulation can lead to immediate and long-term learning benefits for practice and that reflection was likely the most effective component for creating change in future practice. Debriefing after a simulation-based learning experience helped students process what they had just experienced, discuss skills and concepts that were deficient, and build a sense of community. Participants also discussed how these experiences allowed them to be prepared for future similar situations and to share this knowledge with others.
Experiential and Reflective Learning
Experiential learning is an active learning process by which the learner constructs new knowledge by linking new experiences with previous knowledge. Simulation-based learning is an extension of this pedagogical method and is used to “replace or amplify real experience with guided experiences.”15 Literature suggests that experiential learning is more likely to produce behavior and attitude changes than didactic modes.16 Our participants discussed many ways that these experiences changed their patient care, pre-event rituals, emotional readiness for emergent situations, and teamwork.
One key element that enhances experiential learning is fidelity: creating an experience for the learner that is believable and reflects reality.17 Fidelity is important when the learning outcome is to translate knowledge from the created experience into clinical practice.18 Our participants described that fidelity was high in most simulations, creating an urgency to act and an emotional reaction to the scenario. Participants also indicated that fidelity diminished if they recognized the actors or if the actors had a different skillset than the learners knew them to have. Low-fidelity simulation may increase skill development immediately, but long-term skill retention is lower.18 The participants in our study were able to articulate knowledge, skills, and abilities they learned directly from the curation of and participation in large-scale simulation 2 or more years after the experience. This finding supports the idea that high-fidelity simulations have a long-term effect on clinical practice.
Simulation-based learning is ideal for providing hands-on, risk-free patient encounters to teach specific skills that students are likely to encounter in the real world. Our participants discussed both the short- and long-term effects on clinical practice as a result of curating and participating in these simulations. Short-term effects of simulation are largely skill-based improvements, including gains in technical skills, knowledge, self-confidence, and clinical competency.7,19 Similarly, our participants articulated skills and abilities they learned immediately and could transfer to the next simulation on that same day. Skill transfer has also been identified as a direct benefit after a simulation experience; specifically, anesthesia trainees performed better after simulation than participants receiving interactive seminars alone.9 Our participants stated that they could use skills immediately in their clinical practice, but most of the skills they discussed related to long-term effects on their clinical practice.
The long-term effects identified by participants included improved interpersonal skills, health care competency, and emotional readiness. Simulation-based education can help prevent skill decay.10,20–23 Specifically, some learners have noted that although some skill decay occurred, their end point remained better than their baseline before the simulation.10 After being exposed to simulation-based learning in a professional (precertification) athletic training degree program, those learners spoke to the personal and professional development they gained from engaging in simulation.10 The findings from that study,10 as well as our current findings, are novel within the health care literature. Previous work on long-term simulation effects has primarily focused on skill performance and retention, whereas our participants noted long-term benefits in soft skills. Previous literature found "professionalism" to be among the only soft skills retained in the long term as a result of simulation.10,24 Our participants not only spoke about emotional readiness for an emergent situation but also described being prepared because they were able to recall and reflect on previous poor performance, certainly a characteristic of professionalism. In our study, skill acquisition and retention appeared to relate to decision-making in those difficult situations and less so to the specific health care delivery. This may be a result of the kind of learner, as most of the previous research has focused on professional-level learners.
Debriefing was described by our participants as one of the most important parts of the learning experience. This peer-led communal discussion allowed them to reflect on the events that took place in relation to the learning objectives and critically assess their performance as health care providers as well as their performance as collaborating clinicians. Participants described the debrief as a learning environment that was mutually beneficial for their professional growth and improved their relationships with their peers. They expressed how hearing their peers acknowledge errors led to their own willingness to be vulnerable. Effective debriefing can lead to enhanced learner performance and specific clinical skill acquisition.25,26 The literature suggests that most of the learning during simulation comes from either the debriefing or the reflective learning.27 Previous research notes that debriefing can lead to a closer relationship between facilitators and participants of the debrief,28 and in the case of our participants, the peer-led debriefing created closer relationships with those peers.
Although the experience itself is the catalyst for future learning, constructive and meaningful reflection is the fuel for future change in performance. In experiential learning, Kolb theorized that after reflection, learners could construct new hypotheses and eventually test them in similar situations.1 The participants in our study had multiple simulations throughout the day and could take the hypotheses created from previous experiences and immediately apply them to the next. Exposure to multiple high-fidelity simulations has led to students feeling more self-confident in their skills, improving clinical competence, identifying knowledge deficits, and implementing knowledge into practice in the short term.29 However, in testing conditions, such as an objective structured clinical examination (OSCE) experience, with self-reflection instead of facilitated or group debriefing, student performance did not improve between encounters.30 Effective debriefing requires participants who are willing to be self-aware, open to feedback, communicative, and vulnerable.28 Dialogue that promotes an exchange between learners can result in long-term benefits to clinical practice, ultimately contributing to learning. Because of the importance of the debriefing and reflective piece, it is essential that facilitators, including peer-to-peer facilitators, be well versed in best practices for leading the debriefing process.12,13,31
Experiential learning and reflection focus heavily on the individual learner, and the theories often do not account for the social nature of the learning experience. Social constructivism is an active learning theory by which learning occurs through experience but is contextualized by social influences.32 The key assumption of the theory is that individuals learn from one another and champion their own skills and abilities to support other learners' academic achievement.32 Research has suggested that for debriefing to be effective, the learning environment must be judgment-free so that learners can be appropriately reflective and willing to share their experiences.33 This judgment-free learning environment minimizes fear and distress and allows students to become more reflective and more likely to share their experiences.33 Our participants described that their peers were influential in their emotional state throughout the day. They described learning from each other's mistakes, celebrating each other's successes, and discussing ways to improve, while acknowledging that these kinds of experiences are likely to yield an emotional response.
Participants in this study were all enrolled in and graduated from the same doctoral program with which the researchers are all affiliated. While their opinions and testimony are uniquely their own, their responses could have been biased by this shared affiliation. In addition, the experience of curating, implementing, and reflecting on the large-scale simulation may not be generalizable to other clinicians and/or those engaging in continual professional development without a structured doctoral program to guide the process. The results suggest promising evidence that simulation and reflection may lead to long-term benefits in clinical practice. Therefore, we suggest that future research use a similar structure in other health care programs and throughout continuing medical education as a form of professional development to explore the active learning method of simulation.
Two to 3 years after curating and participating in a large-scale simulation experience, advanced-practice clinicians still recalled their emotional reactions, identified areas of improvement in their practice, and spoke to the value of debriefing as part of the experience. Large-scale simulation can stimulate real feelings that include feeling overloaded and a need to maintain composure. Although in some disciplines creating this intense environment may not be warranted, a profession like athletic training, which requires competence in immediate and emergency care, needs high-fidelity emergency simulations to replicate the potential experience for learners. Participants described long-term improvements to their practice, suggesting there is a lasting effect of simulation well beyond the immediate experience. Debriefing is a critical component of simulation, and participants appreciated the opportunity to emotionally decompress and learn from their mistakes alongside classmates. Participants were able to learn humility from being able to discuss their errors and plan for improvements through the debriefing process.