Abstract
Over the past decade, regulatory bodies have heightened their emphasis on health care quality and safety. Education of physicians is a priority in this effort, with the Accreditation Council for Graduate Medical Education requiring that trainees attain competence in practice-based learning and improvement and systems-based practice. To date, several studies on resident education in quality and safety have been published, but no comprehensive interdisciplinary curricula seem to exist. Effective, formal, comprehensive cross-disciplinary resident training in quality and patient safety appears to be a vital need.
To address the need for comprehensive resident training in quality and patient safety, we developed and assessed a formal standardized cross-disciplinary curriculum entitled Quality Education and Safe Systems Training (QuESST). The curriculum was offered to first-year residents in a large urban medical center. Preintervention and postintervention assessments and participant perception surveys evaluated the effectiveness and educational value of QuESST.
A total of 138 first-year medical and pharmacy residents participated in the QuESST course. Paired analysis of preintervention and postintervention assessments showed significant improvement in participants' knowledge of quality and patient safety. Participants' perceptions about the value of the curriculum were favorable, as evidenced by a mean response of 1.8 on a scale of 1 (strongly agree) to 5 (strongly disagree) that the course should be taught to subsequent residency classes.
QuESST is an effective comprehensive quality curriculum for residents. Based on these findings, our institution has made QuESST mandatory for all future first-year resident cohorts. Other institutions should explore the value of QuESST or a similar curriculum for enhancing resident competence in quality and patient safety.
Background
In 1999, the Institute of Medicine issued its landmark document To Err Is Human. It outlined a crisis in American health care and revealed that almost 100 000 annual deaths were attributable to medical error in a system “detrimental” to safe patient care.1 During the subsequent decade, much work has been done to improve health care quality and safety, led by such organizations as the National Quality Forum, the Institute for Healthcare Improvement, the Agency for Healthcare Research and Quality, and the Joint Commission. While much of this work has focused on systems improvements in the delivery of care, the education of practitioners has also been recognized as a priority.2,3 In the area of resident education, the Accreditation Council for Graduate Medical Education requires programs to ensure that residents receive training in quality and safety through competency in systems-based practice and practice-based learning and improvement.4 To date, a few focused curricula in quality improvement have begun to appear; however, these tend to target specific quality issues such as hand hygiene and handoff communication or are designed for specific residencies or ambulatory settings.5–14 Consequently, most residents must still acquire the bulk of their patient safety knowledge through informal education in the hospital setting because formalized comprehensive training curricula in quality and safety are, at best, uncommon.
Recognizing the need for a comprehensive systemwide cross-disciplinary curriculum for quality and patient safety and finding a lack of available standardized curricula in the literature, a team of leaders in quality and education at the Detroit Medical Center and Wayne State University School of Medicine collaborated to create such a curriculum. The design, implementation, and assessment of the resultant curriculum, Quality Education and Safe Systems Training (QuESST), are described herein.
Methods
A multidisciplinary team made up of hospital and system-quality officers and graduate medical education (GME) officers with collective expertise in quality and patient safety, GME, and instructional design and assessment collaborated to create the QuESST curriculum. The curriculum covered the following broad topics: introduction to quality and safety, communication and teamwork, handoff communication, error identification, analysis and disclosure, and human factors engineering. The curriculum incorporated multiple educational modalities, including didactic sessions, large-group facilitated discussions, and small-group breakout sessions. table 1 provides a summary of the QuESST curriculum. The team designed the QuESST curriculum specifically for first-year residents across all the medical residencies at the Detroit Medical Center, as well as the pharmacy and dental residencies. The curriculum is also appropriate for residents in later years of training.
Delivery of the QuESST curriculum required a 4-hour session and was offered on 4 occasions. The GME leadership mandated that all first-year residents attend 1 of the sessions. (In addition, we offered the curriculum to the pharmacy and dental residency programs because of its potential appropriateness for their education. Their leadership agreed with its potential benefit and mandated that their first-year residents attend as well.) Two of 4 sessions occurred on weekday evenings, and 2 occurred on Saturday mornings to accommodate varied schedules across specialties. The 4 sessions were identical in content; however, after gaining experience from the first session, the authors made some minor structural adjustments to increase participant interaction during didactics and to add small-group educational components in the subsequent sessions.
To formally evaluate the QuESST curriculum, the team implemented a quantitative quasiexperimental study design. At the beginning of each session, participants were asked to complete a background and demographic survey and an 8-question multiple-choice or true-false preintervention general quality assessment (pretest) designed to evaluate participants' knowledge of quality and patient safety. The 8 questions covered the main topics and objectives of the curriculum. While attendance at the QuESST course was mandatory, participants were informed that completion of the survey and the pretest was voluntary and anonymous. To maintain anonymity but to allow for pairing of preintervention and postintervention assessment data, participants were asked to provide the following information at the pretest: (1) “first letter of your mother's maiden name,” (2) “number of the month of your birthday,” and (3) “the first 2 numbers of your home address.”
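The self-generated pairing key described above can be sketched as follows. This is a hypothetical illustration, not the authors' actual implementation; the function name `pairing_key` and the sample records are invented for the example.

```python
# Hypothetical sketch of the self-generated pairing key used to match
# anonymous pretest and posttest records for the same participant.
def pairing_key(maiden_initial: str, birth_month: int, address_prefix: str) -> str:
    """Combine the 3 self-reported items into one anonymous key."""
    return f"{maiden_initial.upper()}-{birth_month:02d}-{address_prefix}"

# Invented example records: key -> assessment score (out of 8).
pretests = {pairing_key("S", 7, "48"): 4}
posttests = {pairing_key("s", 7, "48"): 7}  # same participant, posttest

# Only participants appearing in both collections enter the paired analysis;
# unmatched records are excluded, as described in the Methods.
paired = {k: (pretests[k], posttests[k]) for k in pretests if k in posttests}
print(paired)
```

Because the key is derived from stable personal facts rather than a name or ID, the same participant regenerates the same key at the posttest without any identity being recorded.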
Following the intervention, participants were asked to complete the identical 8-question general quality knowledge assessment again (posttest) and a course evaluation. Participation in the posttest and course evaluation was again voluntary and anonymous. The same pairing-identifier data were obtained at the posttest as at the pretest to allow for pairing of data. The course evaluation did not include any identifier questions. The course evaluation requested that participants indicate their level of agreement to 12 statements on a 5-point sliding numerical scale in which 1 represented “strongly agree” and 5 represented “strongly disagree.”
All participant responses were collected and included in calculations. The data were analyzed using Microsoft Excel (Microsoft Corporation, Redmond, CA) for summary purposes and Statistical Analysis Software (SAS) version 9.2 (SAS Institute Inc, Cary, NC) for statistical calculations. For demographic data and the course evaluation responses, means and standard deviations were calculated. For the pretests and posttests, we calculated the percentage of answers correct for each question. We used the McNemar χ2 test for matched data to identify improvements in specific questions for each learner after taking the course. Overall knowledge was compared by calculating the total number of answers correct on both the pretest and posttest and by then comparing these totals using the Wilcoxon signed rank test. If a participant did not complete both the pretest and the posttest, his or her knowledge assessment data were excluded from the paired statistical analysis. The study qualified for exemption of approval by the Wayne State University Human Investigation Committee (protocol 0912007851).
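The paired analysis described above can be sketched in Python on simulated data. This is a minimal illustration of the statistical approach, not the authors' SAS code: the data are randomly generated, and the per-question McNemar test is computed in its exact form via a binomial test on the discordant pairs.

```python
# Minimal sketch of the paired pre/post analysis, on simulated data.
# Answers are coded True (correct) / False (incorrect) per question.
import numpy as np
from scipy.stats import binomtest, wilcoxon

rng = np.random.default_rng(0)
n = 120  # participants who completed both pretest and posttest

# Simulated responses: posttest answers more likely correct than pretest.
pre = rng.random((n, 8)) < 0.5
post = rng.random((n, 8)) < 0.7

# Exact McNemar test per question: only discordant pairs are informative.
for q in range(8):
    b = int(np.sum(pre[:, q] & ~post[:, q]))   # correct pre, wrong post
    c = int(np.sum(~pre[:, q] & post[:, q]))   # wrong pre, correct post
    p = binomtest(b, b + c, 0.5).pvalue        # exact McNemar p-value
    print(f"Q{q + 1}: b={b}, c={c}, p={p:.4f}")

# Wilcoxon signed rank test on total scores (overall knowledge change).
stat, p_overall = wilcoxon(pre.sum(axis=1), post.sum(axis=1))
print(f"overall: W={stat:.1f}, p={p_overall:.4g}")
```

The McNemar test conditions on the b + c participants whose correctness changed between tests, which is why identical pre/post answers contribute nothing to the per-question p-value.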
Results
Of 174 eligible first-year residents, 138 participated in the QuESST course (79.3%). Of these, 51, 38, 29, and 17 participated in the first, second, third, and fourth sessions, respectively. One hundred thirty-five (97.8%) completed the background and demographic questionnaire. The data are summarized in table 2.
The mean score for 132 participants (95.7%) who completed the pretest was 52.6%, and the mean score for 126 participants (91.3%) who completed the posttest was 71.2%. Our study found improvement in participant performance on each of the individual questions after the QuESST intervention. One hundred twenty participants (87.0%) completed both pretest and posttest and were included in the paired data comparison. Analysis of the paired data revealed that the observed improvement was statistically significant for the overall score and for 7 of 8 individual questions on the knowledge assessments. table 3 summarizes these results.
Of 138 participants, 125 (90.6%) completed course evaluations. In general, the participants reacted favorably to all aspects of the course. Specifically, they scored 1.8 on a scale of 1 (strongly agree) to 5 (strongly disagree) that the course “should be taught again (allowing for improvement).” Details of the agreement score results from the course evaluation are given in table 4.
Discussion
The QuESST curriculum was well received by participants, who gave the course favorable evaluations. Participants reported that it improved their understanding of general and specific topics of quality and patient safety and indicated that the curriculum should be taught to subsequent residency classes. Based on this feedback, we intend to repeat the course on a yearly basis to all incoming first-year residents.
There was a trend toward improved evaluation scores from the first session through the fourth session. Based on our experience, we believe that 3 factors may have contributed to this trend. First, the course instructors inevitably gained experience with each session. Second, we made some minor adjustments to the curriculum after the first session to include more small-group activities and to increase participant interaction during larger didactic sessions. We found exercises such as large-group discussion of a video showing errors in communication and small-group analysis of vignettes of medical errors to be highly effective teaching techniques. Didactics remained important to efficiently meet the course objectives, but the interactive components improved participant interest and engagement. Third, the progressively smaller number of participants in later sessions may also have contributed to improved evaluation scores. Based on our subjective observations during the implementation of the curriculum, smaller group size seemed to correlate with increased participant engagement. While the study design was not intended to provide objective evidence for a formal conclusion on optimal group size, it seemed that approximately 30 or fewer participants promoted maximal participant engagement. In addition to the favorable results of the course evaluations, the improved performance of the participants on their knowledge assessments following the QuESST course provides evidence for the effectiveness of the QuESST curriculum.
Our study has 2 limitations that warrant discussion. First, the pretest and posttest tools were identical, and this may have allowed participants to learn from the assessment tool itself. We chose this method to avoid the confounding that would arise from administering 2 different tests before and after the QuESST intervention. Randomizing participants to 2 groups that take different knowledge assessments in opposite order (1 group would complete assessment A before QuESST and assessment B after, and the other group would complete assessment B first and assessment A after) may reduce the potential for instrument reactivity, and we are considering this measurement approach in subsequent years. Second, the time frame of the pretest and posttest methods did not allow for assessment of long-term retention. We chose a short-term experimental design to ensure optimal participant response rates on the posttest assessments. Long-term follow-up posttests are being considered for future studies.
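The counterbalanced design considered above can be sketched as a simple randomization step. This is a hypothetical illustration; the function name `assign_counterbalanced`, the fixed seed, and the participant IDs are invented for the example.

```python
# Hypothetical sketch of the counterbalanced design considered above:
# randomize participants so half take assessment A first and B after the
# course, while the other half take them in the opposite order.
import random

def assign_counterbalanced(participant_ids, seed=42):
    ids = list(participant_ids)
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    rng.shuffle(ids)
    half = len(ids) // 2
    return {
        "A_then_B": ids[:half],   # assessment A pre, assessment B post
        "B_then_A": ids[half:],   # assessment B pre, assessment A post
    }

groups = assign_counterbalanced(range(1, 139))  # 138 invented participant IDs
print(len(groups["A_then_B"]), len(groups["B_then_A"]))
```

Averaging improvement across the 2 arms cancels any difference in difficulty between the 2 instruments, which is what makes this design an alternative to reusing an identical test.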
Despite these limitations, we believe that the results support the effectiveness of the QuESST curriculum. Overall and on 7 of 8 questions, participants performed better on the knowledge assessment after the QuESST curriculum, with a high degree of statistical significance. We think that these results represent true improvement in participant knowledge and understanding of quality and patient safety.
Based on our experience and on subjective feedback from participants, it is apparent that the small-group educational format was highly effective. In addition, our experience suggested that residents might benefit from additional focused training in a few of the more important quality topics introduced in the 4-hour QuESST course. These topics included core measures and data transparency, medication safety, infection control, transitions of care and patient-centered care, and resident personal experiences with patient safety and medical error. It is our intention to add 5 small-group sessions covering these topics (1 per month over 5 months) to further enhance the QuESST curriculum. The plan is to require that residents attend 2 of the 5 sessions; if the sessions prove effective, a greater proportion will become mandatory in future years. We intend to divide the 174 first-year residents into mentoring groups of 15 to 20, with a consistent mentor for all 5 sessions.
The favorable results of the QuESST evaluation and the pretest and posttest comparison provide strong indication of the effectiveness of this type of training. We acknowledge that a randomized trial correlating the QuESST intervention with clinical outcomes would be ideal. However, correlation of quality and patient safety training to clinical outcomes has proven to be difficult in multiple studies.15,16 We believe that the combination of the results from the evaluation and the improved scores on the knowledge assessments provides sufficient evidence to support establishing the QuESST curriculum as a regular yearly course for first-year residents at our institution, and we recommend that other teaching centers adopt similar curricula.
Conclusions
The QuESST curriculum appears to be the first formalized comprehensive cross-disciplinary resident quality and patient safety curriculum to have been studied to date. Although multiple oversight entities, including the Accreditation Council for Graduate Medical Education, have cited the importance of quality and patient safety training, most physician educational systems depend on informal, sporadic, and, at best, noncomprehensive training opportunities for residents in this area of essential knowledge. The success of QuESST, as judged by improved scores on a general quality knowledge assessment and by favorable participant evaluations of the curriculum, is highly encouraging. Further testing for long-term knowledge retention is required. However, we are so encouraged by the results of this study that we intend to make the QuESST curriculum mandatory early in the first year of training for every resident at our institution from this year forward. In addition, we believe that QuESST could be useful as a template for other institutions to develop their own similar curricula. Consequently, we are considering implementing methods to assist other institutions with this process.
References
Author notes
Martin A. Reznek, MD, MBA, is Director of Clinical Operations and Assistant Professor at the Department of Emergency Medicine, UMass Memorial Medical Center, Worcester, Massachusetts. Dr Reznek was formerly Vice President of Quality and Patient Safety, Detroit Receiving Hospital, and Assistant Professor, Department of Emergency Medicine, Wayne State University School of Medicine, Detroit, Michigan. All other authors are at Wayne State University School of Medicine. Bruno DiGiovine, MD, is Associate Chairman and Chief Quality and Safety Officer, Department of Internal Medicine, and Associate Professor, Division of Pulmonary and Critical Care Medicine; Heidi Kromrei, MA, is Academic Director, Graduate Medical Education; Diane Levine, MD, is Vice Chair for Education and Associate Professor, Department of Internal Medicine; Wilhelmine Wiese-Rometsch, MD, is Assistant Dean, Graduate Medical Education; and Michelle Schreiber, MD, is Senior Vice President and Chief Quality Officer, Detroit Medical Center, and Assistant Professor, Department of Internal Medicine.