It’s Friday afternoon. A continuity clinic preceptor looks at their intern’s electronic health record (EHR) inbasket. They see 150 unread messages, including many unaddressed phone calls and results. They panic, thinking about what critical results may have been missed.
Meanwhile, the intern receives a page from their clinic preceptor about their inbasket. They panic, thinking about the messages they are uncertain how to address. Inbasket management has been a huge burden this year. They don’t know where to start.
Does this scenario feel familiar? EHR workload significantly contributes to burnout among medical trainees.1-3 As educators, we must address the need to train residents to interact with the EHR proficiently and efficiently, as these skills are critical to their future success and well-being. In addition, competency in digital health, including within the EHR, is included in the Accreditation Council for Graduate Medical Education Milestones.4 In 2021, 20% of internal medicine program directors self-reported awareness of adverse events or near misses arising from unsupervised resident inbasket decisions.5 Fellowship programs have also struggled with the oversight of trainee inbasket management.6 Despite this, little has been published on how to supervise inter-visit care (care provided between visits), and very few programs have a structured monitoring process.5 EHR user-level performance metrics (eg, from Epic Signal or Cerner Lights On Network), which quantify time spent in inbasket activities, have been suggested as a way to identify struggling residents and offer personalized training.5,7 While these objective performance metrics offer an opportunity for enhanced supervision and a way to provide actionable feedback for EHR skill improvement, we worry about how trainees will receive them. We are concerned that if these performance data are presented poorly or interpreted out of context, they may worsen resident anxiety, reduce internal motivation, and ultimately increase the risk of burnout.
How Do We Train Residents to Use the EHR?
EHR training after initial onboarding is highly variable by institution, and there are few studies to guide EHR training within graduate medical education.8 Some programs have created more robust curricula to improve EHR proficiency using regular mentorship.9 This approach is supported by prior research suggesting that EHR training should be longitudinal, one-on-one, and tailored to learner needs.2,5,9 We implemented a similar training model at our institution, yet faculty evaluations continue to identify struggling residents who cite inbasket management as a major stressor. A survey of our residents revealed that 45 of 55 (82%) feel that managing their inbasket leads to burnout. Improved EHR training, particularly training that improves efficiency, has been shown to reduce rates of burnout among learners.10,11 To better mentor residents toward proficient and efficient inbasket management, we decided to incorporate individual performance data into continuity clinic feedback meetings.
Learning Analytics to Improve Feedback on Inter-Visit Care
Learning analytics is the process of using data to provide learners with individualized feedback.12,13 EHR analytics programs, such as Epic Signal or Cerner Lights On Network, access user-level data to inform physicians about their EHR utilization, including efficiency metrics (Figure 1). These programs have been used to guide one-on-one training for physicians on their EHR processes to improve efficiency.14 User data have also been used to identify risk of burnout using metrics such as “pajama time” (time spent in the EHR outside of business hours).15 Trainee user data can be used similarly to help residency programs identify residents struggling with inter-visit care and offer personalized coaching. EHR user data offer great potential to improve trainee digital competency through individualized training and increased trainee supervision. However, because these data are derived from cursor movements and clicks, they should be interpreted as an imperfect view of resident behavior. We worry that these performance data could be misused for summative evaluation and provoke learner self-criticism, which may worsen, rather than reduce, rates of burnout among trainees.
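To make a metric like pajama time concrete, the sketch below shows one way such a measure could be derived from audit-log timestamps. This is a minimal, hypothetical reconstruction: the business hours, the 5-minute activity bins, and the audit-log format are our assumptions, and vendor platforms such as Epic Signal compute their metrics with their own proprietary heuristics.

```python
# Minimal sketch: deriving a "pajama time" metric from EHR audit-log
# timestamps. The business hours, 5-minute activity bins, and audit-log
# format are illustrative assumptions; vendor platforms (eg, Epic Signal,
# Cerner Lights On Network) use their own internal heuristics.
from datetime import datetime

BUSINESS_START_HOUR = 8   # assumed clinic day: 8 AM ...
BUSINESS_END_HOUR = 18    # ... to 6 PM, Monday through Friday

def pajama_minutes(event_times: list[datetime], bin_minutes: int = 5) -> int:
    """Estimate minutes of EHR activity outside business hours.

    Each distinct 5-minute window containing at least one after-hours
    audit event is counted as 5 minutes of active EHR use.
    """
    after_hours_bins = {
        (t.date(), t.hour, t.minute // bin_minutes)
        for t in event_times
        if t.weekday() >= 5  # Saturday/Sunday count as after-hours
        or not (BUSINESS_START_HOUR <= t.hour < BUSINESS_END_HOUR)
    }
    return len(after_hours_bins) * bin_minutes

# Example: events at 9:02 PM, 9:03 PM, and 9:30 PM on a Monday fall into
# two distinct 5-minute bins, so the estimate is 10 minutes.
events = [datetime(2024, 3, 4, 21, 2), datetime(2024, 3, 4, 21, 3),
          datetime(2024, 3, 4, 21, 30)]
print(pajama_minutes(events))  # 10
```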
Features of EHR Data Analytics Programs
Precautions and Lessons Learned on Using EHR Performance Data
To standardize how EHR user data are framed to trainees, institutions may wish to identify a faculty member as an EHR analytics “champion.” This gives the program a single point person who can access the data and receive advanced training on the analytics program, which matters because data interpretation can be nuanced. Since the goal is to benefit resident well-being and learning, we recommend that faculty involve residents in the implementation of the process. We recruited a resident for the curricular redesign team and held informal focus groups, in which residents indicated that EHR analytics data should be shared one-on-one by a trusted longitudinal mentor. Next, we trained existing clinic mentors to interpret EHR analytics data and to communicate this feedback to residents as an opportunity for growth. We held a case-based training session for the mentors emphasizing key learning goals. We created resident “phenotypes” to help faculty more easily interpret the data (Figure 2). The training emphasized that data should be presented as formative information that identifies areas for growth, with the ultimate aims of improving residents’ quality of life and enhancing patient safety.
EHR Analytics Resident Phenotypes
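Figure 2 itself is not reproduced in the text, but the sketch below illustrates how rule-based phenotypes of this kind could be encoded. The phenotype names, the 1.5-times outlier threshold, and the coaching notes are invented for illustration; a program’s actual phenotypes should come from faculty consensus.

```python
# Hypothetical sketch of rule-based resident "phenotypes" in the spirit
# of Figure 2. The names, the 1.5x outlier threshold, and the coaching
# notes are illustrative assumptions, not the phenotypes in the figure.
from dataclasses import dataclass

OUTLIER = 1.5  # assumed: >1.5x the peer average flags a metric

@dataclass
class InbasketMetrics:
    inbasket_min_per_day: float
    notes_min_per_clinic_day: float
    result_turnaround_days: float

def phenotype(resident: InbasketMetrics, peers: InbasketMetrics) -> str:
    """Map a resident's metrics, relative to peers, to a coaching label."""
    slow_inbasket = resident.inbasket_min_per_day > OUTLIER * peers.inbasket_min_per_day
    slow_notes = resident.notes_min_per_clinic_day > OUTLIER * peers.notes_min_per_clinic_day
    slow_results = resident.result_turnaround_days > OUTLIER * peers.result_turnaround_days
    if slow_results:
        return "delayed responder: results linger; explore volume, avoidance, or workflow"
    if slow_inbasket and slow_notes:
        return "thorough but inefficient: timely care at a high time cost; coach efficiency tools"
    if slow_inbasket or slow_notes:
        return "focal inefficiency: target the one slow workflow"
    return "on track: reinforce current habits"
```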
The mentor-resident feedback sessions should be conversational, not unidirectional. The data should not be presented judgmentally (“good” or “bad”) or comparatively (“better” or “worse than other residents”). Disclaimers about the data’s limitations should be given both to those delivering and to those receiving the feedback (Figure 1). The data must also be interpreted in context. For instance, a resident with a slow result turnaround time may be keeping certain results in their inbasket for quality improvement projects or as follow-up reminders despite having already addressed the results. When delivering feedback, faculty should offer specific ways residents can improve (eg, use efficiency tools, reduce overdocumentation, use ancillary support effectively). The Box offers an example of how we use EHR user data to give residents feedback on their inbasket. Programs may offer residents opportunities to gain further insight into their personal data through the faculty champion or peer mentoring. Programs could also create optional asynchronous modules to introduce efficiency tools. Spaced delivery of feedback helps residents see their process improvement over time and encourages a growth mindset. After implementation, programs should periodically assess how residents perceive the use of their data; we plan to survey residents on self-perception of inbasket management and burnout. Clinical competency committees should not use raw EHR analytics data, given their inaccuracies and the contextual understanding they require. However, programs may uncover professionalism or serious patient safety issues, which could be escalated to program leadership.
Box. Resident Case Example of Feedback Using EHR Analytics
A PGY-2 internal medicine resident is scheduled to meet their outpatient longitudinal mentor for their regular biannual feedback meeting. Beforehand, the faculty mentor reviews the resident’s EHR analytics metrics (listed below; a sketch after this list shows one way these comparisons could be tabulated):
Time in Inbasket: 20 minutes per day, compared to average 8 minutes per day among peers
Time in Results: 6 minutes per day, compared to average 2 minutes per day among peers
Time in Notes: 100 minutes per clinic day, compared to average 50 minutes per day among peers
Result Turnaround Time: 2 days, compared to average 5 days among peers
Inbasket Messages Received: average number among peers in all categories
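A minimal sketch of how a faculty champion might tabulate these resident-versus-peer comparisons before the meeting, using the numbers above; the 1.5-times discussion threshold is an illustrative assumption, not a validated cutoff:

```python
# Sketch: tabulate the resident-vs-peer comparisons from the Box above.
# The 1.5x discussion threshold is an illustrative assumption.
box_metrics = {
    # metric: (resident value, peer average)
    "time in inbasket (min/day)": (20, 8),
    "time in results (min/day)": (6, 2),
    "time in notes (min/clinic day)": (100, 50),
    "result turnaround (days)": (2, 5),
}

for name, (resident, peer) in box_metrics.items():
    ratio = resident / peer
    note = "discuss at meeting" if ratio > 1.5 else "near or below peer average"
    print(f"{name}: {resident} vs {peer} (x{ratio:.1f}) -> {note}")
```

Run on these numbers, the three efficiency metrics are flagged for discussion while result turnaround (0.4 times the peer average) is not, matching the conversation described below.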
The faculty mentor and the resident discuss the EHR analytics data together. The mentor reviews the pitfalls of the data and emphasizes that they are not meant to be evaluative but to help the resident grow and prepare for their career. Together they conclude that efficiency is the largest area for improvement. They also note some outlier metrics that coincide with a period when the resident had a family emergency. They develop a plan together to improve inbasket and clinic note efficiency using specific tools, such as quick actions and dot phrases.
At their next meeting, the resident is much more satisfied with their inbasket management. They review the EHR analytics data again, which show reduced time spent in the inbasket, greater use of efficiency tools, and less time spent on clinic notes. The mentor has observed no decline in the quality of the resident’s clinic notes and no patient safety concerns with the more efficient inbasket management. The resident is thankful for the specific feedback and proud of their improvements.
Conclusion
EHR user data present a great opportunity for residency programs to enhance individual trainee education and experience. While one of our goals is to improve resident well-being, this feedback process may secondarily improve inbasket supervision and patient safety. However, it is critical to avoid introducing these data as punitive, because doing so may worsen resident well-being, as has been seen with other competency-based resident assessments.16,17 Using performance data for assessment may increase learner anxiety through fear of digital surveillance and curtail internal motivation. To avoid these pitfalls, programs with similar initiatives should frame them as formative feedback rather than summative evaluation, involve residents in the implementation process, and periodically reevaluate how individual data are being utilized. To support the well-being of our residents and prepare them to safely manage patient care in an EHR-centric health care system, it is imperative that residency programs establish data-driven, individualized, and formative feedback sessions on inbasket management with supportive, longitudinal mentors.
The authors would like to thank Jason Ojeda, MD, at Thomas Jefferson University, Philadelphia, Pennsylvania, USA, for assisting in the conceptualization of Figure 2.