Background

The Accreditation Council for Graduate Medical Education (ACGME) requires residency programs to monitor scheduling, work intensity, and work compression.

Objective

We aimed to create a model for assessing intern work intensity by examining patient and clinical factors in our electronic health systems using multiple linear regression.

Methods

We identified measurable factors that may contribute to resident work intensity within our electronic health systems. From December 2020 to May 2021, we surveyed interns on pediatric hospital medicine rotations each weekday over 5 blocks, asking them to rate their daily work intensity on a scale from -100 (bored) to +100 (exasperated). We queried our electronic systems to identify patient care activities completed by study participants on days they were surveyed. We used multiple linear regression to identify factors that correlate with subjective scores of work intensity.

Results

Nineteen unique interns provided 102 survey responses (28.3% response rate) during the study period. The mean work intensity score was 9.82 (SD=44.27). We identified 19 candidate variables for the regression model. The variables most significantly associated with work intensity in our univariate regression analysis were text messages (β=0.432, P<.0009, R2=0.105), orders entered (β=0.207, P<.0002, R2=0.128), and consults ordered (β=0.268, P=.022, R2=0.053). Stepwise regression produced a reduced model (R2=0.247) including text messages (β=0.379, P=.002), patient transfers (β=-1.405, P=.15), orders entered (β=0.186, P<.001), and national patients (β=-0.873, P=.035).

Conclusions

Our study demonstrates that data extracted from electronic systems can be used to estimate resident work intensity.

Considering a growing body of research linking physician workload to fatigue and burnout,1-3 the Accreditation Council for Graduate Medical Education (ACGME) updated its guidance in 2017 to require programs to monitor “trainee work intensity.”1,4 Trainee work intensity is affected directly and indirectly by various subjective, objective, patient, and clinical factors, making the concept difficult to define and challenging to assess.5-8 Existing data within electronic health records (EHRs), residency management systems, and other clinical information systems can be reexamined to understand physician behavior and, ideally, to help plan for better resource allocation and utilization.7-10 These data can also be used to identify areas for further instruction of residents, such as task prioritization or gaps in clinical exposure.6,10-14 One study examining inpatient admissions demonstrated an increase in both patient complexity and the EHR data burden over a 15-year period, suggesting an increase in the workload for trainees caring for these patients.7 Another, looking specifically at pediatric trainees, showed considerable workload variability between residents based on order entry and note documentation, but was unable to correlate these findings with subjective impressions of work intensity.6

This study looked more broadly at patient and clinical factors within the EHR and other electronic systems to identify those associated with increased work intensity, as reported by pediatric interns.

This study occurred at an urban, quaternary care, free-standing children's hospital. From December 16, 2020, to May 4, 2021, pediatric interns were surveyed on weekdays during a 4-week hospital medicine rotation, chosen to minimize work pattern variability. Surveys were distributed via an email link at the same time near the end of each day's shift. Periodic email reminders from the study team were used to encourage participation. The survey asked trainees to rate their same-day work intensity on a continuous scale from -100 (bored) to +100 (exasperated) to highlight the full spectrum of potential work intensity. At the lowest level, there may be so little to do that the intern is bored; at the highest level, the intern may be so overwhelmed that any additional task is exasperating. Interns who rated their intensity as >90 were sent a follow-up email encouraging them to utilize available support resources. Scores were bucketed into 5 groups to reduce nonresponse bias: -2 = -100 to -61; -1 = -60 to -21; 0 = -20 to 20; 1 = 21 to 60; 2 = 61 to 100.
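As a concrete illustration of the bucketing described above, the following minimal Python sketch (not the study's actual code; the function name is ours) maps a raw score to its 5-level group:

```python
# Minimal sketch of the 5-group bucketing described above (not the study's code).
def bucket_intensity(score: float) -> int:
    """Map a raw work intensity score (-100 to +100) to one of 5 buckets."""
    if score <= -61:
        return -2  # -100 to -61
    if score <= -21:
        return -1  # -60 to -21
    if score <= 20:
        return 0   # -20 to 20
    if score <= 60:
        return 1   # 21 to 60
    return 2       # 61 to 100

assert bucket_intensity(-75) == -2
assert bucket_intensity(0) == 0
assert bucket_intensity(95) == 2
```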

To identify contributing clinical factors, we queried our residency management system, vendor analytics platform, and EHR via our enterprise data warehouse for work hours, the number of pages and text messages received on work phones, and inpatient encounters in which participants documented a note, form, or order, or performed medication reconciliation (Table 1). We then gathered demographic data for those patients to identify which patient factors might contribute to work intensity, including Pediatric Complex Chronic Condition categories and diagnoses.15 Our study site serves as a referral center, so we included patient geographical region based on home address (local=within the greater Boston area; regional=within one of the 5 New England states; national=within the United States but outside of New England; and international=outside of the United States). In total, there were 19 explanatory variables: 8 patient factors and 11 clinical factors (Table 1).

Table 1

Measurable Variables of Workload Intensity
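As a rough illustration of how daily activity counts like those in Table 1 might be assembled, the sketch below aggregates hypothetical event-level extracts into one row per intern per surveyed day; the file and column names are assumptions for illustration, not our warehouse schema.

```python
# Illustrative only: build one row per intern per surveyed day from
# hypothetical event-level extracts. Names are assumptions, not the real schema.
import pandas as pd

orders = pd.read_csv("orders.csv")        # columns: intern_id, shift_date, order_id
texts = pd.read_csv("text_messages.csv")  # columns: intern_id, shift_date, message_id

# Count events per intern per surveyed day.
order_counts = orders.groupby(["intern_id", "shift_date"]).size().rename("orders_entered")
text_counts = texts.groupby(["intern_id", "shift_date"]).size().rename("texts_received")

daily = pd.concat([order_counts, text_counts], axis=1).fillna(0).reset_index()

# Attach each day's counts to the matching survey response.
surveys = pd.read_csv("surveys.csv")      # columns: intern_id, shift_date, intensity_score
analysis_df = surveys.merge(daily, on=["intern_id", "shift_date"], how="left")
```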

We used univariate followed by multivariable regression (with a decision tree to account for interactions) to identify individual factors that correlate with subjective measures of work intensity. K-Nearest Neighbors (KNN) cross-validation was used to determine the most statistically significant model. This study was deemed exempt from human subjects review by the Boston Children's Hospital Institutional Review Board.
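For readers interested in the general shape of this analysis, the sketch below approximates the workflow with ordinary least squares and 10-fold cross-validation; the column names are assumed, the stepwise selection and decision tree steps are omitted, and this is not the authors' actual code.

```python
# Approximate sketch of the regression workflow (not the authors' code).
# Column names ("intensity", "texts", "orders", ...) are assumptions.
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score

df = pd.read_csv("analysis_dataset.csv")  # hypothetical merged survey + EHR dataset

# Univariate screening: one simple linear regression per candidate variable.
candidates = ["texts", "orders", "consults", "transfers", "national_patients"]
for var in candidates:
    fit = smf.ols(f"intensity ~ {var}", data=df).fit()
    print(var, round(fit.params[var], 3), round(fit.pvalues[var], 4), round(fit.rsquared, 3))

# Multivariable model with the variables retained after stepwise selection.
reduced = smf.ols("intensity ~ texts + transfers + orders + national_patients", data=df).fit()
print(reduced.summary())

# 10-fold cross-validation of the reduced model's predictors.
X = df[["texts", "transfers", "orders", "national_patients"]]
cv_r2 = cross_val_score(LinearRegression(), X, df["intensity"],
                        cv=KFold(n_splits=10, shuffle=True, random_state=0),
                        scoring="r2")
print("Mean cross-validated R2:", cv_r2.mean())
```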

We surveyed 19 unique interns over 90 weekday shifts, with a response rate of 28.3% (102 of 360). The average number of survey responses per intern was 5.1 (SD=4.6). One intern rotated through pediatric hospital medicine twice during the study period and accounted for 27 of the total survey responses (26%), while 5 interns did not provide any survey responses. Work intensity scores ranged from -100 to 100 (mean=9.82, SD=44.27). Bucketed work intensity scores ranged from -2 to 2 (mean=0.2, SD=1.09). Three surveys prompted referral for support services. The range of responses for each intern who participated is shown in online supplementary data Figure 2.

Of the 19 explanatory variables studied, 5 were transformed because of non-normal distributions. The variables most significantly associated with work intensity in the univariate analysis were text messages received, orders entered, and consults ordered (Table 2). Multivariable analysis with stepwise regression produced a reduced model (R2=0.247, F(4, 95)=7.78, P<.001) containing the variables text messages received, patient transfers, orders entered, and national patients (Table 2). Our decision tree analysis combined the reduced model variables with a possible interaction between orders and text messages (online supplementary data Figure 1); however, KNN cross-validation (k=10) confirmed the original reduced model to be the most statistically significant, and it was selected as the final model.

Table 2

Summary of Regression Models

In this study, we successfully identified variables from the EHR and other electronic sources that are significantly associated with intern perceptions of work intensity. Our survey suggests that interns on our hospital medicine rotation experienced the full spectrum of work intensity, though individual intern responses clustered around negative, neutral, or positive work intensity (online supplementary data Figure 2). Clinical factors including orders entered and text messages received were consistently associated with increased levels of work intensity. Clinical events (including notes written), work hours, and patients seen were not significantly associated with the level of work intensity. Patient transfers to or from another hospital had a non-statistically significant negative association with work intensity. The only patient factor significantly associated with work intensity in the final model was out-of-area (national) patients, which also had a negative coefficient. Though the reason is not entirely clear, it may be that patients referred from great distances within the United States sought care for a unique condition or had a care plan already in place. Other patient factors, such as Complex Chronic Conditions, language spoken, and interpreter need, did not reach statistical significance.

Our study adds to evidence that the EHR can be used to measure trainee workload and that there is significant variation between trainees.6,9 Unlike Was et al, we were able to detect a statistically significant correlation between self-perceived work intensity and objective measures such as the number of orders entered and the number of text messages received.6,9 These data account for about 25% of the variability in subjective scores of work intensity, which is reasonable given the multitude of factors that may contribute to a trainee's experience. Models like this could provide a method for program directors to passively monitor their trainees' workload, complementing more intrusive measures such as surveys and less specific measures such as work hours.

Previous research in adults suggests that as patient complexity increased (rated using the Charlson Comorbidity Index), resident work intensity also increased;7 however, we did not find an association between Pediatric Complex Chronic Conditions and work intensity. This could be because complex patients, while difficult to manage, also offer intellectual stimulation that gives meaning to one's work and tempers intensity. Work hours did not correlate with work intensity, suggesting that it is the type of work that matters and not just the quantity, similar to findings in the nursing field.9 As our study focused on specific and measurable variables within our electronic systems, we are conscious that there may be other factors not studied that contribute meaningfully to work intensity. Importantly, these factors likely vary by clinical specialty and level of training, which may make it difficult to generalize across training programs. In addition, our study focused primarily on quantitative data in our electronic systems, which might not capture the qualitative nature of what makes clinical care intense.

This study has limitations, most notably its narrow focus on pediatric interns on a hospital medicine rotation, which means the findings may not be generalizable to other groups. The response rate was relatively low and, although a distribution of intensity scores was seen, there may be participation bias such that residents who experienced extremes of work intensity were more or less likely to fill out the survey. Similarly, because one resident represented about a quarter of the total survey responses, their perceived work intensity is overrepresented compared to others who filled out fewer or no surveys. Another limitation is that intensity scores were not measured on night shifts, weekends, and holidays, when interns cover more patients, and did not account for scheduling variability such as consecutive days worked. Lastly, while the quantity of orders and text messages was associated with work intensity, we did not examine the content of these items, which may also play an important role in contributing to work intensity. For example, an order for parenteral nutrition could have a greater impact on subjective work intensity than an order for acetaminophen.

Our study demonstrates that data from electronic health records and other electronic sources can be used to model work intensity for pediatric interns.

The authors would like to thank the Graduate Medical Education Office for their support of this project and further studies, the residents who participated in this study, as well as Benjamin Yarsky and Ashley Doherty for their assistance.

1. Wolpaw JT. It is time to prioritize education and well-being over workforce needs in residency training. Acad Med. 2019;94(11):1640-1642.
2. Patel RS, Bachu R, Adikey A, Malik M, Shah M. Factors related to physician burnout and its consequences: a review. Behav Sci (Basel). 2018;8(11):98.
3. McHill AW, Czeisler CA, Shea SA. Resident physician extended work hours and burnout. Sleep. 2018;41(8):zsy112.
4. Accreditation Council for Graduate Medical Education. ACGME Common Program Requirements Section VI with Background and Intent. Accessed October 24, 2022.
5. Fishbein D, Nambiar S, McKenzie K, et al. Objective measures of workload in healthcare: a narrative review. Int J Health Care Qual Assur. 2019;33(1):1-17.
6. Was A, Blankenburg R, Park KT. Pediatric resident workload intensity and variability. Pediatrics. 2016;138(1):e20154371.
7. Clark AV, LoPresti CM, Smith TI. Trends in inpatient admission comorbidity and electronic health data: implications for resident workload intensity. J Hosp Med. 2018;13(8):570-572.
8. Kaushal A, Katznelson L, Harrington RA. Beyond duty hours: leveraging large-scale paging data to monitor resident workload. NPJ Digit Med. 2019;2:87.
9. Womack D, Warren C, Hayes M, Stoyles S, Eldredge D. Evaluation of electronic health record-generated work intensity scores and nurse perceptions of workload appropriateness. Comput Inform Nurs. 2021;39(6):306-311.
10. Arora VM. Harnessing the power of big data to improve graduate medical education: big idea or bust? Acad Med. 2018;93(6):833-834.
11. Nagler J, Pina C, Weiner DL, Nagler A, Monuteaux MC, Bachur RG. Use of an automated case log to improve trainee evaluations on a pediatric emergency medicine rotation. Pediatr Emerg Care. 2013;29(3):314-318.
12. Bachur RG, Nagler J. Use of an automated electronic case log to assess fellowship training: tracking the pediatric emergency medicine experience. Pediatr Emerg Care. 2008;24(2):75-82.
13. Mai MV, Orenstein EW, Manning JD, Luberti AA, Dziorny AC. Attributing patients to pediatric residents using electronic health record features augmented with audit logs. Appl Clin Inform. 2020;11(3):442-451.
14. Levin JC, Hron J. Automated reporting of trainee metrics using electronic clinical systems. J Grad Med Educ. 2017;9(3):361-365.
15. Feudtner C, Feinstein JA, Zhong W, Hall M, Dai D. Pediatric complex chronic conditions classification system version 2: updated for ICD-10 and complex medical technology dependence and transplantation. BMC Pediatr. 2014;14:199.

Author notes

Editor's Note: The online version of this article contains the survey used in the study and further data.

Funding: The authors report no external funding source for this study.

Competing Interests

Conflict of interest: The authors declare they have no competing interests.

This work was previously presented as a poster at the American Academy of Pediatrics Virtual National Conference and Exhibition, October 8-11, 2021.
