ABSTRACT
Interns must recognize urgent clinical situations and know when to seek assistance. However, assessing this skill is challenging.
We explored whether graduating medical students could determine urgency of medical cross-cover scenarios and what factors were associated with this ability.
Sixty senior medical students enrolled in an internal medicine residency preparation course, and 28 experts were invited to take an assessment using 4 clinical vignette handoffs, each with 5 to 6 cross-cover scenarios. Respondents were asked whether they would evaluate the patient at the bedside and notify their supervising resident. They were also asked to rate their comfort managing the scenario, rate the urgency (1=low, 2=moderate, 3=high), and take a medical knowledge quiz. Student performance was categorized by stratification of clinical urgency ratings: those who underestimated urgency (lowest quartile), accurately estimated urgency (middle two quartiles), and overestimated urgency (highest quartile). We examined differences between groups in medical knowledge, action, and confidence using analysis of variance and the post-hoc Tukey Honestly Significant Difference test.
Fifty-eight students (96.7%) and 22 experts (78.6%) participated. Students differed clearly in their ability to estimate urgency on the 3-point scale (lowest quartile: 2.15±0.11; middle quartiles: 2.38±0.07; highest quartile: 2.61±0.10). Students who underestimated urgency were less likely to notify their supervising resident (P=.001) and less likely to evaluate a patient at the bedside (P=.01). There was no difference in quiz score or comfort level.
Incoming interns vary in their abilities to recognize urgent scenarios, independent of medical knowledge and confidence.
Introduction
Interns must be able to accurately recognize urgent clinical situations and know when to notify a senior resident for assistance.1,2 These skills are especially important during times of cross-cover, defined as caring for hospitalized patients when the primary team is absent.3 Cross-coverage responsibilities have increased in the past decade, in part due to the Accreditation Council for Graduate Medical Education (ACGME) work hour restrictions and a resultant increase in night float shifts.4,5 Cross-cover is now pervasive, with one study estimating that a patient is cared for by a cross-covering physician for 61% of their hospital stay.6
During cross-cover, the intern often has reduced supervisory oversight and relies on information handed off from the primary team. To our knowledge, there are no reliable methods for assessing cross-cover skills. This is a critical gap because an intern's failure to recognize a scenario that requires urgent evaluation puts patient safety at risk. Further, the factors that influence cross-cover skills are incompletely understood, making it difficult to develop preventive or remediation strategies.
We designed a vignette-based study to assess if graduating medical students appropriately determined clinical urgency of cross-cover scenarios. Our secondary aim was to determine if medical knowledge or confidence was associated with this ability.
Methods
Setting and Participants
Participants included 60 senior medical students enrolled in an internal medicine residency preparation course in the spring of 2020, just prior to graduation. Students had applied to internal medicine categorical or preliminary positions and had completed core clinical clerkships as well as 15 to 16 months of post-clerkship rotations, including 2 subinternships and 3 to 4 clinical electives. Experienced physicians, consisting of 12 internal medicine senior residents and 16 hospitalist faculty, were also invited to take the assessment; they were selected for their high volume of cross-cover experience. All participants provided consent.
Intervention
Four clinical vignettes were presented as a handoff in the I-PASS format.7 Two vignettes were modified based on prior assessments created by the authors,8,9 while the other 2 were created for this study. Each vignette was accompanied by 5 to 6 cross-cover scenarios representing a distinct clinical question (online supplementary data).
To demonstrate content validity and response process validity,10 experienced physicians were asked whether the content of the sign-out was realistic and contained sufficient information on a 5-point Likert scale (1=strongly disagree, 2=disagree, 3=neither agree nor disagree, 4=agree, 5=strongly agree, N/A=I do not know). They were also asked about sign-out length (1=far too short, 2=too short, 3=similar in length, 4=too long, 5=far too long, N/A=I do not know).
Outcomes
For each scenario, respondents were asked about the action they would take, including whether they would evaluate the patient at the bedside. Students were asked whether they would notify their supervising resident (1=yes, within minutes; 2=yes, within hours; 3=no), and their level of agreement with the statement that they felt comfortable managing the scenario independently (1=strongly disagree, 2=disagree, 3=agree, 4=strongly agree). All respondents were asked to rate the urgency of each scenario (1=low, 2=moderate, 3=high), each carefully defined (Figure). Scenarios that ≥75% of experienced physicians agreed were of high clinical urgency were considered “emergent.” At the end of each scenario, respondents took a medical knowledge quiz created by the authors that was related to vignette content and presented in multiple-choice format, totaling 62 possible answer choices.
Analysis
Scenarios that experienced physicians agreed (>75% consensus) were of low urgency were excluded from the analysis; all other scenarios were included. Retrospectively, each student was placed in 1 of 3 groups based on ability to stratify clinical urgency: group 1 included the quartile of students who underestimated urgency, group 2 included students in the middle two quartiles, who most accurately estimated urgency, and group 3 included the quartile who overestimated urgency. We compared group means across the 3 urgency estimator categories for medical knowledge, action, and confidence using analysis of variance and the post-hoc Tukey Honestly Significant Difference test.
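For readers who wish to replicate this grouping and comparison, the procedure can be sketched in Python. This is an illustrative sketch only: the data below are simulated, not the study data, and the exact handling of quartile cut points is an assumption.

```python
import numpy as np
from scipy import stats

# Simulated data (NOT the study data): each student's mean urgency
# rating across the included scenarios (1=low, 2=moderate, 3=high),
# and an illustrative quiz score (% correct).
rng = np.random.default_rng(0)
mean_urgency = rng.normal(2.4, 0.2, size=58)
quiz_score = rng.normal(82, 5, size=58)

# Stratify students by their urgency ratings:
# group 1 = lowest quartile (underestimators),
# group 2 = middle two quartiles (accurate estimators),
# group 3 = highest quartile (overestimators).
# Cut-point handling (strict vs inclusive inequalities) is assumed.
q1, q3 = np.percentile(mean_urgency, [25, 75])
groups = np.where(mean_urgency < q1, 1, np.where(mean_urgency <= q3, 2, 3))

samples = [quiz_score[groups == g] for g in (1, 2, 3)]

# One-way analysis of variance across the 3 groups,
# followed by the post-hoc Tukey Honestly Significant Difference test.
f_stat, p_value = stats.f_oneway(*samples)
tukey = stats.tukey_hsd(*samples)
print(f"ANOVA: F={f_stat:.2f}, P={p_value:.3f}")
print(tukey)
```

The same pattern applies to each outcome compared across the 3 groups (medical knowledge, action, confidence), substituting the relevant score for `quiz_score`.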
This study was determined exempt by The University of Michigan Hospital's Institutional Review Board.
Results
Fifty-eight students (96.7%) and 22 experienced physicians (78.6%) participated. Experienced physicians included 11 faculty members, 8 senior residents, and 3 chief residents. Experienced physicians agreed that the content of the sign-out was realistic (4.18±0.80) and that its length was appropriate (2.87±0.52, with 3 indicating similar length to actual sign-outs). Three experienced physicians answered "N/A" regarding content and sign-out length. There was more variation in responses regarding whether sufficient information was provided (3.4±1.1).
Seventeen scenarios were included in the analysis, with an average urgency rating of 2.39±0.15 from experienced physicians. Students in the lowest quartile rated these scenarios as lower urgency and were less likely to notify their supervising resident in a timely manner and less likely to report evaluating a patient at the bedside, compared with students who overestimated urgency (Table).
There was no difference in medical knowledge score or comfort level managing the scenario independently among the 3 groups (Table). Experienced physicians performed better on the medical knowledge assessment than medical students (91.9±3.1% vs 82.3±5.3%, P<.001). Interestingly, the performance of experienced physicians was consistently high (ie, low within-group variation), whereas the student group differed significantly from the experienced physicians, providing further validity evidence.
Discussion
In this study we show that senior medical students preparing to transition to residency vary in their abilities to risk stratify potentially urgent cross-cover scenarios. Students who underestimated urgency reported they would be less likely to notify their supervising resident and to evaluate a patient at the bedside, which has implications for patient safety. Interestingly, a student's ability to recognize urgent clinical scenarios appears to be independent of medical knowledge.
The ability to accurately estimate urgency helps inform how residents should "triage" pages they receive in the hospital, matching both the timeliness of care and the resources deployed.11 Underestimating urgency can delay the diagnosis and treatment of potentially life-threatening conditions, especially during periods of limited supervisory oversight such as cross-cover care. Overestimating urgency may increase resource utilization, overuse of diagnostic testing, and perhaps unnecessary treatment.
It was surprising that quiz scores were not associated with the ability to determine urgency, given that the questions were related to the vignette content. Our findings raise questions about the extensive use of medical knowledge tests to define competency prior to graduation and imply that more targeted assessments may be necessary before entrusting interns with cross-cover responsibilities. There is currently a national call for meaningful assessment data at the transition to residency12; such data could provide important baseline information for the learner and allow for testing the efficacy of interventions.
Limitations include that this was a single center study with a small number of participants, which limits generalizability. Additionally, the computerized examination does not represent the clinical setting where a provider may call the nurse for more information, look up information online, or ask a colleague for assistance. We did not ask students about prior cross-cover experience, which could have affected performance.
Next steps include broadening the study to include more participants. It would also be valuable to compare performance with other available assessments, including board examinations, clinical grades, or residency milestone data.
Conclusions
In this single center study, we demonstrate variability in graduating medical students' ability to estimate clinical urgency of cross-cover pages and appropriately notify supervisory residents.
References
Author notes
Editor's Note: The online version of this article contains the assessments used in the study.
Funding: The authors report no external funding source for this study.
Competing Interests
Conflict of interest: The authors declare they have no competing interests.
Findings were presented as a virtual poster presentation at Academic Internal Medicine Week 2021 Online Virtual Conference, April 6-16, 2021.