ABSTRACT

Background

Trainees are responsible for conducting advance care discussions but are often stressed by this role.

Objective

We developed an instrument to determine whether residents could identify a clinical scenario that necessitated a discussion of a patient's goals and preferences for care, and we measured their readiness to engage in such discussions.

Methods

Participants responded verbally to open-ended case presentations and completed survey items. We scored responses according to proximity to idealized answers.

Results

The sample consisted of 44 internal medicine residents, 12 students, 5 hospitalists, and 3 palliative care attendings, all of whom volunteered for the study and participated in standard interviews. Residents' scores on the open response items varied widely (range 0–12, maximum score of 15). For eliciting values, mean score increased with training: students, residents, and attending physicians had mean scores of 3.7, 5.7, and 8.7, respectively (P = .01). For recommending care, mean scores were 3.0, 6.5, and 9.3, respectively (P < .001). Scores correlated closely with increasing clinical experience and inversely with self-reported stress when conducting a goals-of-care discussion. The Kuder-Richardson Formula 20 reliability for the instrument was 0.52. Interrater reliabilities for the sections on eliciting values and recommending care were 0.64 (P < .001) and 0.50 (P < .001), respectively. The 1-week test-retest reliability was 0.91 for open response items and 0.76 for Likert responses.

Conclusions

A verbally administered instrument can readily and rapidly characterize a trainee's readiness to participate in advance care planning with patients.

What was known and gap

Residents are expected to discuss advance directives with patients, but many do not feel prepared for this.

What is new

An instrument assessed residents' ability to judge the need for a discussion of care goals and preferences as well as their readiness to engage in this discussion.

Limitations

Single specialty, single site study limits generalizability; the study did not assess performance in practice.

Bottom line

A simple, verbally administered instrument assessed trainee preparedness to discuss advance care planning with patients.

Editor's Note: The online version of this article contains the survey instrument used in the study.

Introduction

Residents are often responsible for eliciting a patient's understanding of his or her underlying illness and for discussing how the patient's goals, preferences, and values inform medical decision making. Despite increased trainee education on communication strategies and advance care planning discussions, residents are typically stressed by this role and inadequately prepared.1–4

While learning the skills necessary to effectively engage in advance care planning for patients facing a serious illness is challenging, training and clinical experience are both associated with increases in self-perceived competence in providing end-of-life care.5,6 

We were unable to identify an existing tool to evaluate residents' readiness to engage in discussions about advance care planning. Thus, we developed an instrument to determine whether residents could recognize that a clinical scenario (a patient with advanced and deteriorating chronic lung disease) necessitated a discussion about advance care plans, and whether they could effectively plan for such a discussion (instrument provided as online supplemental material). Respondents also completed a survey to assess self-reported competence and stress regarding discussions about advance care planning. We then analyzed the instrument's psychometric properties.

Methods

Design of the Instrument

We sought to establish the determinants of a clinician's readiness to initiate a discussion of advance care planning with a patient at the end of life. The scenario presented a patient with progressive interstitial lung disease accompanied by severe dyspnea. We used a stimulus case that was relevant and common, and that did not have typical elements (such as metastatic cancer) that trigger discussions about advance care planning. We made the patient description sufficiently recognizable to be relevant for trainees at varying levels, and piloted the scenario description with 7 residents to test our recording system and time the administration. The pilot subjects suggested edits to the scenario to ensure that the case description was clear, was relevant, and contained all of the necessary elements.

We designed the survey questions to assess initial reactions to the patient's symptoms and presentation, and then to elicit the respondent's experience and comfort in leading family meetings and willingness to discuss prognosis and resuscitation options. We gathered respondents' initial impressions of the case from 2 open-ended questions (about eliciting goals and responding to those goals). The remaining items measured such areas as self-reported confidence, stress, and previous experience with discussions about advance care plans.

A panel of palliative care experts assessed the items for clarity, content, and accuracy as they related to the communication content.

Administration of the Instrument

The case was designed to be administered verbally, which would minimize the burden on the subject and allow for flexible response capture in a variety of clinical settings (eg, inpatient wards, ambulatory clinic). After obtaining assent, the research assistant read the scenario to the subject and recorded the responses. The research assistant was allowed to reread the scenario on request, but was not allowed to answer any questions or offer any reinforcing gestures or words. An experienced transcriptionist transcribed the responses. We designed the 11-item instrument to be completed by interview within 5 minutes.

Subjects

Our primary research population was recruited from a convenience sample of internal medicine residents at Brigham and Women's Hospital in Boston, Massachusetts. The invitation to participate was sent to all 172 residents in this training program; 44 responded and were available to participate in person during the study. We also recruited 12 first-year medical students, 3 palliative care experts, and 5 hospitalists, all from Harvard Medical School, to complete the instrument for reliability testing.

Subjects who opted in were approached in person during business hours, and verbal consent was obtained prior to initiating the survey tool. Each participant was offered a $5 gift card to a local food establishment. No identifying information was obtained from survey participants, although we did ask residents their year of postgraduate training and primary care versus categorical designation. Responses from hospitalists and palliative care experts were grouped together.

Scoring

We developed a scoring rubric to determine how closely responses to the open-ended questions approximated an ideal standard. The ideal standard was developed in consultation with several palliative care specialists. Each specialist first provided an independent open-ended response to the scenario. The specialists' responses were then shared, and the idealized answer was iterated collaboratively until there was unanimous agreement on the components of an ideal response. Each element of the ideal response contributed to the score, and we graded each response based on the number of ideal-response elements it contained.
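For illustration only, the minimal Python sketch below shows one way this kind of element-based scoring can be operationalized; the element names and keyword cues are hypothetical stand-ins, not the study's actual rubric.

    # Minimal sketch of element-based scoring; the elements and keyword cues
    # below are hypothetical, not the study's actual rubric.
    IDEAL_ELEMENTS = {
        "acknowledge prognosis": {"prognosis", "progressive", "deteriorating"},
        "elicit patient goals": {"goals", "values", "what matters"},
        "plan family meeting": {"family meeting", "meet with the family"},
    }

    def score_response(transcript: str) -> int:
        """Award 1 point per ideal-response element whose cues appear in the transcript."""
        text = transcript.lower()
        return sum(
            any(cue in text for cue in cues)
            for cues in IDEAL_ELEMENTS.values()
        )

    # Example: matches the "goals" and "family meeting" elements -> score of 2
    print(score_response("I would ask about her goals and values and plan a family meeting."))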

Content Analysis

We performed a content analysis on the transcribed responses to the instrument. We deconstructed the responses to each question, and then grouped them by themes. We looked for themes that appeared to cluster with seniority or experience and sought to explore outliers and negative results.

Test-Retest and Interrater Reliability

We calculated the test-retest reliability of the scoring tool after administering the survey on 2 occasions over a 1-week period to 5 additional subjects in the same institution who had not taken the survey initially. Respondents received no feedback about their performance between administrations.

Interrater reliability was calculated by comparing the scores provided by each examiner on the same content. Spearman rank correlation (rho) was used to calculate the test-retest and interrater reliability.
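For readers who wish to reproduce these calculations, a minimal sketch using SciPy is shown below; the paired scores are illustrative values, not study data.

    # Minimal sketch of the reliability calculations with illustrative data.
    from scipy.stats import spearmanr

    # Interrater reliability: two examiners scoring the same responses.
    rater_1 = [3, 5, 7, 2, 9, 6]
    rater_2 = [4, 5, 6, 2, 8, 7]
    rho_interrater, p_interrater = spearmanr(rater_1, rater_2)

    # Test-retest reliability: the same subjects scored 1 week apart.
    week_0 = [4, 8, 6, 3, 10]
    week_1 = [5, 8, 6, 4, 9]
    rho_retest, p_retest = spearmanr(week_0, week_1)

    print(f"interrater rho={rho_interrater:.2f} (P={p_interrater:.3f}); "
          f"test-retest rho={rho_retest:.2f} (P={p_retest:.3f})")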

The Institutional Review Board at Partners HealthCare approved the study.

Analysis

Survey responses were entered into a Microsoft Excel 2010 database, and validated through double-entry. The text responses were scored independently by 2 examiners, and the responses were analyzed and counted according to the scoring rubric described earlier.

The Likert scale and scored responses were analyzed using χ2 statistics. Between-group comparisons were performed with the Kruskal-Wallis test. Associations were quantified using Spearman rho. Statistical significance was accepted at P < .05.
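A minimal sketch of the between-group comparison and internal consistency estimate, assuming scores are available as simple arrays, is shown below; the groups, item matrix, and values are illustrative only.

    # Minimal sketch of the Kruskal-Wallis comparison and KR-20 estimate
    # using illustrative data, not study results.
    import numpy as np
    from scipy.stats import kruskal

    # Between-group comparison of open-response scores.
    students = [2, 3, 4, 5]
    residents = [5, 6, 6, 7, 8]
    attendings = [8, 9, 10]
    h_stat, p_value = kruskal(students, residents, attendings)

    def kr20(items: np.ndarray) -> float:
        """Kuder-Richardson Formula 20 for dichotomous (0/1) item scores;
        rows are respondents, columns are items."""
        k = items.shape[1]
        p = items.mean(axis=0)                      # proportion scoring 1 on each item
        q = 1 - p
        var_total = items.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
        return (k / (k - 1)) * (1 - (p * q).sum() / var_total)

    item_matrix = np.array([[1, 0, 1, 1], [0, 0, 1, 0], [1, 1, 1, 1], [0, 1, 0, 1]])
    print(f"Kruskal-Wallis H={h_stat:.2f}, P={p_value:.3f}; KR-20={kr20(item_matrix):.2f}")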

Results

Forty-four internal medicine residents, 12 students, 5 hospitalists, and 3 palliative care attendings volunteered to participate and were interviewed.

Open-Ended Responses

The first scenario required subjects to recognize the need for an advance care discussion and to articulate how they would elicit the patient's goals. All of the hospitalists and palliative care physicians did so. Residents less frequently recognized the scenario as requiring a discussion about advance care planning, less frequently sought the patient's perspective, and rarely sought the family's perspective. Students performed poorly; instead of recognizing the need to identify and discuss goals of care, they were more likely to state that they needed more data, that they wanted to do or know something else, or that they needed to clarify components of the history.

The second scenario required the respondent to acknowledge and respond to the patient's articulation of their advance care preferences by offering further discussion and planning a family meeting. Hospitalists and palliative care physicians routinely included these plans in their responses (table 1). Residents varied a great deal in the number of elements of the ideal response that they included (range 0 to 12 of a maximum score of 15 on the scored open response items). Residents were the least likely of the 3 groups to report wanting to discuss the plan with the patient and family, and they were much less likely to refer to or acknowledge the concerns of the patient or family, or to plan a family meeting (table 1). Instead, residents were more likely to be goal-directed, addressing the patient's symptoms of anxiety and shortness of breath. Students were more likely than residents to want to discuss the problem further, address the concerns, and plan a family meeting, and students were also the most likely to acknowledge that they did not know what to do.

TABLE 1

Responses to the Scenario by Group

Impact of Experience

Overall scores on the first 2 questions ranged from 0 to 12 of a maximum possible score of 15. For the question designed to assess the ability to elicit preferences and goals, mean score increased with training: students, residents, and attending physicians had mean scores of 3.7, 5.7, and 8.7, respectively (P = .01). For recommending care, mean scores were 3.0, 6.5, and 9.3, respectively (P < .001). For the Likert scale questions, there was a linear relationship between increasing clinical experience and ratings of competence, stress, and confidence (table 2).

TABLE 2

Mean Respondent Rating of Personal Competence, Stress, and Confidence in Leading an Advance Planning Discussion

Impact of Stress

Trainees' self-reported levels of stress were inversely correlated with their confidence (Spearman rho = −0.459, P < .001), with their self-assessed competence (−0.644, P < .001), and with their competence as judged by their responses to open-ended questions (−0.390, P = .002).

Instrument Reliability

The interrater reliabilities for the questions about eliciting and responding to advance care plans were 0.638 (P < .001) and 0.501 (P < .001), respectively, for the entire sample. Using the Kuder-Richardson Formula 20, reliability was calculated as 0.52. The 1-week test-retest reliability for the 5 subjects who were asked to complete this assessment was 0.913 for open response items and 0.763 for Likert responses.

Discussion

We presented a stimulus case to trainees and faculty at various stages of professional development to determine whether residents could identify a clinical scenario that necessitated a discussion about advance care plans and could effectively plan such a discussion. The case incorporated the prognostic uncertainty characteristic of a chronic disease model.7 We were able to demonstrate that a verbally administered and easily scored instrument could reliably determine a resident's ability to appropriately anticipate the need for a discussion of advance directives, elicit a patient's advance care goals, and then respond to that patient accordingly.

Residents' communication abilities in these domains varied widely. Increased experience appeared to result in progressively lower stress and increased competence, but regardless of measured ability, high stress was associated with low self-confidence. A clinician's stress level is known to affect the quality of patient interactions; physicians who are more stressed are typically more verbally dominant and listen less.8,9 

The strengths of our approach include the use of a typical, relevant case that was accessible to, and appropriate for, resident physicians. Other strengths include the use of both open and fixed responses, the internal consistency of the data, the administration of the instrument to professionals of varying seniority to establish a scoring range, and the establishment of test-retest and interrater reliability data.

Our study was limited to a single institution and to internal medicine residents. We intentionally completed brief interviews. While responses to open-ended items have previously been shown to correlate with performance in practice, we did not observe the behavior of the study subjects.2 

This instrument lends itself to being administered at the start of, and periodically during, a residency to determine the readiness of residents to engage in advance care planning and to quantify the growth of those skills during a resident's development.

Communication and professionalism skills are increasingly critical components of a resident physician's preparation for practice, and though often inadequately taught and assessed, these skills are required by programs and the profession.10,11 Since performance improved with experience and seniority, our results suggest that experience and supervision are likely to be useful in increasing resident self-confidence and competence with this task. Other data suggest that improved competence in this domain is best achieved through modeling and facilitated practice.

Conclusion

Our brief, verbally administered instrument showed early positive findings in assessing resident ability to judge the need for an advance care planning discussion, and to engage in such a discussion. With further testing and development, the tool could characterize a resident's ability to recognize the need for advance care planning and anticipate the key components of such a plan.

References

1. Szmuilowicz E, el-Jawahri A, Chiappetta L, Kamdar M, Block S. Improving residents' end-of-life communication skills with a short retreat: a randomized controlled trial. J Palliat Med. 2010;13(4):439–452.
2. Sharma RK, Jain N, Peswani N, Szmuilowicz E, Wayne DB, Cameron KA. Unpacking resident-led code status discussions: results from a mixed methods study. J Gen Intern Med. 2014;29(5):750–757.
3. Tung EE, Wieland ML, Verdoorn BP, Mauck KF, Post JA, Thomas MR, et al. Improved resident physician confidence with advance care planning after an ambulatory clinic intervention. Am J Hosp Palliat Care. 2014;31(3):275–280.
4. Colbert CY, Mirkes C, Ogden PE, Herring ME, Cable C, Myers JD, et al. Enhancing competency in professionalism: targeting resident advance directive education. J Grad Med Educ. 2010;2(2):278–282.
5. Smith AK, Ries AP, Zhang B, Tulsky JA, Prigerson HG, Block SD. Resident approaches to advance care planning on the day of hospital admission. Arch Intern Med. 2006;166(15):1597–1602.
6. Lo B, Quill T, Tulsky J. Discussing palliative care with patients. ACP-ASIM End-of-Life Care Consensus Panel. American College of Physicians-American Society of Internal Medicine. Ann Intern Med. 1999;130(9):744–749.
7. Billings ME, Curtis JR, Engelberg RA. Medicine residents' self-perceived competence in end-of-life care. Acad Med. 2009;84(11):1533–1539.
8. Goodlin SJ, Quill TE, Arnold RM. Communication and decision-making about prognosis in heart failure care. J Card Fail. 2008;14(2):106–113.
9. Ratanawongsa N, Korthuis PT, Saha S, Roter D, Moore RD, Sharp VL, et al. Clinician stress and patient-clinician communication in HIV care. J Gen Intern Med. 2012;27(12):1635–1642.
10. Colbert CY, Mirkes C, Ogden PE, Herring ME, Cable C, Myers JD, et al. Enhancing competency in professionalism: targeting advance directive education. J Grad Med Educ. 2010;2(2):278–282.
11. Rider EA, Keefer CH. Communication skills competencies: definitions and a teaching toolbox. Med Educ. 2006;40(7):624–629.

Author notes

Funding: The authors report no external funding source for this study.

Competing Interests

Conflict of interest: The authors declare they have no competing interests.

Supplementary data