Background

Interns often conduct procedural informed consent discussions (ICDs), identified as a core entrustable professional activity. Deficiencies in the training process for ICDs span across specialties.

Objective

We provide evidence for a curriculum and assessment designed to standardize the training process and ensure ICD competency in surgical interns.

Methods

In March 2019, PowerPoint educational materials were emailed to one academic institution's new surgical interns, who in June participated in an onsite 1-hour role-play “hot seat” group activity (GA) with an untrained simulated patient, and in October completed a verification of proficiency (VOP) assessment with a single trained simulated patient serving as a real-time rater. Curriculum evaluation was measured through intern pre-/post-confidence (5-point scale), and the VOP's Cronbach's alpha and test-retest reliability were examined. Data were analyzed with descriptive statistics, paired t tests, and 2-way random effects models.

Results

Of 44 new interns, 40 (91%) participated in the remote teaching and live GA and were assessed by the VOP. Pre-/post-GA confidence increased by a mean difference of 1.3 (SD = 0.63, P < .001). The VOP's Cronbach's alpha was 0.88 and test-retest reliability was 0.84 (95% CI 0.67–0.93, P < .001), with a 95% pass rate. The 2 interns who failed on the first attempt required remediation. Time commitment was at most 1 hour for individual training and implementation and 30 minutes for assessment. The use of volunteers and donated space mitigated additional costs.

Conclusions

Remote asynchronous and group skills teaching for new general surgical interns improved their confidence in conducting procedural ICDs. A patient-simulation verification process appeared feasible, with preliminary evidence of test-retest reliability and internal consistency.

Objectives

Develop a curriculum and assessment program with the intent to standardize the training process for informed consent discussions.

Findings

A training program that improves intern confidence and is able to identify marginal performers is feasible with minimal resources.

Limitations

This was a single institution study that did not have a comparative group.

Bottom Line

Multimodal educational interventions are feasible and possess validity evidence to support their use.

The process of obtaining informed consent—in which the patient (or surrogate decision-maker) is made aware of the nature of the procedure, expected benefits, potential adverse effects, alternatives, and consequences of not proceeding with the treatment in question—is critical to medical practice. During the process, case- and patient-specific factors are taken into account with a goal of maintaining patient autonomy.1  Performing informed consent discussions (ICDs) has also been described as a core entrustable professional activity (EPA) “that all medical students should be able to [do] upon entering residency, regardless of their future career specialty” by the Association of American Medical Colleges (AAMC).2  However, deficiencies in the training process for ICDs span across specialties.3 

Many graduating students report no formal training or clinical experience with performing an ICD.4  Published ICD training programs have used a combination of case studies, informal observations, videos, narrated lectures, quizzing, and role-play, with some associated with demonstration of skills through objective structured clinical examination (OSCE) style scenarios.4–10  Extensive research in health professions education supports the use of simulation and its benefit of “lab to life” transference.11  As a surrogate for patient interactions, standardized patients are often used, and learners voice preference for this education technique.12–14  However, significant time, personnel, and cost investment may limit curricula that use standardized patients.15 

We developed a curriculum and assessment process for new surgical interns to standardize the training process and ensure satisfactory communication and cultural competency for ICDs. We hypothesized that a blend of remote asynchronous and in-person skills sessions would be feasible, and we sought to develop preliminary evidence to support a patient verification process for ICD skills.

Setting and Participants

In April 2019, new interns who matched into specialties forming the general surgery intern pool at a large urban academic medical center were emailed educational materials to be completed prior to starting. The materials included ICD information. In June, interns voluntarily participated in an in-person 2-day skills bootcamp that included a group facilitated practice (GFP) session on ICDs. In October a verification of proficiency (VOP) assessment was conducted.

Intervention

The multimodal ICD curriculum consisted of 2 phases:

  1. Remote learning: Written materials on ICDs were developed by the authors after a review of the literature on best practices and discussion with experts in ethics. The slides were distributed to the interns by email as a PowerPoint presentation (provided as online supplementary data).

  2. GFP session: The authors created a 60-minute session with guided discussion of the remote PowerPoint materials, clinical observations and experiences, and several hypothetical ethical scenarios.10  This included a round-robin “hot seat” activity that used elements of rapid-cycle deliberate practice with role play in which an intern volunteer or cofacilitator played the role of the patient. Individual hot seat moments were limited to 1 to 2 minutes, which allowed for frequent debriefing, questions, and opportunities to “retry” challenging areas or use different communication techniques. Two sessions were conducted with 20 interns and 1 to 2 facilitators per group.

Outcomes

Demographic information was obtained from all participants, including sex, matched specialty, prior formal ICD training, and previous clinical experience conducting an ICD. Interns completed a 15-item, Likert-type survey (scale 1–5) with prior validity evidence, in the areas of content and response process, for use with students and residents in studying the need for a formal ICD curriculum in medical education.4  The survey assessed pre-GFP and post-GFP skill confidence and post-GFP perceived value of the materials and group session. The survey was not pilot tested in this study. After the VOP session in October, interns completed a single-item survey that asked whether the GFP ICD session in June was helpful to their current clinical practice (Yes/No).

Performance assessments were conducted as OSCEs. A simulated environment was selected for logistical purposes that allowed for efficient testing of all interns in a structured HIPAA-compliant environment and timely provision of targeted constructive feedback.

The interns were given patient- and procedure-specific cognitive aids in preparation for 2 potential scenarios: laparoscopic appendectomy (LA) for uncomplicated acute appendicitis or central venous catheter (CVC) insertion in an incapacitated patient needing central access for medication (provided as online supplementary data). These 2 scenarios were selected because they are the most common procedures that surgical interns encounter. On assessment day, the interns were randomly assigned to perform only one scenario to reduce intern and facilitator time requirements. Thirty minutes were given for the activity: 5 minutes to prepare, 20 minutes to perform, and 5 minutes to debrief with the simulated patient rater.

Trained, non–medical education team members acted as the patient and rater. Assessments occurred in 2 large conference rooms, with each divided in half. All encounters were audio-recorded (Voice Record 2019, BejBej Apps, Coquitlam, Canada). Remediation, consisting of review of self-audio performance and one-on-one role-playing activities with the education fellows, was required for performance scores less than 31 out of 50.

The VOP assessment was designed to reflect ICD best practices, including key elements and cultural competencies as recommended by the AAMC.2  The authors modified a cognitive aid that was developed previously for patient assessment of clinician communications to use as an assessment rubric for the VOP.16 Ten skills considered important to ICDs were assessed on a 5-point global rating scoring system (Poor, Fair, Good, Very Good, Excellent), with a maximum potential sum score of 50 points (provided as online supplementary data).2,10,16,17 Through faculty consensus, a cut score of 31 was chosen as a “pass” score.
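As a sketch of the scoring logic just described (the 10 individual item ratings below are hypothetical; only the 10-item structure, the 5-point global rating scale, the 50-point maximum, and the faculty-consensus cut score of 31 come from the rubric):

```python
# Sketch of the VOP scoring logic. The example ratings are hypothetical;
# the structure (10 items, each rated on a 5-point global rating scale,
# maximum sum 50, pass cut score 31) comes from the rubric described above.

RATING_SCALE = {"Poor": 1, "Fair": 2, "Good": 3, "Very Good": 4, "Excellent": 5}
PASS_CUT_SCORE = 31  # chosen by faculty consensus

def score_vop(item_ratings):
    """Sum the 10 global ratings and apply the pass/fail cut score."""
    if len(item_ratings) != 10:
        raise ValueError("The VOP rubric has exactly 10 items.")
    total = sum(RATING_SCALE[r] for r in item_ratings)
    return total, total >= PASS_CUT_SCORE

# Hypothetical intern performance:
ratings = ["Good", "Very Good", "Excellent", "Good", "Fair",
           "Very Good", "Good", "Excellent", "Very Good", "Good"]
total, passed = score_vop(ratings)
print(total, passed)  # 36 True
```

Interns scoring below the cut (for example, a sum of 30) would be flagged for the remediation process described below.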

Three trained raters with varied professional backgrounds (1 undergraduate student, 1 postgraduate research assistant, and 1 administrative associate from the simulation center) were recruited as volunteers to act as the patients. Training occurred in a group setting and included a discussion of relevant anatomy and procedural technique and role-play of scenarios; this required approximately 1 hour for all raters to feel comfortable with the material for each scenario. One rater (R.S.) participated in both LA and CVC scenarios and reevaluated the recorded performances 6 months later to examine test-retest reliability. To examine remote assessment, 3 additional blinded trained raters (A.K., E.G., T.A.) independently rated all audio-recorded performances of complete quality (24 total).

Analysis

Descriptive statistics, paired t tests, and Pearson's correlation, as appropriate for parametric and nonparametric data, were used for all quantitative data. Psychometric analyses of the VOP (internal consistency [Cronbach's alpha], item difficulty and discrimination, test-retest reliability, and intraclass correlation coefficient [ICC]) were performed. The ICC, test-retest estimates, and 95% CIs were calculated based on a mean rating (k = 3), absolute agreement, and a 2-way random effects model between the on-site and audio-only raters. Significance was set at P = .05. Analyses were performed using RStudio 1.2.1335 software (RStudio, Boston, MA).
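The study's analyses were run in RStudio. Purely as an illustration of the two reliability statistics used (this is not the authors' code, and the data below are hypothetical), a minimal Python sketch of Cronbach's alpha and the 2-way random effects, absolute-agreement, mean-of-k-raters ICC, i.e., ICC(2,k):

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Internal consistency; item_scores is a subjects x items matrix."""
    X = np.asarray(item_scores, dtype=float)
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = X.sum(axis=1).var(ddof=1)    # variance of the sum scores
    return k / (k - 1) * (1 - item_var / total_var)

def icc_2k(ratings):
    """ICC(2,k): 2-way random effects, absolute agreement, mean of k raters.
    ratings is a subjects x raters matrix."""
    Y = np.asarray(ratings, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    ms_rows = k * ((Y.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
    ms_cols = n * ((Y.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
    resid = Y - Y.mean(axis=1, keepdims=True) - Y.mean(axis=0, keepdims=True) + grand
    ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (ms_cols - ms_err) / n)

# Hypothetical data: 4 interns scored on 3 rubric items, and total scores
# from 3 independent raters (mirroring the k = 3 mean-rating design).
items = [[3, 4, 3], [5, 5, 4], [2, 3, 2], [4, 4, 5]]
raters = [[38, 40, 37], [45, 44, 46], [31, 30, 33], [42, 41, 40]]
print(round(cronbach_alpha(items), 2))
print(round(icc_2k(raters), 2))
```

With perfect rater agreement both statistics equal 1.0; values near the study's reported 0.88 alpha and 0.54–0.58 ICCs indicate good internal consistency but only moderate inter-rater agreement.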

This study was approved as an exempt study by the Stanford University Institutional Review Board.

Demographics and Confidence

Forty of 44 eligible interns (91% response rate) participated in the GFP session and completed post-session surveys. Matched pre-/post-session surveys, capturing demographic characteristics and confidence, were available for 34 (77%) eligible participants (Table 1). Pre-/post-session confidence increased significantly, from mean = 3 (SD = 1) to mean = 4 (SD = 1, P < .001) on the 1 to 5 Likert-type scale (figure provided as online supplementary data).

Table 1

Demographic Characteristicsa


All 40 post-GFP session surveys were reviewed for attitudes toward the utility of the deliberate practice session. The majority of interns agreed that the GFP session was useful (median = 5, IQR = 1), 36 (90%) interns “somewhat agreed” or “agreed” with the statement, and the remaining 4 (10%) were neutral. Thirty-one (78%) post-VOP (4 months after GFP) surveys were completed, with 29 (94%) responses attesting to the clinical usefulness of the GFP session.

Forty (91%) of the eligible interns participated in the VOP (23 [57%] CVC and 17 [43%] LA). Performance between the 2 scenarios differed significantly: mean = 39.6 (SD = 6.2, range 30–50) and mean = 46.9 (SD = 4.3; range 33–50; P < .001) for CVC and LA, respectively. Two (5%) of the tested interns did not pass and required 45 minutes of remediation, which included analytic review of performance with one of the surgical education fellows. “Check for patient understanding through ‘teach back'” (mean = 3.9, SD = 1.4) was marked lowest for both scenarios; “Prepared in advance about the patient's medical record including pertinent labs, imaging, cultural background, personal, and social history” (mean = 4.8, SD = 0.5) was marked highest for both.

Validity and Internal Structure of VOP Assessment

Twenty-four (60%) audio recordings were reviewed by the 3 audio-only raters. ICC for audio-only raters was 0.58 (95% CI 0.36–0.75; F(23,46) = 6.8; P < .001). ICC for audio raters and on-site raters was 0.54 (95% CI 0.28–0.73; F(23,69) = 45.7; P < .001). The test-retest coefficient was 0.84 (95% CI 0.67–0.93). Mean scores from audio-only raters were 34.8 (SD = 5.2), significantly lower than on-site scores. The correlation between each score for the 2 groups was 0.78 (95% CI 0.60–0.88, P < .001). See online supplementary data for a breakdown of scores for all recordings. See Table 2 for all item discrimination and Table 3 for internal consistency (Cronbach's α) for on-site and audio-only assessment.

Table 2

Verification of Proficiency Rubric

Table 3

Internal Consistency (Cronbach's α) of 2 Scenarios Using 10 Items on the Verification of Proficiency Rubric


Overall Feasibility

Individual, learner-driven review of the content was estimated to take approximately 30 minutes to 1 hour. The GFP required no additional props. Training to effectively facilitate a session required 30 minutes, and each session required less than 10 minutes of preparation. Approximately 1 hour per scenario was used to train the simulated patient. As the raters were members of the core education team, no additional cost was incurred for the training process. The 2 large conference rooms were within our dedicated simulation space and did not incur additional costs. Ten hours (including time to analyze results), split over 2 days, were required to test 40 interns. The recording application was a free program.

A debrief was held and all educators involved with the curriculum expressed that this program was manageable and did not place undue burden on the team given the number of interns who underwent training and assessment.

This study identified deficits in entering surgical interns' experience with, and self-confidence in, properly conducting an ICD. With training, the interns' confidence increased, and they found value in the overall process. By using remote learning, large group sessions, and simulated patients, a relatively small team was able to successfully implement this rigorously developed curriculum and assessment without undue burden.

This study of a low-cost combined distance and on-site ICD skills training approach, with a novel verification process for early surgical trainees, builds on prior work using simulation or role-play in ICD training. Some studies focus more on communication skills such as “compassionate behavior.”8,18–22  Similar to our results, most simulation programs find that training with immediate feedback may sensitize the learner to desired behaviors and also identify areas for curriculum improvement within residency programs.8,19,20–22 

Assessments using standardized patients have reportedly been used for residents; however, this verification of proficiency appears to be the first successfully conducted with simulated patients and, as reported, was a more cost-effective alternative.4,7  While the concept of this approach is not new, it provides a feasible model with content, response process, internal structure, and consequential validity evidence.

As a single institution pilot study, generalizability may be limited. There was also a 4-month interval between the training session and verification of proficiency; without a comparison group, satisfactory skills performance may therefore reflect learning during internship rather than the educational intervention, and the exact impact of the intervention relative to clinical skill acquisition is unclear. At this time, we have focused on multiple intern surgical specialties; however, many of the skills emphasized within this program are likely usable by health care providers of all areas and experience levels.

Future endeavors may examine performance with interns receiving remote PowerPoint presentations alone, compared to those who only participated in group role-play, compared to those who received no specific training. It would also be beneficial for interns to undergo assessment of multiple procedures for a more holistic evaluation of proficiency.

A combination of remote asynchronous and group skills teaching for new general surgery interns improved their confidence in conducting procedural ICDs, with little resource and time requirement. A patient-simulation verification process appeared feasible, with preliminary evidence of test-retest reliability and internal consistency.

The authors would like to thank Jenirose Santos, Marisol Rueda, Kristen Kayser, and those from the Goodman Surgical Education and Surgical Education Research Group who volunteered their time to these educational sessions.

1. Hanson M, Pitt D. Informed consent for surgery: risk discussion and documentation. Can J Surg. 2017;60(1):69-70.

2. Association of American Medical Colleges. Core Entrustable Professional Activities for Entering Residency—EPA 11 Toolkit: Obtain Informed Consent for Tests and/or Procedures. 2021.

3. Association of American Medical Colleges. Core EPA Publications and Presentations. 2021.

4. McClean KL, Card SE. Informed consent skills in internal medicine residency: how are residents taught, and what do they learn? Acad Med. 2004;79(2):128-133.

5. Waisel DB, Ruben MA, Blanch-Hartigan D, et al. Compassionate and clinical behavior of residents in a simulated informed consent encounter. Anesthesiology. 2020;132(1):159-169.

6. Anderson TN, Aalami LR, Lee EW, et al. Perception and confidence of medical students in informed consent: a core EPA. Surgery. 2020;167(4):712-716.

7. Koller SE, Moore RF, Goldberg MB, et al. An informed consent program enhances surgery resident education. J Surg Educ. 2017;74(5):906-913.

8. Nickels AS, Tilburt JC, Ross LF. Pediatric resident preparedness and educational experiences with informed consent. Acad Pediatr. 2016;16(3):298-304.

9. Lee SC, Nguyen V, Nguyen A, et al. Teaching anesthesiology residents how to obtain informed consent. J Educ Perioper Med. 2019;21(4):e632.

10. Thompson BM, Sparks RA, Seavey J, et al. Informed consent training improves surgery resident performance in simulated encounters with standardized patients. Am J Surg. 2015;210(3):578-584.

11. Antoniou A, Marmai K, Qasem F, et al. Educating anesthesia residents to obtain and document informed consent for epidural labor analgesia: does simulation play a role? Int J Obstet Anesth. 2018;34:79-84.

12. American College of Surgeons. ACS/ASE Medical Student Core Curriculum: Informed Consent Primer. 2021.

13. Yudkowsky R, Park YS, Downing SM, McGaghie WC, Issenberg SB. Simulations in assessment. In: Assessment in Health Professions Education. New York, NY: Routledge; 2020:245-268.

14. Yudkowsky R, Park YS, Downing SM. Performance tests. In: Assessment in Health Professions Education. New York, NY: Routledge; 2020:217-243.

15. Solymos O, O'Kelly P, Walshe CM. Pilot study comparing simulation-based and didactic lecture-based critical care teaching for final-year medical students. BMC Anesthesiol. 2015;15:153.

16. Morris NA, Czeisler BM, Sarwal A. Simulation in neurocritical care: past, present, and future. Neurocrit Care. 2019;30(3):522-533.

17. Leclercq WKG, Keulers BJ, Scheltinga MRM, Spauwen PHM, van der Wilt GJ. A review of surgical informed consent: past, present, and future. A quest to help patients make better decisions. World J Surg. 2010;34(7):1406-1415.

18. Ripley BA, Tiffany D, Lehmann LS, et al. Improving the informed consent conversation: a standardized checklist that is patient centered, quality driven, and legally sound. J Vasc Interv Radiol. 2015;26(11):1639-1646.

19. Appelbaum PS. Clinical practice. Assessment of patients' competence to consent to treatment. N Engl J Med. 2007;357(18):1834-1840.

20. Waisel DB, Ruben MA, Blanch-Hartigan D, et al. Compassionate and clinical behavior of residents in a simulated informed consent encounter. Anesthesiology. 2020;132(1):159-169.

21. Blanch-Hartigan D. An effective training to increase accurate recognition of patient emotion cues. Patient Educ Couns. 2012;89(2):274-280.

22. Blanch-Hartigan D, Ruben MA. Training clinicians to accurately perceive their patients: current state and future directions. Patient Educ Couns. 2013;92(3):328-336.

Author notes

Editor's Note: The online version of this article contains the informed consent discussion educational materials, procedure-specific cognitive aids, a visual of the frequency distribution of reported confidence levels, and a breakdown of scores for all recordings.

Funding: The authors report no external funding source for this study.

Competing Interests

Conflict of interest: The authors declare they have no competing interests.

Supplementary data