Background Providing effective feedback is an essential communication skill that medical trainees must learn, yet few models exist for training and assessing these skills.

Objective To develop an objective structured feedback examination (OSFE) to provide feedback training to pediatric fellows and assess changes in skills and self-reported confidence.

Methods This educational study was conducted from 2019 to 2020 at an academic children’s hospital. Our team developed the OSFE and trained standardized feedback recipients and faculty. Fellows completed baseline self-assessments (31 items) on prior exposure to feedback training, application of skills, and confidence. They then participated in the OSFE, giving feedback to a standardized recipient in a standardized scenario, and their performance was scored by faculty and recipients using a 15-item checklist. Next, fellows participated in feedback training and received individualized feedback, after which they repeated the OSFE and confidence self-assessment. Three months later, fellows completed self-assessments on confidence and application of skills and another OSFE to assess retention. Descriptive statistics and the signed rank sum test were used for analysis.

Results Of 60 eligible fellows, 19 participated (32%), with 100% follow-up. After training and individualized feedback, all fellows improved their feedback skills as measured by OSFE performance on items scored on a 5-point Likert scale (mean change +0.89), and improvements were sustained 3 months later (mean change +0.92). All fellows reported improved confidence in feedback knowledge (mean change +2.07 immediately after training, +1.67 at 3 months).

Conclusions Feedback training using simulation and individualized feedback moderately improved fellows’ performance, confidence, and 3-month retention of feedback skills.

The ability to provide feedback is an important communication skill for medical trainees who are educators for students and residents.1,2  Without feedback, residents may continue incorrect practices, and good behaviors may not be reinforced.3  Most clinicians do not receive training in delivering feedback.1,4,5  While many publications describe various feedback models,2  few have evaluated feedback training during fellowship with objective measures.6-11  Our objective was to address this gap by providing training for fellows to improve their skills and confidence in providing feedback using simulated feedback situations2,12,13  through an objective structured feedback examination (OSFE).

This study was conducted from 2019 to 2020 at a children’s hospital that trains medical students, residents, and pediatric subspecialty fellows. We solicited participation from 20 medical and surgical subspecialties (n=73 fellows). Fellowship program directors completed a needs assessment survey describing what type of feedback training, if any, their fellows received, and whether they would allow fellows to participate. Nineteen of 23 directors (83%) responded; the majority (18 of 19, 95%) expressed interest in feedback training. The survey confirmed a lack of feedback curricula in most fellowships; fellowships that already provided feedback training were excluded. A recruitment email was sent to the 60 eligible fellows. Participants provided verbal consent and were informed that their voluntary participation would not affect their fellowship assessments.

We followed Kern’s 6 steps for curriculum development (online supplementary data Figure 1).14  We created goals and objectives for our training and identified educational strategies (online supplementary data Figure 2) and measured outcomes with an OSFE, similar in format to an objective structured clinical examination (OSCE).15  We created the training session, scenario, and assessments based on literature review.1,2,4,16-18  The training included feedback goals, models, and strategies for giving and receiving feedback. The OSFE scenario involved a struggling intern (online supplementary data Figure 3), played by a standardized recipient; no script was provided to allow for variety. To facilitate consensus, standardized recipients and faculty were trained to score performances and practiced using the checklist with 3 test subjects.

Fellows scheduled a 1-hour session at the simulation center and completed pretraining self-assessments on prior exposure to feedback training; application (ie, how often they give feedback); and self-perceived knowledge of and confidence in ability to give feedback (online supplementary data Figures 4, 5, 6).

Fellows then participated in an OSFE using the common scenario and provided feedback to the standardized recipient. To support objective assessment, faculty observed from a separate room using live-stream video. One standardized feedback recipient and one faculty investigator assessed each fellow’s performance using a checklist (online supplementary data Figure 7).10  Fellows then participated in the 20-minute interactive feedback training session, after which the observer and recipient gave them feedback on their performance. Fellows then completed a second OSFE using the same scenario. To decrease the “practice effect,” in which repeated evaluation alone produces improvement, the scenario included multiple problems for the standardized recipient to exhibit. Before leaving the session, fellows completed another confidence self-assessment and a training evaluation.

Due to the COVID-19 pandemic, the 3-month post-assessment was adapted by offering 30-minute videoconferencing sessions. Fellows completed another confidence assessment and application assessment, then participated in a third OSFE using the same scenario. The faculty observer was hidden and muted. Performances were scored as before, and fellows were given feedback.

Our primary outcome measure was change in OSFE performance. Our secondary outcome measures were changes in self-reported confidence and application of feedback skills, which were measured on a 5-point Likert scale. To rate OSFE performance, we modified a published checklist, the FEEDME-Provider instrument, by removing questions that did not apply in simulation.10  OSFE performance was scored along several domains: self-assessment, providing corrective feedback, and facilitating bidirectional conversation (online supplementary data Figure 7). Fellows provided satisfaction ratings and suggestions for improvement at the end of the session.

Changes in confidence and OSFE performance were assessed using the signed rank sum test. Change in application of feedback skills was assessed using a paired t test. All analyses were conducted using SAS v9.2. A P value of <.05 was considered significant.
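For illustration, the same style of paired pre/post analysis can be sketched in Python with SciPy. This is a minimal sketch only; the study analyses were run in SAS v9.2, and the scores below are hypothetical placeholders, not study data:

```python
# Minimal sketch of the paired analyses described above (illustrative only;
# the study used SAS v9.2, and these scores are hypothetical placeholders).
import numpy as np
from scipy import stats

# Hypothetical paired scores on a 5-point Likert scale, one value per fellow.
pre = np.array([3.1, 2.8, 3.4, 3.0, 2.9, 3.3, 2.7, 3.2])
post = np.array([4.0, 3.9, 4.2, 3.8, 3.9, 4.1, 3.6, 4.3])

# Wilcoxon signed rank test for paired, ordinal outcomes
# (confidence and OSFE performance in the study).
w_stat, w_p = stats.wilcoxon(post, pre)
print(f"Signed rank test: W={w_stat:.1f}, P={w_p:.4f}")

# Paired t test for change in application of feedback skills.
t_stat, t_p = stats.ttest_rel(post, pre)
print(f"Paired t test: t={t_stat:.2f}, P={t_p:.4f}")

# Mean pre-to-post change, the effect summary reported in the Results.
print(f"Mean change: {np.mean(post - pre):+.2f}")
```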

The protocol was exempted by the University of California San Diego Institutional Review Board.

We enrolled 19 of the 60 invited pediatric fellows (32%); 100% participated in all assessments. After feedback training and individualized feedback, fellows improved on all OSFE performance items, aside from 2 items already rated high at baseline, with a mean change of +0.89 (P<.001; Table 1). All fellows showed improvement in self-perceived confidence, with a mean change of +2.07 (P<.001; Table 2). Three months later, improvements in skills (mean change +0.92, P<.001) and confidence (mean change +1.67, P<.001) were sustained.

Table 1

Observer and Recipient Evaluation of OSFE Pre-, Post-, and 3-Month Post-Feedback Training Session

Table 2

Changes in Fellow Self-Assessment of Confidence in Feedback Knowledge and Skills Pre-, Post-, and 3-Month Post-Feedback Training Session

All fellows reported prior exposure to the feedback sandwich model. Most (63%, 12 of 19) reported prior exposure to the ask-tell-ask feedback model; fewer than half (42%, 8 of 19) had prior exposure to just-in-time feedback. Most fellows reported prior exposure to skills such as how to deliver formal sit-down feedback (74%, 14 of 19), deliver feedback to a problem learner (58%, 11 of 19), receive feedback (84%, 16 of 19), and self-assess (79%, 15 of 19). Only 32% (6 of 19) reported exposure to how to direct a learner to self-assess. At baseline, only 11% (2 of 19) reported giving sit-down feedback at least once every 2 to 4 weeks, and only 21% (4 of 19) reported giving just-in-time feedback. Three months after training, 21% (4 of 19) reported giving sit-down feedback at least once every 2 to 4 weeks (P=.06), and 63% (12 of 19) reported giving just-in-time feedback (P=.01).

Fellows (n=18) evaluated the initial training session positively, rating the speakers 5 of 5 on a 5-point Likert scale for organization, engagement, and effective content delivery. All reported that the session met its objectives and was relevant, logical, and clear (5 of 5); mean ratings were 4.9 of 5 for ease of comprehending the audio/visual aids, usefulness of the session, and likelihood of recommending it to colleagues. Fellows noted that they learned the ask-tell-ask feedback model, the importance of learner self-assessment, bidirectional feedback, and scheduling time for feedback. Suggestions included using less text in the training presentation.

Our feedback training program, using an OSFE and a modified FEEDME-Provider tool10  for assessment, resulted in improved feedback skills among fellows immediately after the intervention and 3 months later. Fellows also reported improved self-perceived confidence in feedback knowledge and skills.

Constructive feedback is vital to help trainees improve but challenging to deliver. Use of simulated standardized cases and feedback in a low-stakes environment may allow fellows to feel more comfortable applying these skills in the clinical setting; thus, simulation has been increasingly used in medical education.2,12,13  Combining interactive communication scenarios with didactic teaching may be beneficial to allow for multimodal learning.19,20 

As the feedback training program was well received and showed sustained improvement in feedback performance, self-reported confidence, and application of skills, these methods and assessments could be expanded to residents or faculty and used to address other topics in medical education.

Several features limit generalizability. This study was conducted at a single institution with a low enrollment rate, possibly related to stresses of the COVID-19 pandemic. Participants may have been particularly motivated to learn feedback skills, introducing self-selection bias. Shifting to videoconferencing may have added variability to our results but was easier to schedule and familiar to participants.21-23  The OSFE scenario was the same throughout, so practice effects might have biased results, although the interactions were unscripted. Finally, this intervention was time- and personnel-intensive; however, we have successfully adapted it to larger groups using small-group peer feedback for teaching and assessment. Further studies should use different scenarios and assess longer-term retention and the need for booster sessions.

Giving feedback is a fundamental skill for clinicians, but formal feedback training is rare. We showed that feedback training using an interactive workshop, standardized recipients, and individualized feedback improved feedback performance, confidence in feedback skills and knowledge, and application of skills in pediatric fellows.

The authors would like to thank the pediatric subspecialty fellows for participating in the educational intervention as well as the pediatric subspecialty fellowship directors for completing needs assessments regarding feedback training within their curricula and allowing their fellows to participate. Thank you also to the Association of Pediatric Program Directors Leadership in Educational Academic Development mentors and Cohort 8 for providing support and mentorship.

1. Ende J. Feedback in clinical medical education. JAMA. 1983;250(6):777-781.
2. Lewis KD, Patel A, Lopreiato JO. A focus on feedback: improving learner engagement and faculty delivery of feedback in hospital medicine. Pediatr Clin North Am. 2019;66(4):867-880.
3. Bing-You RG, Trowbridge RL. Why medical educators may be failing at feedback. JAMA. 2009;302(12):1330-1331.
4. Cantillon P, Sargeant J. Giving feedback in clinical settings. BMJ. 2008;337:a1961.
5. Bahar-Ozvaris S, Aslan D, Sahin-Hodoglugil N, Sayek I. A faculty development program evaluation: from needs assessment to long-term effects, of the teaching skills improvement program. Teach Learn Med. 2004;16(4):368-375.
6. Sargeant J, Lockyer J, Mann K, et al. Facilitated reflective performance feedback: developing an evidence- and theory-based model that builds relationship, explores reactions and content, and coaches for performance change (R2C2). Acad Med. 2015;90(12):1698-1706.
7. Sargeant J, Mann K, Manos S, et al. R2C2 in action: testing an evidence-based model to facilitate feedback and coaching in residency. J Grad Med Educ. 2017;9(2):165-170.
8. Milan FB, Parish SJ, Reichgott MJ. A model for educational feedback based on clinical communication skills strategies: beyond the “feedback sandwich.” Teach Learn Med. 2006;18(1):42-47.
9. Tekian A, Watling CJ, Roberts TE, Steinert Y, Norcini J. Qualitative and quantitative feedback in the context of competency-based education. Med Teach. 2017;39(12):1245-1249.
10. Bing-You R, Ramesh S, Hayes V, Varaklis K, Ward D, Blanco M. Trainees’ perceptions of feedback: validity evidence for two FEEDME (feedback in medical education) instruments. Teach Learn Med. 2018;30(2):162-172.
11. Stone S, Mazor K, Devaney-O’Neil S, et al. Development and implementation of an objective structured teaching exercise (OSTE) to evaluate improvement in feedback skills following a faculty development workshop. Teach Learn Med. 2003;15(1):7-13.
12. Hepps JH, Clifton EY, Calaman S. Simulation in medical education for the hospitalist: moving beyond the mock code. Pediatr Clin North Am. 2019;66(4):855-866.
13. Motola I, Devine LA, Chung HS, Sullivan JE, Issenberg SB. Simulation in healthcare education: a best evidence practical guide. AMEE guide no. 82. Med Teach. 2013;35(10):e1511-e1530.
14. Kern DE, Thomas PA, Hughes MT. Curriculum Development for Medical Education: A Six-Step Approach. Johns Hopkins University Press; 2009.
15. Harden RM, Lilley P, Patricio M. The Definitive Guide to the OSCE: The Objective Structured Clinical Examination as a Performance Assessment. Elsevier Health Sciences; 2015.
16. French JC, Colbert CY, Pien LC, Dannefer EF, Taylor CA. Targeted feedback in the Milestones era: utilization of the ask-tell-ask feedback model to promote reflection and self-assessment. J Surg Educ. 2015;72(6):e274-e279.
17. Ramani S, Krackov SK. Twelve tips for giving feedback effectively in the clinical environment. Med Teach. 2012;34(10):787-791.
18. van de Ridder JM, Stokking KM, McGaghie WC, ten Cate O. What is feedback in clinical education? Med Educ. 2008;42(2):189-197.
19. Turner DA, Narayan AP, Whicker SA, Bookman J, McGann KA. Do pediatric residents prefer interactive learning? Educational challenges in the duty hours era. Med Teach. 2011;33(6):494-496.
20. Mahan JD, Stein DS. Teaching adults—best practices that leverage the emerging understanding of the neurobiology of learning. Curr Probl Pediatr Adolesc Health Care. 2014;44(6):141-149.
21. Almarzooq ZI, Lopes M, Kochar A. Virtual learning during the COVID-19 pandemic: a disruptive technology in graduate medical education. J Am Coll Cardiol. 2020;75(20):2635-2638.
22. Dedeilia A, Sotiropoulos MG, Hanrahan JG, Janga D, Dedeilias P, Sideris M. Medical and surgical education challenges and innovations in the COVID-19 era: a systematic review. In Vivo. 2020;34(3 suppl):1603-1611.
23. Sahi PK, Mishra D, Singh T. Medical education amid the COVID-19 pandemic. Indian Pediatr. 2020;57(7):652-657.

The online supplementary data contains resources used in the study.

Funding: The authors report no external funding source for this study.

Conflict of interest: The authors declare they have no competing interests.

This work has been previously presented virtually at the Academic Pediatric Association Region 9 and 10 Virtual Meeting, January 30, 2021; Association of Pediatric Program Directors Annual Spring Meeting, March 23-26, 2021; Pediatric Academic Societies Meeting, April 30-May 4, 2021; Pediatric Hospital Medicine Conference, August 4-6, 2021; and UC San Diego Inaugural Symposium for Innovation in Medical Education, September 10, 2021.

Supplementary data