Background

Learners benefit more from narrative feedback than numerical scores on formative assessments, yet they often report that feedback is lacking in quality and quantity. Changing the format of assessment forms is a practical intervention, but there is limited literature on its impact on feedback.

Objective

This study explores whether a formatting change to residents' oral presentation assessment forms (ie, relocating the comment section from the bottom of the form to the top) affects the quality of narrative feedback.

Methods

We used a feedback scoring system based on the theory of deliberate practice to evaluate the quality of written feedback provided to psychiatry residents on assessment forms from January to December 2017 before and after a form design change. Word count and presence of narrative comments were also assessed.

Results

Ninety-three assessment forms with the comment section at the bottom and 133 forms with the comment section at the top were evaluated. When the comment section was placed at the top of the evaluation form, significantly more comment sections contained any number of words rather than being left blank (χ2(1)=6.54, P=.011), and there was a significant increase in the specificity of the task component, or what was done well (χ2(3)=20.12, P<.0001).

Conclusions

More prominent placement of the feedback section on assessment forms increased the number of sections filled as well as the specificity related to the task component.

Feedback is an integral part of learning and has been shown to change trainees' behavior.1,2 Learners benefit more from narrative feedback than numerical ratings3,4 when it is timely, specific, and tied to a clear, actionable goal.3,5-7 However, narrative feedback is often lacking in both quality and quantity,1,3,8,9 and medical educators face many barriers to providing feedback.1,2,5,10

Despite evidence that medical learners benefit from narrative feedback, increasing the quality of this feedback remains a challenge. A potential intervention intended to emphasize the importance of narrative feedback, and possibly reduce the barrier of time, is moving the narrative comment section to the top of an assessment form. Previous research has tested this formatting change: one study found no change in the specificity of comments,11 and another found improved narrative feedback when the format change was accompanied by prompts.12 In this study, we explored whether a format change would result in more frequent and specific feedback. We evaluated the quality of the feedback according to 3 key elements (task, gap, and plan) of deliberate practice as described by Ericsson: description of the task as a well-defined learning goal (task), identification of a gap between observed performance and a superior standard (gap), and description of a learning or action plan (plan).10

Context and Setting

This single-site study took place in the psychiatry residency program at the University of Manitoba in Canada. Postgraduate year (PGY)-3 core residents and PGY-5/PGY-6 child and adolescent subspecialty residents present at academic case presentations during their rotations. These presentations typically occurred biweekly, and presenters were assessed by rounds attendees, including psychiatrists, trainees, and other health professionals, using a paper assessment form distributed at the time of the event.

Data Collection

From January to June 2017, we collected the original academic rounds assessment form (Form 1, provided as online supplementary data) that included 10 Likert scale questions and a narrative comments section located at the bottom of the form. Starting in July 2017, a new assessment form (Form 2, provided as online supplementary data) was distributed at rounds and was identical to the previous form except that the narrative comments section was located at the top of the page. The change in form was not announced or explained to assessors, nor did any of the processes surrounding the use of assessment forms change, such as their distribution, collection, and reception by residents. S.C. recorded the word count and the presence or absence of any comment from each assessment form.

Data Analysis

To determine the presence and extent of the 3 components that facilitate deliberate practice, all de-identified narrative comments were coded using a tool designed by Gauthier et al13 and further adapted by Abraham et al.14 C.R., a medical education researcher, and S.C., a psychiatry resident, independently evaluated all narrative comments using the adapted tool. The raters first familiarized themselves with the tool, scored a randomly selected subset (10%) of the assessment forms together, and discussed any discrepancies. The raters then independently evaluated the remaining forms, and interrater reliability was assessed using the kappa statistic. They subsequently met to discuss any discrepancies and agree on a rating.
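For readers unfamiliar with the statistic, the interrater reliability calculation above can be sketched in a few lines of pure Python; this is a minimal implementation of Cohen's kappa, and the ratings shown are hypothetical illustrations, not the study's data.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of categorical ratings."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of items both raters scored identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement: sum over categories of the product of marginals.
    pa, pb = Counter(rater_a), Counter(rater_b)
    expected = sum((pa[c] / n) * (pb[c] / n) for c in set(pa) | set(pb))
    return (observed - expected) / (1 - expected)

# Hypothetical task-component specificity scores (0-3) from two raters.
a = [0, 1, 2, 3, 2, 1, 0, 2, 3, 1]
b = [0, 1, 2, 3, 2, 1, 1, 2, 3, 1]
print(round(cohens_kappa(a, b), 3))  # prints 0.863
```

Kappa corrects raw percent agreement for the agreement expected by chance alone, which is why it is preferred over simple agreement when rating categories are unevenly used.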

Using SPSS Statistics 25 (IBM Corp), we calculated the differences before and after the format change in the quantity and specificity of comments with the chi-square test. Differences before and after the format change in word counts were calculated using an independent samples t test.
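The chi-square comparison can be sketched similarly; this is a minimal Pearson chi-square statistic for a 2x2 contingency table (form version x comment present/blank), and the cell counts below are hypothetical, not the study's data.

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table [[a, b], [c, d]]."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of rows and columns.
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

# Rows: form version (bottom-placed, top-placed); columns: comment present, blank.
table = [[70, 23],   # hypothetical counts for the bottom-placed comment section
         [115, 18]]  # hypothetical counts for the top-placed comment section
print(round(chi_square_2x2(table), 2))  # prints 4.62
```

The statistic is then compared against the chi-square distribution with 1 degree of freedom to obtain the P value (SPSS performs this step automatically).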

The study was approved by the Research Ethics Board at the University of Manitoba (Protocol #HS21443).

Assessors completed 93 assessment forms with the comments section on the bottom and 133 forms with the comments section on top. The kappa coefficients between the 2 coders' ratings of the task, gap, and action components were 0.937, 0.902, and 0.941, respectively, showing a high level of agreement.

Evaluation of Deliberate Practice Components in the Narrative Comments

Figure 1 summarizes the proportion of deliberate practice components in the assessment forms before and after the form change. There was a significant difference in the specificity of comments before and after the form change related to the task, or what was done well (χ2(3)=20.12, P<.0001). There were no significant differences with respect to the gap and action scores.

Figure 1

Proportion of Components of Deliberate Practice in Narrative Feedback


Evaluation of Mean Word Count and Presence of Comments

There was no significant difference in word count between the 2 assessment forms. The mean word count was 17.6 (SD=14.51) when the comments section was located at the bottom of the form and 17.27 (SD=17.07) when it was at the top. Significantly more comments were present (ie, any words within the comment section as opposed to it being left empty) when the comments section appeared on top (χ2(1)=6.54, P=.011). These results are illustrated in Figure 2.

Figure 2

Proportion of Evaluations With Completed vs Blank Narrative Comment Sections


This study explored whether moving the narrative comment section of an assessment form to a more prominent position improved feedback. Our analysis of the written comments suggests that placing the comment section at the top is associated with a modest increase in the number of comment sections completed rather than left blank, and an increase in the specificity of the task component, or what was done well. There was no significant increase in word count or in descriptions of the gap component (the difference between observed and desired performance) or the action component (how improvement might be accomplished).

Similar to the study by Dory et al, we found that the location of the comment section was associated with improvements in the quality of narrative feedback, suggesting that location alone contributes to the quality of narrative comments. Our findings echo previous studies: compared to the gap and action components of deliberate practice, the task component was more frequently identified.11,12 One reason may be that task is the easiest component to identify, requiring only a detailed account of the task performed.11 Some assessors may also convey the gap and action components implicitly through the numerical ratings, both because this is less time intensive and because it may be felt to protect the learner, as more specific written feedback could be experienced as critical if not worded thoughtfully.

Results of this study suggest that locating narrative feedback or comment boxes at the top of assessment tools has the potential to improve the feedback provided to learners. Simple formatting changes to assessment tools require very little time and money and may benefit learners, educators, and even administrators. Future research might explore the effect of adding simple prompts to top-placed narrative comment boxes. Efforts should also be made to decrease focus on less valuable aspects of assessments, such as reducing the number of Likert ratings.

There are limitations to this study. It is a site- and discipline-specific study, and we focused on only one form of feedback that residents receive; results may differ with other forms of feedback in different settings. Additionally, 2 of the study's co-authors participated in both phases of the assessments and were aware of the study beginning in phase 2; this may have changed how they completed assessment forms in the second phase of the study.

We found that placing the narrative comments section more prominently at the top of the form increased the feedback residents received by increasing the number of assessments with comment sections completed and increasing the specificity of the task component of the deliberate practice model.

The authors would like to thank Mark Bigder, MD.

1. Cantillon P, Sargeant J. Giving feedback in clinical settings. BMJ. 2008;337:a1961.
2. Ende J. Feedback in clinical medical education. JAMA. 1983;250(6):777-781.
3. Perera J, Lee N, Win K, Perera J, Wijesuriya L. Formative feedback to students: the mismatch between faculty perceptions and student expectations. Med Teach. 2008;30(4):395-399.
4. Rolfe I, McPherson J. Formative assessment: how am I doing? Lancet. 1995;345(8953):837-839.
5. Lewis KD, Patel A, Lopreiato JO. A focus on feedback: improving learner engagement and faculty delivery of feedback in hospital medicine. Pediatr Clin North Am. 2019;66(4):867-880.
6. Schartel SA. Giving feedback—an integral part of education. Best Pract Res Clin Anaesthesiol. 2012;26(1):77-87.
7. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(suppl 10):70-81.
8. Al-Mously N, Nabil NM, Al-Babtain SA, Fouad Abbas MA. Undergraduate medical students' perceptions on the quality of feedback received during clinical rotations. Med Teach. 2014;36(suppl 1):17-23.
9. Sender Liberman A, Liberman M, Steinert Y, McLeod P, Meterissian S. Surgery residents and attending surgeons have different perceptions of feedback. Med Teach. 2005;27(5):470-472.
10. Archer JC. State of the science in health professional education: effective feedback. Med Educ. 2010;44(1):101-108.
11. Pare JR, Kothari AH, Schneider JI, Jacquet GA. Does the location of a narrative comment section affect feedback on a lecture evaluation form? Int J Med Educ. 2017;8:133-134.
12. Dory V, Cummings BA, Mondou M, Young M. Nudging clinical supervisors to provide better in-training assessment reports. Perspect Med Educ. 2020;9(1):66-70.
13. Gauthier S, Cavalcanti R, Goguen J, Sibbald M. Deliberate practice as a framework for evaluating feedback in residency training. Med Teach. 2015;37(6):551-557.
14. Abraham RM, Singaram VS. Using deliberate practice framework to assess the quality of feedback in undergraduate clinical skills training. BMC Med Educ. 2019;19(1):105.

Author notes

Editor's Note: The online version of this article contains the evaluation forms used in the study.

Funding: Sara Courtis, MD, received a Health Sciences Centre Medical Staff Fellowship Fund Research Award in support of this project.

Competing Interests

Conflict of interest: The authors declare they have no competing interests.

This work was previously presented at the virtual Annual Canadian Academy of Child and Adolescent Psychiatry Conference, September 12-15, 2020; the virtual International Conference on Residency Education, September 2020; and the virtual International Association for Medical Education Annual Conference, September 2020.

Supplementary data