Abstract
One barrier to systematically assessing feedback about the content or format of teaching conferences in graduate medical education is the time needed to collect and analyze feedback data. Minute papers, brief surveys designed to obtain feedback in a concise format, have the potential to fill this gap.
Our objectives were to assess whether minute papers were a feasible tool for obtaining immediate feedback on resident conferences and to use minute papers, with one added question, to assess the usefulness of changing the format of resident morning report.
Minute papers were administered at the end of internal medicine morning report conferences before and after changing the traditional combined format (all residents) to a separate format (postgraduate year [PGY] 1 trainees met separately from PGY-2 and PGY-3 trainees). Over 3 months, we collected information during 2 traditional sessions and 8 sessions in the separated format (3 for PGY-1 and 5 for PGY-2 and PGY-3). Participants responded to an item rating the usefulness of the session and 3 open-ended questions.
Trainees completed the forms in 2 to 3 minutes. Trainee assessment of the usefulness of internal medicine morning report appeared to increase after the change (4.09 versus 4.45 for PGY-1; 3.75 versus 4.38 for PGY-2 and PGY-3 residents).
Minute papers are practical instruments that provide manageable amounts of immediate feedback. In addition, minute papers can be adjusted slightly to help assess the impact of change. In that way, faculty can create an iterative process of feedback that models small cycles of change, a key quality improvement concept.
Background
Conferences and didactic sessions in residency training can lack formal learner-based opportunities for feedback and refinement. In addition, residency conferences are often changed to meet newly identified internal and external demands or requirements without a way to gauge the success of the change. Feedback and measurement of change in these situations may be time consuming and taxing to an already overburdened faculty.
Educators need a flexible, efficient, multipurpose instrument. Minute papers are brief surveys that obtain feedback from learners in a concise format, and they have been used in college education to improve teaching.1 During the last 2 to 3 minutes of class, learners provide written, anonymous responses to some variation of the questions: “What was the most important thing you learned during the class?” and “What important question remains unanswered?”
Using minute papers to gather feedback has not been described in the graduate medical education literature. Expanded with 1 or 2 additional items targeted at assessing the success of a change, minute papers could not only provide immediate feedback to teachers about the current session but also help gauge the success of initiatives to improve the overall curriculum. We are not aware of published reports of their use in this way.
The purpose of this study was to determine if minute papers were feasible and useful in getting immediate feedback on educational sessions and to combine traditional minute papers with a Likert scale question to measure whether a change in a morning report conference format was successful.
Method
Setting and Participants
The setting was an internal medicine residency program at an urban medical center affiliated with a major medical school. Residents participated in morning report 3 times a week at the time of this study. The participants included the chairperson, residents, medical students, chief residents, and faculty. Traditionally, all levels of residents attended this conference together.
Intervention
In response to upper level (postgraduate year [PGY] 2 and PGY-3) residents' concerns that the existing format was directed at the medical students and first-year residents, we separated morning report into 2 conferences: 1 for medical students and PGY-1 residents and 1 for PGY-2 and PGY-3 residents. The Institutional Review Board reviewed the study proposal and determined that it was exempt due to its educational nature.
Residents completed minute papers during the last few minutes of conference preintervention and postintervention (figure). The sign-in sheet was restructured to record the number of residents who attended each conference for the 2 groups. We collected minute paper responses during a period of 3 months (10 conferences): 2 sessions preintervention (traditional “combined” morning report) and 8 postintervention (split morning report format) with 3 sessions for PGY-1s and 5 for PGY-2s and PGY-3s.
Measures
Trainees responded to 4 questions (figure). We added the first question (rating the usefulness of the conference via a Likert scale) to the traditional minute papers to obtain a participant rating of the conference before and after reorganization. Ratings for question 1 were tabulated and averaged for each conference.
Analysis
We compared the usefulness ratings in the minute papers preintervention and postintervention descriptively. Given the small number of learners per session, P values were not calculated.
Results
Feasibility
The postintervention average response rate was 81.4% (based on 5 sessions) for PGY-2 and PGY-3 (table). The response rate for PGY-1 was 100% where total attendance was documented (2 of the 3 sessions). An accurate average for all 3 sessions is not possible because the sign-in sheet for the third session was not available. The most conservative approach to filling in the missing data is to use the largest observed attendance (n = 11) to calculate a response rate for the third PGY-1 session (7/11, 64%). This approach estimated an average response rate above 85% across sessions. Most residents completed the minute paper in 2 to 3 minutes, as measured by wait times for returned forms.
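The conservative estimate above can be checked with a short calculation (a minimal sketch; the two documented PGY-1 sessions had 100% response, and the imputed attendance of 11 for the third session is the conservative assumption described):

```python
def average_response_rate(rates):
    """Mean of per-session response rates, expressed as fractions."""
    return sum(rates) / len(rates)

# Two documented PGY-1 sessions: 100% response each.
# Third session: 7 forms returned, attendance imputed as the largest
# observed count (n = 11), giving the most conservative rate (7/11).
rates = [1.0, 1.0, 7 / 11]
avg = average_response_rate(rates)
print(f"{avg:.1%}")  # about 87.9%, i.e., above 85%
```

This confirms that even under the least favorable assumption about the missing sign-in sheet, the average PGY-1 response rate exceeds 85%.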
Providing Feedback
Participants responded to each item providing feedback on the content of the conferences and identifying unanswered questions and suggestions for improvement. For unanswered questions, resident comments included “Would like to hear more about management of disease than differential diagnosis,” and “Perhaps we could focus on the diagnostic testing in sarcoidosis (more) than on management.” We used this feedback to tailor the conference to better address residents' needs.
Assessing Impact of Changed Format
We tabulated preintervention and postintervention responses to the first question to compare ratings of usefulness. Mean usefulness ratings increased after the split: 4.09 before versus 4.45 after for PGY-1, and 3.75 before versus 4.38 after for PGY-2 and PGY-3.
Discussion
Minute papers were quick to administer, well received, and suited for resident training forums. Despite their simplicity, the minute papers assess more than mere recall. They allow learners to evaluate and assess their understanding of the content to formulate a question.2 The minute papers were easily analyzed, and with an additional question and simple Likert scale, they were a feasible tool for providing structured feedback about the change. We have since used minute papers with larger groups, and they appear well suited to groups of any size. The tool's usefulness for measuring change needs to be evaluated in a larger study with an experimental design to address confounders and to compare it with other feedback methods (eg, online surveys, verbal feedback).
A more structured sign-in sheet after the change in format allowed for better calculation of response rates. The number of completed minute papers trended upward after the format change and could indicate improved engagement by the residents as the new format allowed for more discussions geared to the level of learner and smaller groups.
Our study has several limitations. First, this is a single-site, single-conference pilot study, which limits the generalizability of our findings. Second, we did not have an accurate count of participants for some of the morning report sessions and thus were not able to calculate response rates for those sessions. Finally, there are limits to directly attributing the upward trend in session ratings to the format change alone. However, the temporal association of the improved ratings with the smaller, more focused sessions of similarly leveled learners, coupled with no other major design changes during the same time frame, suggests the format change as one reason for the apparent improvement in ratings. Despite these limitations, the findings support the feasibility of this tool to obtain immediate feedback on content and its ability to quantify feedback so that it can potentially be used to measure change in educational forums.
Conclusion
Instruments such as minute papers are attractive to busy educators. Reserving 2 to 3 minutes at the end of the morning report conference creates time for brief reflection. Immediate feedback to the conference presenters provides an opportunity for “reverse feedback,” with the ability to address remaining questions at the next meeting and real-time responses to suggested changes in format.2
Minute papers offer educators quickly available feedback as well as a method to teach change concepts. This approach can facilitate education regarding key quality improvement concepts (eg, variation, measurement, and cycles of change), which are in line with the Accreditation Council for Graduate Medical Education requirements for residents to learn continuous improvement.3 Minute papers can measure small cycles of change, providing role modeling and hands-on learning about continuous quality improvement.4
References
Author notes
Mamta K. Singh, MD, MS, is Assistant Professor of Medicine at Case Western Reserve University, Louis Stokes Cleveland Veterans Affairs Medical Center; Renée Lawrence, PhD, is Research Specialist Medical Services and VA HSR&D Center for Implementation Practice & Research Support at Louis Stokes Cleveland Department of Veterans Affairs Medical Center; and Linda Headrick, MD, MS, is Senior Associate Dean for Education and Faculty Development and Professor of Medicine at University of Missouri-Columbia.
Funding: The authors report no external funding source.