Residents-as-teachers (RATs) programs have been shown to improve trainees' teaching skills, yet these skills decline over time.
We adapted a commercial Web-based system to maintain resident teaching skills through reflection and deliberate practice and assessed the system's ability to (1) prevent deterioration of resident teaching skills and (2) provide information to improve residents' teaching skills and teaching program quality.
Ten first-year obstetrics-gynecology (Ob-Gyn) residents participated in a RATs program. Following the program, they used a commercial evaluation system to complete self-assessments of their teaching encounters with medical students. Students also evaluated the residents. To assess the system's effectiveness, we compared these residents to historical controls with an Objective Structured Teaching Examination (OSTE) and analyzed the ratings and the free text comments of residents and students to explore teaching challenges and improve the RATs program.
The intervention group outscored the control group on the OSTE (mean score ± SD = 81 ± 8 versus 74 ± 7; P = .05, using a 2-tailed Student t-test). Rating scale analysis showed resident self-assessments were consistently lower than student evaluations, with the difference reaching statistical significance in 3 of 6 skills (P < .05). Comments revealed that residents most valued using innovative teaching techniques, while students most valued a positive educational climate and interpersonal connections with residents. Recommended targets for RATs program improvement included teaching feedback, time-limited teaching, and modeling professionalism behaviors.
Our novel electronic Web-based reinforcement system shows promise in preventing deterioration of resident teaching skills learned during an Ob-Gyn RATs program. The system was also effective in eliciting resident and student insights to improve RATs programs. Because our intervention was built upon a commercially available program, our approach could prove useful to the large population of current subscribers.
Editor's note: The online version of this article contains the survey instrument used in the study.
Resident-as-teacher courses improve teaching performance, but the improvement often is not sustained over time.
Adapting a commercial Web-based system to maintain resident teaching skills through reflection and deliberate practice prevented deterioration of skills and collected information to improve residents' skills and teaching quality.
Small sample, single-site study, lack of randomization, and observation effects and response bias in the students' evaluations.
The system supported maintenance of teaching skills over time. Hosting the program on a commercial platform could facilitate adoption or adaptation by a large population of programs.
Residents have an important role in educating medical students, with studies indicating that residents spend up to 20% of their time teaching medical students,1 and students view residents more than attending physicians as their teachers.2 Research has shown residents-as-teachers (RATs) programs improve resident teaching.3 In particular, obstetrics and gynecology clerkship evaluations have been shown to improve after implementing RATs programs.4 However, in a 2000 review, only 55% of residency program directors reported their residents received formalized teacher training.5 Studies suggest teaching skills deteriorate over time.6–8 Reinforcement of skills learned during RATs programs may minimize skill degradation; however, little information exists to support this assertion.
The Department of Obstetrics and Gynecology (Ob-Gyn) at The George Washington University Medical Center has a well-established RATs program.9 We adapted E*Value (Advanced Informatics, Minneapolis, MN), a Web-based assessment system, to enable residents to reflect on and improve their teaching skills through deliberate practice and self-assessment. The purpose of our intervention was to evaluate whether the adapted E*Value system (1) prevented resident teaching skill deterioration and (2) provided information to better understand our residents as teachers and improve the quality of our teaching program.
Ten first-year Ob-Gyn residents at The George Washington University participated in a RATs program in June 2007. Faculty provided six 1.5-hour skills workshops: teaching a skill, teaching at the bedside, giving feedback, orienting a learner, teaching around a case, and giving a mini-lecture. The same topic-specific standardized assessment forms were used to teach and assess resident skills in the workshops, in the E*Value system, and in an Objective Structured Teaching Examination (OSTE). Assessments used a 5-point Likert scale and included 2 open-ended questions on what went well and what could have been done better (later referred to as positive and negative comments). The assessment form is provided as online supplemental information.
From September 2007 through early January 2008, the residents performed 6 teaching encounters with third-year students, corresponding to each of the 6 workshop topics. After each, the resident used the appropriate standardized assessment form on the E*Value system to complete a self-assessment. Electronic reminders to complete the required evaluations were generated automatically by the system.
In completing each self-assessment, the resident noted the name of the student involved in the teaching encounter, which triggered the E*Value system to send a message to that student to evaluate the encounter by using the same standardized assessment form. Students were informed that residents would not receive the student evaluation until after the student completed the rotation and grades were entered.
Assessment of Skill Deterioration
Eight months after the RATs program, the residents completed an OSTE. Standardized learners were fourth-year medical students trained during a students-as-teachers course. Stations corresponded to the 6 skills taught in the RATs workshops. At each station, the standardized learner interacted with the resident and evaluated the resident's performance by using the topic-specific standard assessment form.
To determine whether the E*Value reinforcement system prevented skills deterioration, OSTE scores of this group were compared with those of comparable historical controls. Historical controls completed an identical RATs program and 6-station OSTE but were not exposed to the E*Value intervention aimed at reinforcing teaching skills (electronic prompts to complete their teaching encounters, completion of self-assessments, and receiving student evaluations). Controls completed their OSTE 3 weeks after participation in the RATs program, while the intervention group completed their OSTE 8 months after the RATs program.
Repeated studies have shown that teaching skills degrade over time without reinforcement (eg, >6 months6–8); we hypothesized that if the reinforcement system was effective, degradation would be prevented and there would be no significant difference between the performance of the 2 groups. A 2-tailed Student t-test was performed on the overall mean across stations, followed by tests of the differences at each of the 6 stations. Because the station-specific comparisons were secondary in purpose, no adjustments were made for multiple hypothesis testing. Parametric tests were used because the data approximated a normal distribution. Coefficient alpha for the overall OSTE score, averaged across the 2 study groups, was .58; this may be an imprecise estimate because of the small sample size (n = 20).
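For readers who wish to reproduce this style of analysis, the group comparison and reliability estimate described above can be sketched as follows. This is a minimal illustration with hypothetical scores (the arrays below are not the study data), using `scipy.stats.ttest_ind` for the 2-tailed Student t-test and a hand-computed coefficient (Cronbach's) alpha.

```python
# Illustrative sketch with hypothetical data; not the study's actual scores.
import numpy as np
from scipy import stats

# Hypothetical overall OSTE scores (percent) for each resident.
intervention = np.array([85, 78, 90, 72, 84, 88, 76, 80, 79])        # n = 9
control = np.array([70, 76, 68, 81, 74, 72, 79, 65, 77, 80])         # n = 10

# Two-tailed Student t-test on the overall means (equal variances assumed).
t_stat, p_value = stats.ttest_ind(intervention, control)
print(f"t = {t_stat:.2f}, two-tailed P = {p_value:.3f}")

def cronbach_alpha(scores):
    """Coefficient alpha for an (examinees x stations) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                           # number of stations
    item_var = scores.var(axis=0, ddof=1).sum()   # sum of station variances
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of total scores
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical 6-station score matrix for 5 examinees.
station_scores = np.random.default_rng(0).integers(60, 100, size=(5, 6))
print(f"coefficient alpha = {cronbach_alpha(station_scores):.2f}")
```

As the text notes, alpha estimated from 20 examinees is imprecise; the sketch is meant only to make the computation explicit.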
Assessment of Resident and Student Perception of Resident Teaching
To determine if there were any significant differences in perceptions of the residents' teaching between students and residents, the Likert scale components of the standardized assessment forms were analyzed using the Wilcoxon signed rank test. A conceptual content analysis of the open-ended comments was performed as well. Two authors (J.M.K. and B.B.) independently reviewed and categorized comments based on the framework used in the original workshops9: prepare for the teaching encounter; perform the teaching encounter; process what occurred in the teaching encounter; create a positive learning climate. Once categorized, comments were analyzed inductively for themes. In each case, 2 authors (J.M.K. and B.B.) independently evaluated the comments and came to consensus on any discrepancies. A third author (M.P.) reviewed the comments, categories, and themes for accuracy and face validity as an additional step to ensure the credibility and trustworthiness of the qualitative analysis and results.
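The paired comparison described above can be sketched as follows, again with hypothetical ratings rather than the study data: each resident's self-rating for a skill is paired with the matched student rating of the same encounter on the 5-point scale, and `scipy.stats.wilcoxon` performs the signed rank test.

```python
# Illustrative sketch with hypothetical ratings; not the study's actual data.
from scipy import stats

# Hypothetical paired mean ratings for one skill:
# each resident's self-rating vs. the matched student rating.
resident_self = [3.2, 3.8, 3.0, 4.0, 3.5, 3.4, 3.9, 3.1, 3.6, 3.3]
student_rating = [4.1, 4.3, 3.8, 4.5, 4.0, 4.2, 4.4, 3.9, 4.3, 4.0]

# Wilcoxon signed rank test on the paired differences.
w_stat, p = stats.wilcoxon(resident_self, student_rating)
print(f"W = {w_stat}, P = {p:.3f}")
```

In this hypothetical example every self-rating falls below the paired student rating, the pattern the study observed; a small P value would then indicate a systematic difference rather than chance variation.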
All 10 residents participated in the E*Value reinforcement process, completed self-assessments, and received student assessments of their performance in all 6 subject areas. Nine residents completed the OSTE.
Assessment of Skill Deterioration
The overall mean OSTE score ± SD for the intervention group was significantly higher than that of the control group (81 ± 8 versus 74 ± 7, respectively; P = .05, 2-sided). The intervention group also had higher mean scores on 5 of 6 individual stations, although only 2 scores reached statistical significance (table 1).
Quantitative Assessment of Resident and Student Perception of Resident Teaching
Resident self-assessments were consistently lower than their assessments by the students (table 2). This difference reached statistical significance in orienting a learner (P = .05), giving a mini-lecture (P = .004), and teaching at the bedside (P = .002). Students' ratings of the residents were highest for giving a mini-lecture and teaching at the bedside. Residents rated themselves highest in teaching around a case and teaching at the bedside.
Qualitative Assessment of Resident and Student Perception of Resident Teaching
Qualitative analysis revealed 56 comments from residents and 90 comments from medical students on what they did well (ie, positive comments); 40 comments from residents and 5 comments from students on areas for improvement (ie, negative comments). Positive comments indicated residents felt they performed well in preparing for and performing the teaching itself (n = 18 and n = 17, respectively). Students agreed, as shown by the number of positive comments from students in the same categories (n = 10 and n = 29, respectively). Positive comments from residents highlighted their use of innovative teaching techniques, visual aids, and modeling.
In addition, students offered 11 positive comments on how residents made them feel involved in patient care. Students also provided 50 positive comments in the Learning Climate category, by far the largest number in any category and about 4 times as many as the residents (n = 13). One-half (n = 25) of these students' comments related to the importance of creating positive interpersonal connections between residents and students as a part of effective teaching. This theme emerged in 7 resident comments.
Residents were most self-critical in the Process category (n = 15), especially in giving feedback (n = 12). Residents also indicated a need for improvement in the Learning Climate category, eg, finding more teaching time (n = 4).
Results suggest the E*Value reinforcement system may help prevent the deterioration of teaching skills documented in previous studies.6–8 The OSTE scores for residents in the reinforcement intervention, obtained 8 months after their RATs workshops, were as high as or higher than those of control group residents examined 3 weeks after the workshops. A number of factors may have contributed to the finding that students generally rated the residents higher than the residents rated themselves. Having participated in the teaching skills workshops, residents were familiar with the principles of effective teaching and, as a result, may have been more self-critical.
Our pilot study suggests that our E*Value-based intervention is effective in reinforcing and evaluating resident teaching skills. This system also captured resident and student perceptions in a paperless manner that is easy to access. With this method, program directors can conveniently incorporate teaching feedback into residents' semiannual reviews, facilitating the development of residents' individualized learning plans for improvement. Using the semiannual evaluation venue to provide residents with feedback on teaching skills reinforces the value of educational competency in clinical culture and brings teaching skills into the mainstream alongside clinical skills, increasing the potential to enhance the quality of resident teaching for medical students and peers.
The study has several limitations. Results must be viewed cautiously because residents were not randomized, controls were historical, and the sample size was small. Comparison across different time frames (ie, 8 months versus 3 weeks postworkshops) may have confounded the results, and additional practice opportunities for the intervention group may have contributed to their superior OSTE performance. Residents also may have placed greater emphasis on effective teaching strategies because they knew they were being evaluated. Although students were assured that residents would not see their evaluations until after grades were finalized, students may have withheld poor evaluations for fear of an impact on their final grade. In addition, the OSTE encompassed only 6 stations; 10 or more stations would have resulted in greater reliability but were not feasible given the amount of faculty and resident time involved in administering an OSTE. Finally, the study was performed in a single institution, which limits its generalizability. Future studies, including larger samples and other specialties, are needed to determine the validity and generalizability of the results of this study.
In our pilot study, a Web-based reinforcement system showed promise in preventing deterioration of resident teaching skills. The E*Value system was also useful in gaining insights from both residents and students, which can be used to guide resident performance reviews and enhance future RATs programs. Studies with larger randomized samples and controls are needed to test and confirm these results. Since our reinforcement system was built upon a commercial Web-based resident evaluation tool, if our results are confirmed, it could prove useful to the large population of current subscribers from multiple specialties.
All authors are at The George Washington University School of Medicine. Jennifer M. Keller, MD, MPH, is in the Department of Obstetrics and Gynecology; Benjamin Blatt, MD, is Director, CLASS Clinical Skills Center and Office of Interdisciplinary Medical Education, Department of Medicine; Margaret Plack, PT, EdD, is Interim Senior Associate Dean for Health Sciences Programs; Nancy D. Gaba, MD, is in the Department of Obstetrics and Gynecology; and Larrie Greenberg, MD, is an Internal Consultant, Faculty Development.
Funding: The authors report no external funding source for this study.