Residents at risk for academic failure benefit from early identification, additional educational opportunities, and coaching. We developed a formal coaching program for residents that addresses the medical knowledge and practice-based learning and improvement competencies. By analyzing annual in-training examination (ITE) scores, we identified residents with low percentile scores and found that this subgroup also failed to show progressive improvement in their scores. We hypothesized that a program of facilitated coaching and active learning for these residents would improve their performance in both the medical knowledge and practice-based learning and improvement competencies.

Beginning in 2014, we used a journaling function in the residency program's evaluation management system to develop a facilitated coaching and active learning program for residents who performed poorly on the ITE. Specifically, residents were required to document self-identified monthly learning objectives through journaling. Each resident's learning objectives had to meet the criteria of being Specific, Measurable, Attainable, Relevant, and Timely (SMART). Residents also had to document their achievement of previous goals. Residents were mandatorily enrolled in the SMART program if their ITE score fell below the cutoff of the 25th percentile for interns or the 35th percentile for second-year and third-year residents. The monthly journal entry was mandatory, and noncompliance resulted in the consequences outlined in our program's administrative duties compliance policy. Other residents were allowed to volunteer for the SMART program, which preserved the anonymity of participants with low ITE scores. Volunteers were excluded from the analyzed data set.

The evaluation management system notified residents when monthly goals were due. After a resident submitted the monthly learning objectives, the journal entry was automatically transmitted via e-mail to the program director for review. The program director wrote timely feedback within the journaling program, and the feedback was returned to the resident via e-mail for review and response. Feedback focused on coaching residents to refine their goals to meet the SMART criteria, offering additional suggestions, and following up on prior goals. For example, if a resident wrote the goal, “I will be reading on pulmonary this month,” the program director inquired about the resource, asked how much time the resident would allocate to reading each day, and asked for a measurable target, such as answering 70% of 30 completed test questions correctly. Residents continued in the program for a year, until the next ITE results were reviewed.

Our initial data demonstrate that a larger percentage of at-risk residents improved their ITE percentile after initiation of the program (P = .032, 2-sided Fisher exact test). In the 2 years prior to implementation of the SMART program, only 46% (12 of 26) of residents who met criteria for the program had an increase in their ITE percentile the subsequent year. After implementation of the SMART program, 81% (17 of 21) of residents mandatorily enrolled in the program improved their ITE percentile the subsequent year. In addition, residents mandatorily enrolled in the SMART program had a mean increase in percent correct of 7.7%, compared with 4.0% for residents who did not meet enrollment criteria, a statistically significant difference (P < .006, 2-sided Wilcoxon rank sum test; figure).

Figure
In-Training Examination Percent Correct Change
Note: The figure depicts the change in percent correct on the in-training examination for residents mandatorily enrolled in the SMART program versus residents who did not meet enrollment criteria. Mean change in percent correct was 7.7 for the SMART group versus 4.0 for the non-SMART group (P < .006).
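For readers who wish to verify the comparison of proportions, the brief sketch below recomputes the 2-sided Fisher exact test from the counts reported above (12 of 26 residents improved before the program vs 17 of 21 after). Python and SciPy are assumed here for illustration only; the text does not specify the statistical software used, and the resident-level data needed to reproduce the Wilcoxon comparison are not shown.

from scipy.stats import fisher_exact

# Illustrative re-analysis of the counts reported in the text.
# Rows: cohort after SMART implementation, cohort in the 2 years before.
# Columns: residents whose ITE percentile improved, residents whose did not.
table = [[17, 21 - 17],   # 17 of 21 mandatorily enrolled residents improved
         [12, 26 - 12]]   # 12 of 26 eligible residents improved pre-program

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, 2-sided P = {p_value:.3f}")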

The SMART program required minimal setup by administrative staff because the evaluation management system already had a journal entry function. If an evaluation system does not have a journal entry feature, a residency program can implement this practice-based learning and improvement project by creating a monthly evaluation form for journaling. The faculty or program director time needed to monitor entries and give feedback depends on the number of residents in the program, but each journal entry review and subsequent feedback generally required 2 to 5 minutes. In addition, the program allowed the program director or faculty to monitor and coach residents without regular in-person meetings. Residents have embraced this program as a way to stay focused on developing and achieving learning goals. These data suggest a benefit to expanding the SMART program in the future.