Background Specialty-specific individualized learning plans (ILPs) have been promoted to improve the undergraduate to graduate medical education transition, yet few pilots have been described.
Objective To create and report on the feasibility and acceptability of a pilot internal medicine (IM) ILP template.
Methods The ILP was created by a group of diverse IM expert stakeholders and contained questions to stimulate self-reflection and collect self-reported readiness data from incoming interns. US IM residency programs were invited to pilot the ILP with interns in spring 2022, and the data were used at the programs’ discretion. The pilot was evaluated through a post-pilot survey of programs eliciting perceptions of the impact and value of the ILP and through analysis of anonymous ILP data from 3 institutions.
Results Fifty-two IM residency programs agreed to participate, with a survey response rate of 87% (45 of 52). Of responding programs, 89% (40 of 45) collected ILPs; thus we report on data from these 40 programs. A total of 995 interns enrolled, with 782 completing ILPs (79%). One hundred eleven ILPs were analyzed (14%). Most programs found the ILP valuable for understanding incoming interns’ competencies (26 of 40, 65%) and areas for improvement (24 of 40, 60%) and thought it should continue (29 of 40, 73%). The ILP took interns a mean of 29.2±14.9 minutes and faculty mentors a mean of 21.6±10.3 minutes to complete. The most common barrier was faculty mentor participation.
Conclusions An ILP based on interns’ self-reported data was feasible and valuable to IM residency programs in understanding interns’ competencies and areas for improvement.
Introduction
There is widespread mistrust between undergraduate medical education (UME) and graduate medical education (GME) regarding the handoff information provided about residents during the UME to GME transition. This has created a need to enhance communication to help residents transition to programs in which they will thrive and to ensure adequate supervision and resources to provide high-quality health care.1-3 Thus, the UME-GME Review Committee (UGRC) recommended specialty-specific individualized learning plans (ILP) be delivered at the start of residency to facilitate an optimal educational handoff.1-3
Previous educational handoffs have been completed in emergency medicine, surgery, and obstetrics and gynecology, but these did not incorporate ILPs.4-7 One previous multicenter pediatrics UME ILP pilot formed competency assessment committees to review student data and document performance in Accreditation Council for Graduate Medical Education (ACGME) Milestones.8 Students reviewed data, created learning goals, and sent them to residency programs. That pilot demonstrated that graduating students could be assessed on ACGME Milestones; however, while moderately useful for describing incoming interns’ strengths and weaknesses, it required significant effort to review data, map milestones, and create ILPs.8
The Alliance for Academic Internal Medicine (AAIM) recommendations to improve learner transitions also described the need for an ILP.9 Since there was no existing ILP in internal medicine (IM), we aimed to create and pilot an ILP template for IM.
Methods
A group of diverse IM expert stakeholders was formed to create an ILP to pilot in IM to articulate incoming interns’ goals and areas for improvement.10 In addition to expert stakeholders, the ILP was shared with approximately 40 multi-institutional students and residents, and their feedback was incorporated. We aimed to create a template that was accessible to any learner/program dyad, regardless of access to resources such as dashboards or UME data.
The ILP contained information about goals, usage, and brief instructions. The first section asked for the top 3 goals and areas for improvement for the first 6 months of residency via open response. Next, 16 domains were listed with a 5-point Likert scale on which interns rated their preparedness for residency. This scale and question format had been previously piloted in the intended population.11 Domains were based on ACGME IM Milestones and the Association of American Medical Colleges Resident Readiness Survey Pilot, refined through expert stakeholder feedback.12,13 The last portion contained a list of core IM topics, and interns were asked to select the 3 for which they felt least prepared during internship. Topics were based on the AAIM core clerkship curriculum, the AAIM core sub-internship curriculum, and expert stakeholder review.14,15 Interns were asked to complete the template independently and review it with a faculty advisor. Interns and faculty advisors signed the ILP and recorded the time required for completion.
We recruited IM residency programs by posting pilot information on a listserv of program directors in March 2022, and volunteers signed up via an online spreadsheet. Participation was voluntary, and no incentives were offered. After volunteering, designated contacts were emailed instructions and ILPs to share with incoming interns, who completed them as part of residency onboarding in spring 2022. ILPs were returned to programs and utilized at the programs’ discretion; for example, programs decided whether ILPs were mandatory, how reminders were sent, and how completion was tracked. UME educators were notified about the pilot through posts on a listserv of clerkship directors and were asked to share information with participating faculty.
A post-pilot survey for programs about their experience and perceptions (online supplementary data) was created utilizing best practices of survey design and extensive stakeholder review. Since we did not have the resources or a mechanism to robustly survey interns, we asked programs to comment on the barriers their interns faced and the impact the ILP had, based on their experiences obtaining, reviewing, and utilizing ILPs. This online anonymous survey was emailed to participating programs in fall 2022, with 4 reminders for nonresponse. Additionally, several months after the pilot, we solicited volunteers to analyze de-identified ILP data. Three pilot institutions (University of Chicago, University of California Los Angeles, and Geisinger) agreed to participate. Quantitative ILP and survey data were analyzed using Fisher’s exact test to compare university and community program data in Stata 17.0 (StataCorp LLC, College Station, TX). There was no funding for this pilot.
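The comparison described above can be illustrated with a minimal sketch, not the authors’ Stata code: a two-sided Fisher’s exact test on a 2×2 table crossing program type (university vs community) with a binary survey response. The counts below are hypothetical, chosen only to mirror the study’s group sizes.

```python
# Minimal sketch of a two-sided Fisher's exact test (stdlib only).
# The table counts are hypothetical, not the study's data.
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact p-value for the table [[a, b], [c, d]]."""
    n, r1, c1 = a + b + c + d, a + b, a + c

    def prob(x):
        # Hypergeometric probability of x "successes" in the first row
        return comb(r1, x) * comb(n - r1, c1 - x) / comb(n, c1)

    p_obs = prob(a)
    lo, hi = max(0, c1 - (n - r1)), min(r1, c1)
    # Sum probabilities of all tables as extreme as or more extreme
    # than the observed one (probability <= observed, with a tolerance)
    return sum(prob(x) for x in range(lo, hi + 1)
               if prob(x) <= p_obs * (1 + 1e-9))

# Hypothetical example: 17 of 23 university-based programs vs
# 9 of 17 community-based programs endorsing a survey item
p = fisher_exact_2x2(17, 6, 9, 8)
print(f"p = {p:.3f}")
```

Fisher’s exact test is preferred over chi-square here because several cells in program-level comparisons of this size can have small expected counts.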
Exemption was obtained by the University of Chicago Institutional Review Board (IRB) for the program survey. The 3 institutions that evaluated their de-identified ILP data obtained IRB exemption. Interns were not asked for consent for the analysis as it posed minimal risk.
Results
Overall, 52 IM residency programs agreed to participate, with a post-pilot survey response rate of 87% (45 of 52). Of the 45 responding programs, 40 (89%) collected ILPs, while the others described unique program circumstances that prevented collection. Thus, we report on data from the 40 responding programs that collected ILPs in the pilot. Over half of respondents represented university-based programs (58%, 23 of 40), and 42% represented community-based programs (17 of 40). The mean number of interns enrolled per program was 24±17, and the mean number completing ILPs was 20±15. Survey responses of participating programs indicated that their programs comprised 995 total interns, with 782 completing ILPs (79%). Program perspectives regarding barriers faced by incoming interns as well as the use, impact, and value of the ILP are detailed in Table 1. When asked about effort required, 45% of respondents (18 of 40) reported slight effort, and only 5% (2 of 40) reported great or extreme effort. The method most advocated to reduce workload was electronic forms (58%, 26 of 45); only 29% (13 of 45) thought additional administrative support would help. When asked what should be removed from the ILP, 71% of respondents (32 of 45) selected nothing, 22% (10 of 45) selected desired learning experiences, and 11% (5 of 45) selected core IM topic preparedness.
Overall, in the intern ILP review, 111 ILPs were analyzed, with return rates of 100% (17 of 17), 75% (27 of 36), and 100% (67 of 67) at the 3 sites. This represented approximately 14% (111 of 782) of the total intern population completing ILPs. Demographics were 84% (93 of 111) US allopathic graduates, 9% (10 of 111) international medical graduates, and 7% (8 of 111) US osteopathic graduates, as well as 58% (64 of 111) female and 20% (22 of 111) underrepresented in medicine. Based on the times recorded on the ILP forms, the ILP took interns an average of 29.2±14.9 minutes and faculty mentors an average of 21.6±10.3 minutes to complete. Faculty mentors signed 80% (89 of 111) of ILPs analyzed. The themes identified in preparedness data are displayed in Table 2.
Discussion
Overall, residency program leaders found the ILP valuable without excessive workload. Unlike the pediatrics UME ILP pilot, our ILP was based entirely on self-reported data, was driven by learners, and included areas for improvement.8 Compared with that pilot, the workload was manageable, and the pilot was completed without additional funding. We acknowledge that the self-reported ILP design may be limited by the Dunning-Kruger effect and by how adequately interns can self-assess.16 Future studies comparing self-reported and objective ILP data with competency data are needed.
Limitations of our study include the underrepresentation of osteopathic and international medical graduates, which may reduce generalizability and introduce selection bias. Our evaluation by programs was limited to self-reported data, and we did not study intern perceptions directly but rather program perceptions of barriers and impact on interns; directly assessing intern perceptions in future work would yield more reliable findings. Longer-term use of the ILP for coaching, and whether it helps residents improve learning outcomes, should also be studied.
Conclusions
IM residency programs found the ILP feasible and valuable for understanding interns’ competencies and areas for improvement.
References
Editor’s Note
The online version of this article contains the survey used in the study.
Author Notes
Funding: The authors report no external funding source for this study.
Conflict of interest: The authors declare they have no competing interests.
Preliminary analysis from this project was presented as a poster at the Alliance for Academic Internal Medicine Internal Medicine Week, April 2-5, 2023, Austin, TX, and the Society of General Internal Medicine Meeting, May 10-13, 2023, Aurora, CO, and as an oral presentation at the Society for Hospital Medicine Converge Conference, March 27-29, 2023, Philadelphia, PA.
The authors would like to acknowledge the faculty, administrators, and residents who participated in this pilot, as well as Karen Ward, Charishma Boppana, MD, Tripti Singh, MD, and the AAIM Learner Handoffs Standards Task Force.
Disclaimer: The views expressed in the article are those of the authors and do not necessarily reflect the position or policy of the Department of Veterans Affairs or the US Government.