Background The process for program directors (PDs) to provide feedback to medical schools about their graduates’ readiness for postgraduate year 1 (PGY-1) training is burdensome and does not generate national benchmarking data.
Objective The Association of American Medical Colleges (AAMC) tested the feasibility of administering a standardized Resident Readiness Survey (RRS) to PDs nationally about their PGY-1 residents’ preparedness for residency.
Methods In 2020 and 2021, the AAMC invited PDs via email to complete RRSs for their PGY-1s who graduated from participating schools; the AAMC provided schools with reports of identified RRS data for their graduates. Outcome measures included school participation rates, PD response rates, PGY-1s’ coverage rates (RRSs completed/RRSs PDs invited to complete), RRS completion time (time-stamp difference: RRS opening–submission), and participating schools’ feedback about the process collected via AAMC evaluation questionnaires sent to school leaders. Chi-square tested significance of differences between proportions.
Results School participation increased from 43.8% (2020: 77 of 176) to 72.4% (2021: 131 of 181). PD response rates, similar in both years overall (2020: 1786 of 2847 [62.7%]; 2021: 2107 of 3406 [61.9%]; P=.48), varied by specialty (P<.001; range 65 of 154 [42.2%], neurology; 109 of 136 [80.1%], internal medicine-pediatrics, both years combined). PGY-1s’ coverage rates were similar (P=.21) in 2020 (5567 of 10 712 [52.0%]) and 2021 (9688 of 18 372 [52.7%]). RRS completion time averaged less than 3 minutes. Numerous school leaders reported that RRS data stimulated new, or supported ongoing, curricular quality improvement.
Conclusions Administration of a standardized RRS to PDs nationally about PGY-1s’ preparedness for residency is feasible and will continue.
Introduction
A standardized Resident Readiness Survey (RRS) process for program directors (PDs) to provide feedback to US MD- and DO-degree granting medical schools about graduates’ preparedness for graduate medical education (GME) was developed and tested nationally. This innovation has the potential to benefit GME programs and medical schools, and ultimately the public at large, because collected data can inform interventions to optimize the undergraduate medical education (UME) to GME transition for all postgraduate year 1 (PGY-1) residents.
Nationally, PDs across a range of specialties have reported relatively low levels of confidence in incoming PGY-1 residents’ readiness for some of the tasks they may assume at the start of residency training.1–3 However, mechanisms for PDs to provide feedback to schools about their graduates have been fragmented, driven by school-specific initiatives to meet accreditation standards pertaining to evaluation of educational program outcomes.4,5 PDs have received different surveys at various times from medical schools attended by their PGY-1 residents, and schools have been unable to “benchmark” information about their graduates collected via their school-specific surveys against national data to inform UME curricular quality improvement (QI). Furthermore, PDs may have different specialty-specific expectations of their incoming PGY-1 residents.6,7 However, specialty-specific data about PGY-1 residents’ readiness for the transition to residency, which could inform specialty organizations’ efforts to optimize graduates’ preparedness, have not been available.8,9 The Coalition for Physician Accountability (CoPA) UME-GME Review Committee (UGRC) recognized this gap, recommending that, “Early and ongoing specialty-specific resident assessment data should be automatically fed back to medical schools through a standardized process to enhance accountability and to inform continuous improvement of UME programs and learner handovers.”10 The objective of our innovation was to develop and test the feasibility of a national process for PDs to provide standardized feedback to medical schools about the readiness of their graduates for PGY-1 training.
KEY POINTS
Medical schools lack, but greatly desire, feedback on how their graduates perform during residency to inform continuous quality improvement of their curricula.
The AAMC Resident Readiness Survey feasibly provided many medical schools with residency preparedness data, and medical school leaders offered favorable feedback.
This initiative holds potential to expand the amount of feedback given to medical schools about their graduates’ preparedness for residency.
Methods
A standardized RRS instrument “version 1.0” (provided as online supplementary data) was developed through an iterative process that broadly engaged the UME and GME communities (see online supplementary data for details). A national process for RRS administration, data collection, and provision of collected data back to medical schools on an identified basis was developed and tested by the Association of American Medical Colleges (AAMC) in 2 pilot years, in cooperation with the American Association of Colleges of Osteopathic Medicine. Pilot Year (Y) 1 of the RRS process is described as follows.
In the 2020 AAMC Electronic Residency Application Service (ERAS) cycle, applicant information was updated to include notification that, to help medical schools improve the UME to GME transition, assessment information would be collected from PDs about their GME residents; ERAS applicants were able to opt out of this process.
In February 2020, voluntary participation in the RRS process was opened to all US MD-degree- and DO-degree-granting medical schools with 2019-2020 graduating classes via invitations to the senior medical education and student affairs leaders at each school. Schools that chose to participate signed agreements with the AAMC regarding data confidentiality and RRS data use limitations and agreed not to send school-specific surveys to PDs about their graduates in PGY-1 positions in 2020-2021.11 In November 2020, participating schools’ 2019-2020 graduates in PGY-1 training and their PDs were identified in the AAMC GME Track (a secure resident database and tracking system to assist GME administrators and PDs in GME data collection and management12), which is populated by data submitted voluntarily by Accreditation Council for Graduate Medical Education (ACGME)-accredited programs about their programs and residents through the National GME Census.13 Then (also in November 2020), PDs with any eligible PGY-1 residents (“eligible” PGY-1 residents included participating schools’ 2019-2020 graduates with GME Track and ERAS acknowledgement records) were invited via email to complete, within GME Track, one RRS for each eligible PGY-1 resident, with periodic reminders about outstanding RRSs sent through March 2021. The AAMC then provided school-specific reports of identified RRS data to MD-degree-granting school designees via the AAMC Medical School Profile System portal and to DO-degree-granting school designees via secure email. Y1 process feedback was obtained from participating schools via an evaluation questionnaire in fall 2021 (online supplementary data). An aggregate summary of Y1 RRS process outcomes and data, stratified by specialty, was also released.14 A similar process was followed for Y2-participating schools’ 2020-2021 graduates, with an aggregate Y2 summary released.15
Outcomes Measured
The outcomes measured included program-level outcomes (medical school participation and PD response rates); RRS completion data (eg, completion rates, average completion time); PGY-1 residents’ coverage outcomes (percentages of eligible PGY-1 residents among all participating schools’ graduates and percentages of “covered” PGY-1 residents [those for whom RRSs were completed] among all eligible PGY-1 residents); and participating schools’ feedback about the RRS process and utility of RRS data.
Data Analysis
The RRS dataset, de-identified for analysis, included individual-level data for all eligible PGY-1 residents and for all PDs with at least one eligible PGY-1 resident in their programs. Tests on the equality of proportions were used to determine significant differences between proportions, and analysis of variance tested the significance of differences between means, reporting 2-sided P values and 95% confidence intervals (CIs).
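As an illustration of this approach only (not the authors’ analysis code), the minimal sketch below computes a normal-approximation test on the equality of 2 proportions and a Wald 95% CI for their difference, using the published school participation counts (Y1: 77 of 176; Y2: 131 of 181) as a worked example; the function name two_proportion_test is a hypothetical label introduced for this sketch.

```python
# Hypothetical sketch: Wald two-proportion z-test and 95% CI for the
# difference in proportions, applied to the published school
# participation counts (Y1: 77/176; Y2: 131/181).
import math

def two_proportion_test(x1, n1, x2, n2):
    """Return the difference in proportions, a Wald 95% CI, and a 2-sided P value."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p2 - p1
    # Unpooled standard error for the confidence interval
    se_ci = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    z_crit = 1.959964  # ~97.5th percentile of the standard normal
    ci = (diff - z_crit * se_ci, diff + z_crit * se_ci)
    # Pooled standard error for the test of equal proportions
    p_pool = (x1 + x2) / (n1 + n2)
    se_test = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = diff / se_test
    p_value = math.erfc(abs(z) / math.sqrt(2))  # 2-sided normal tail probability
    return diff, ci, p_value

diff, ci, p = two_proportion_test(77, 176, 131, 181)
print(f"difference = {diff:.1%}, 95% CI ({ci[0]:.1%} to {ci[1]:.1%}), P = {p:.3g}")
# Reproduces the reported 28.6 percentage-point increase (95% CI 18.8-38.4), P<.001
```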
The AAMC Human Subjects Office determined this study to be exempt from further Institutional Review Board review, as defined in 45 CFR 46, since this study uses de-identified data.
Results
Program-Level Outcomes
As shown in Table 1, the school participation rate increased from Y1 (77 of 176, 43.8%) to Y2 (131 of 181, 72.4%) by 28.6 percentage points (95% CI 18.8-38.4; P<.001). Similar percentages of invited PDs responded in Y1 (1786 of 2847, 62.7%) and Y2 (2107 of 3406, 61.9%; P=.48). Across both years combined, PD response rates varied by the number of eligible PGY-1 residents in their programs (from 58.1% [883 of 1520] for PDs with 1 eligible resident to 65.5% [413 of 631] for PDs with 10 or more eligible residents; P=.001) and by specialty (P<.001), from 42.2% (65 of 154, neurology) to 80.1% (109 of 136, internal medicine-pediatrics).
RRS Completion Data
Responding PDs’ RRS completion rates (ie, the percentage of all RRSs sent to a PD that were completed) were high in both years (mean % [SD]: Y1=92.0 [23.1]; Y2=90.6 [24.6]; P=.07; online supplementary data). Among PDs receiving 10 or more RRSs, the mean percentage completed increased by 15 percentage points (95% CI 7.1-22.9), from 66.9% [SD 43] among 122 PDs in Y1 to 81.9% [SD 34.4] among 291 PDs in Y2 (P<.001). RRS average completion time was less than 3 minutes per survey in both years.
PGY-1 Residents’ Coverage Outcomes
The percentage of eligible PGY-1 residents among all participating schools’ graduates increased by 2.7 percentage points from Y1 to Y2 (89.0% [10 712 of 12 035] vs 91.7% [18 372 of 20 044]; P<.001; online supplementary data). The percentage of covered PGY-1 residents (PGY-1 residents for whom RRSs were completed out of all eligible PGY-1 residents) was similar in Y1 and Y2 (52.0% [5567 of 10 712] vs 52.7% [9688 of 18 372]; P>.05). Across both years, PGY-1 residents’ coverage varied by specialty (P<.001), ranging from 36.4% (155 of 426, neurology) to 68.2% (270 of 396, urology). The per-school PGY-1 resident coverage percentage was similar in Y1 and Y2 (mean % [SD]: 52.0% [7.9], 77 schools vs 53.5% [7.0], 131 schools; P=.16).
Participating Schools’ Feedback
Evaluation questionnaire response rates were similar (P=.38) in Y1 (68 of 77 [88.3%]) and Y2 (109 of 131 [83.2%]). As shown in Table 2, item response distributions in Y1 and Y2 were also similar. Across both years, 10.8% (17 of 158) of respondents indicated that their school’s RRS report had stimulated new curricular QI efforts and 72.8% (115 of 158) indicated that it had supported ongoing curricular QI efforts. Among schools that had previously sent local surveys to PDs, 62.1% (87 of 140) reported PD response rates of 50% or less. Shown in the Box are examples of initial uses of RRS data reported by participating schools.
“[College] tracks the RRS data over time and shares the results widely. Many of the questions have been mapped to EPOs. We have also modified our clerkship assessments to align with some of the items tracked in the RRS.”
“We utilize this survey to direct CQI of our curriculum by identifying areas of weakness in our graduates’ level of preparedness. Example, if patient handoffs are weak among multiple graduates, we incorporate more opportunities to practice into the clerkship requirements.”
“…the RRS…obtains a higher response rate than we had gotten in the past internally, and the qualitative data are more helpful and specific. Currently, we are sharing the data with our curriculum committees. We created a comparison of our school vs national and a comparison of our school’s results from last year to this year. Then, I plug the data into our longitudinal tracking system that shows student performance from admissions through residency to look for any patterns. Also, we are using the RRS data to consider changes in our transition to residency courses.”
“What I find the most useful are the narrative comments…the comments about individual students are illuminating… We use the data to look at overall program outcomes, we share them broadly with student affairs, curriculum, and our larger leadership team.”
“We mapped the survey questions to our program objectives and then aggregated responses accordingly. We present the data to our curriculum committee for review and recommendations…but they were waiting to see an additional year of data next year on some areas.”
“We have already created a dashboard-style report with the national data and our school’s data and shared it with our curriculum committee. We are currently changing our [medical education] program so [will] continue to monitor in the new program to ensure our graduates are not rated lower than graduates in our legacy curriculum.”
“We share the anonymized data with our curriculum committee. The data has reassured us that our curriculum is working well for our students.”
“We found the RRS data to be very helpful! We presented the one major question about our grads’ overall performance in the 6-month transition from UME to GME at our annual curriculum retreat… We just changed our entire curriculum so our class of 2024 (and hence RRS data from 2025) is likely to be the one that will impact any further changes as they were the first class to experience our revised curriculum. Our class of 2023 was our last legacy class but was our first class to have a graduation requirement of a Transition to Residency course (including general content and all of our clinical departments offering specialty specific content), so I do look forward to seeing the 2024 RRS data as well…”
“We have definitely used this data [at medical school x] with our curriculum committee, our academic progress committee, and have even used the data with our admissions committee. It has been exceedingly helpful as we review our competencies—where we have gaps, where we seem to be doing well. For those students [PGY-1 residents] who did not continue in residency, this has also been invaluable data.”
“Have used the report in a variety of ways: We have looked retrospectively at students that did not meet expectations in any of the competencies as well as overall, looking for signals that we may have missed and could have addressed prior to graduation… We identified the competencies that our students in aggregate underperformed in (less students exceeding expectations than nationally, or more students failing to meet expectations than nationally). This has been shared with the curriculum committee, and consideration is being given on how to improve training/assessment in those areas […] This has been the most helpful data we have yet seen on our graduates…”
“We are considering if our MSPE letter writers need to place more openly professionalism concerns that we have had with a learner.”
“We are currently reviewing the content of our MSPEs, and this information from program directors will certainly inform our ongoing conversations.”
“The items mapped very closely to our program learning objectives, which made it an excellent tool for program evaluation.”
“This is a great means to get feedback on how your students fared in residency. We just had our [accrediting body site visit] review this fall and the data was worthwhile for our DCI preparation.”
“We found the initial reports with national comparisons to be useful feedback to our college’s administrators and instructors. The scale of exceeds expectations, meets expectations, or does not meet expectations was useful. As an internal quality assurance review, we will be correlating results with UME performance.”
“It was eye-opening to see when students of concern were appropriately identified and even more eye-opening when a learner we did not have on our radar was reported to be having difficulties. It allowed us to go back and consider any flags we may have missed that were not communicated in the MSPE.”
“The results in the RRS are really, really helpful. While we used to send out a similar survey to PDs, we had a much lower response rate. Probably more importantly, we did not have any way to benchmark our results. The national results are the key to helping us interpret this data. And, it was extremely helpful to get the data by student, since that allowed us to look at any student who scored a ‘did not meet’ and do CQI on our end.”
“We found the information useful in refining our transition to residency program as well as our doctoring coursework offerings. Additionally, it helped us to reinforce processes we had in place for student success.”
“The information in this survey is very helpful to our curriculum committee and course directors as we plan the next year’s curriculum. It provides more information than we have been receiving from our self-administered program director surveys with a higher response rate. We also plan to use the information to go back and analyze our admissions and promotions processes.”
Discussion
The feasibility of implementing a national process for PDs to provide standardized feedback to medical schools about the preparedness of their graduates for PGY-1 training was demonstrated over 2 years. Overall PD response rates and RRS completion rates suggest that the process worked well for many PDs; the process was also positively evaluated by participating medical schools.
Overall RRS PD response rates compared favorably with others’ experiences. Cooper et al reported a 20% response rate to a survey sent to PDs in 4 specialties, and the National Resident Matching Program reported less than 35% response rates for surveys sent to PDs in 2021 to 2022 about virtual recruitment experiences.16,17 Recently reported response rates for specialty-specific PD surveys ranged from 42% for general surgery18 to 57% for internal medicine.6
Among responding PDs, differences we observed in PD response rates and RRS completion rates by the number of RRSs received suggest that PDs of larger programs may particularly appreciate the value of the RRS process but may also face challenges in completing the large numbers of RRSs they receive. The significant increase in the RRS completion rate in Y2 compared with Y1 among responding PDs who received 10 or more RRSs is notable in this context. Ongoing engagement with the PD community, to incorporate PD feedback into process improvements, demonstrate the utility to medical schools of the information PDs provide, and explore the value of national benchmarking data to specialty organizations working to improve the transition to residency, may be key to sustaining and potentially even increasing PD response rates and RRS completion rates.
Importantly, the RRS is only one way of sharing information about the transition to residency. For a truly productive 2-way UME-GME system of communication, expansion of information shared “up” from UME to GME—beyond the Medical School Performance Evaluation—is also needed. One approach, aligned with another recommendation by the CoPA UGRC,10 is the educational handover. This post-Match UME-to-GME communication, about the incoming PGY-1 resident’s current strengths and areas for improvement (also referred to as a “warm handover”), could inform individualized learning plans at the start of GME19,20 and warrants similar consideration of a national approach.
The RRS, as a standardized survey about all PGY-1 residents, may include some items that PDs do not consider particularly relevant to their specialty and may not include others that PDs would consider highly relevant to their specialty. Periodic review and refinement of the RRS, with input from PDs across specialties to ensure the items are meaningful and maintain relevance across specialties, will be important. Furthermore, the PD perspective has not necessarily been routinely incorporated into UME curriculum revision processes21; the RRS can provide a means to collect PD feedback on UME curricular strengths and areas for improvement to inform both school-level and national curriculum QI efforts.
Although international medical school graduates (IMGs) currently comprise about 23% of residents in US ACGME-accredited GME programs,22 participation was not open to international medical schools due to resource limitations. Establishment of an analogous RRS process for IMGs could further improve the preparedness of all PGY-1 residents entering ACGME-accredited GME programs.
Finally, validity evidence for the RRS must be considered. Of the 5 sources of validity evidence initially described by Messick and since widely adopted,23 there is some content validity evidence for the RRS (as further described in the Resident Readiness Survey Instrument Development document provided as online supplementary data, and through the collection of evaluation data from participating schools regarding RRS content and utility), but sources of evidence regarding response processes, internal structure, relations to other variables, and consequences of the RRS remain to be fully examined. Based on the results of pilot Y1 and Y2, the AAMC will continue to administer the RRS on an annual basis.
Conclusions
It is feasible to administer a standardized survey to PDs nationally about the readiness of their PGY-1 residents. School participation substantially increased from Y1 to Y2, with most of the invited schools participating in Y2. Response rates among invited PDs were generally high, as were RRS completion rates among responding PDs.
The authors would like to thank the following individuals, all full-time employees of the Association of American Medical Colleges (AAMC), who served on the AAMC’s Resident Readiness Survey Workgroup: Marie Caulfield (retired), Manager, Data Operations and Services; Virginia Bush, Project Management Specialist, Academic Affairs; Keith Horvath, Senior Director, Health Care Affairs; Mallory Lee, Business Operations Specialist, Academic Affairs; Cecilia Barry, Senior Corporate Counsel, Legal Services; Lynn Shaull, Senior Research Analyst, Academic Affairs; Amy Addams, Director, Student Affairs Strategy and Alignment, Academic Affairs; and Erin Helbling, Senior Specialist, Services. They also thank Dolores Mullikin, Tripler Army Medical Center Clerkship Director and Uniformed Services University health professions education doctoral candidate, for her service on the workgroup, and Whitney Staiger, Business Operations Specialist, AAMC, for her assistance.
References
Editor’s Note
The online version of this article contains the AAMC Resident Readiness Survey, survey instrument development, task force roster, AAMC Resident Readiness Pilot Evaluation Survey, Resident Readiness Survey completion data, and postgraduate year 1 residents’ coverage outcomes.
Author Notes
Funding: The authors report no external funding source for this study.
Conflict of interest: Lisa Howley, PhD, Douglas Grbic, PhD, Amy Jayas, MPH, Lindsay B. Roskovensky, BA, and Dorothy A. Andriole, MD, are full-time employees of the Association of American Medical Colleges (AAMC) and also served on the AAMC’s internal Resident Readiness Survey Workgroup. Mark R. Speicher, PhD, MHA, is a full-time employee of the American Association of Colleges of Osteopathic Medicine, whose member Colleges of Osteopathic Medicine participate in the Resident Readiness Survey process.
An abstract of this work was presented at the Association of American Medical Colleges’ Learn Serve Lead meeting, November 11-15, 2022, Nashville, TN, and published: Grbic D, Andriole DA, Mullikin D, Howley LD. Program directors’ assessment of postgraduate year 1 residents’ readiness for graduate medical education: lessons and insights from the Association of American Medical Colleges inaugural 2020-2021 resident readiness survey. Acad Med. 2022;97(suppl 11):128. doi:10.1097/ACM.0000000000004888