Background Since 2020, virtual interviews have become the typical way in which applicants assess residency programs. It is unknown whether the change from in-person to virtual interviews has been associated with changes in perceptions of the quality of information gathered by prospective applicants.

Objective To ascertain perspectives on the satisfaction with, quality of, and accuracy of information gathered by internal medicine (IM) residency applicants from virtual and in-person interviews.

Methods A total of 29 776 residents from US and Puerto Rico residency programs sitting for the 2022 American College of Physicians Internal Medicine In-Training Examination (IM-ITE) were surveyed. An optional, 5-question survey was administered at the end of the examination. Responses were analyzed by interview format (virtual, postgraduate year [PGY] 1 and 2; in-person, PGY-3) and by PGY.

Results Of 29 776 examinees, 23 161 responded to the survey (77.8% response rate). Regardless of PGY, respondents reported a high degree of satisfaction with the quality of information gathered from their interview day, though there was a statistically significant difference between virtual and in-person formats (somewhat/very satisfied: in-person 5938 of 7410 [80.1%]; 95% CI [79.2, 81.0] vs virtual 12 070 of 15 751 [76.6%]; 95% CI [76.0, 77.3]; P<.001). Residents in all PGYs reported sessions with residents and one-on-one interviews as the most important factors when creating their rank lists.

Conclusions We found differences in satisfaction and perceptions of the quality of information gathered between IM residents who participated in virtual and in-person interviews. However, regardless of format, most respondents reported satisfaction with their interview experience.

The COVID-19 pandemic forced graduate medical education (GME) residency programs to adopt virtual interviews.1  Although the COVID-19 Public Health Emergency has drawn to a close, programs have largely continued interviewing applicants in a virtual format as recommended by national organizations such as the Association of American Medical Colleges.2,3  This change has been associated with continued application inflation,4  but has also likely led to lower costs for applying to residency5  and has reduced the carbon footprint of the residency Match.6 

Importantly, it is unknown what effect this shift has had on the quality of information prospective applicants gleaned from the interview day experience. Compared with in-person interviews, virtual interviews may reduce the quality of information provided to applicants, potentially because of fewer opportunities to meet current residents, a weaker sense of place and community in the absence of a physical presence on the interview day, or less information gathered overall.7  Moreover, virtual communications may dilute the richness of interpersonal interactions, reducing both the amount of feedback and the quality of interpersonal encounters.8

In 2022, residents in internal medicine (IM) programs represented cohorts that had participated in different interview formats: postgraduate year (PGY) 3 residents attended in-person interviews, while PGY-1 and PGY-2 residents participated in virtual interviews. Furthermore, as the COVID-19 pandemic progressed, virtual interviews evolved as well: as programs became more accustomed to presenting themselves in a virtual format, applicants became better prepared to gather information in this way. Thus, 2022 presented a unique opportunity to evaluate perceived differences in interview day quality based on the application year in which current residents interviewed.

A national survey of residents across PGYs regarding the perceived utility of these 2 interview formats would provide crucial information to residency programs as they decide how to conduct their recruitment, and to national organizations who make recommendations about interview formats. To this end, we conducted a national survey of IM residents to investigate perceptions of the quality of information gleaned from the interview day experience among residents who participated in in-person interviews (PGY-3 residents) compared to those who participated in virtual interviews (PGY-1 and 2). We hypothesized that there would be differences between respondents based on year, which might be attributable to differences in interview format.

What Is Known

Virtual interviews have been shown to be feasible, yet it is not clear how satisfactory they are to applicants as they search for high-quality information about potential programs.

What Is New

This survey of more than 23 000 internal medicine residents reports a high degree of satisfaction with and confidence in the information they gathered from both in-person and virtual residency interviews.

Bottom Line

Program directors and specialty societies making decisions about interview formats should consider these perspectives from a large national cohort.

The Internal Medicine In-Training Examination (IM-ITE) is a national, multiple-choice examination developed and owned by the American College of Physicians (ACP) and administered annually to IM residents by the National Board of Medical Examiners. The ACP allows questions to be proposed, and included if accepted, as an optional survey at the end of the IM-ITE. The participants for our study were the population of IM residents who sat for the IM-ITE during the 2022 calendar year. Participants received no incentive for survey completion. The 2022 IM-ITE survey opened on August 18, 2022, and closed on September 7, 2022.

Questions in the 2022 Resident Survey, administered at the conclusion of the IM-ITE, were developed and submitted by the authors as members of the Clerkship Directors in Internal Medicine Survey and Scholarship Committee. In accordance with published guidelines, committee members (B.L.H., T.A.R., I.A., A.R.W.) drafted and revised questions via a collaborative, iterative process beginning with the research question and hypothesis.9,10  These 5 questions asked participants about their satisfaction with information gathered from interview day experiences and their confidence level in selecting a residency program based on that information. We also asked whether their current residency program's culture was accurately portrayed on their interview day and whether they successfully gathered information about the program at which they matched. We included 5-point Likert scale and single-best-answer response questions (online supplementary data). The survey was optional and confidential. Survey data were collected and stored independently of examination data; a 7-digit ID number was used to link respondent demographic data provided by the National Board of Medical Examiners to the respondent dataset.

We analyzed survey data from PGY-1, 2, and 3 residents who gave permission for their data to be used in scholarly publications. We collapsed adjacent Likert scale categories (eg, very unsatisfied and somewhat unsatisfied) for statistical analysis. We conducted analyses across PGYs and between those who completed in-person (PGY-3) and virtual (PGY-1 and 2) interviews. We used Pearson's chi-square test of independence for group comparisons. We analyzed questions using the full 5-point Likert scale and by collapsing responses into 3 categories. We conducted limited secondary analyses to measure potential differences in demographic variables and interview day perceptions. Finally, to examine the effect of possible covariates, we dichotomized results as "satisfied/very satisfied" vs "unsatisfied/very unsatisfied" or "neutral" and conducted a multivariable logistic regression comparing the effect of in-person and virtual interviews on overall interview satisfaction. We used α=0.05 as the threshold for statistical significance for all comparisons. Data were analyzed in Q Professional (5.16.2.0). This study was approved by the Northwestern University Institutional Review Board.

Overall, 29 776 PGY-1 to PGY-3 residents from US and Puerto Rico residency programs sat for the 2022 IM-ITE. Of these, 23 161 completed the survey and permitted their data to be used, a response rate of 77.8%. Demographic information for respondents is presented in Table 1. There was an even distribution among PGYs. Fifty-five percent of respondents self-identified as male, and the majority attended US medical schools. In presenting our data, we found that collapsing categories yielded the same overall conclusions while presenting a clearer picture of the results.

Table 1

Demographic Information for Participants in the 2022 American College of Physicians Internal Medicine In-Training Examination Supplemental Questionnaire


Regardless of format, residents overall were satisfied with the information gathered from interview day experiences and were confident in their ability to choose between residencies based on the information they gathered (Figure 1). When grouping those who completed virtual interviews together (PGY-1 and PGY-2), residents who completed in-person interviews were more satisfied with the information they gathered (in-person 5938 of 7410 [80.1%]; 95% CI [79.2, 81.0] vs virtual 12 070 of 15 751 [76.6%]; 95% CI [76.0, 77.3]; P<.001) and, conversely, virtual interviewees were more dissatisfied (in-person 404 of 7410 [5.5%]; 95% CI [5.0, 6.0] vs virtual 1429 of 15 751 [9.1%]; 95% CI [8.6, 9.5]; P<.001). In-person interviewees were more likely to report confidence in their ability to choose among residency programs based on interview day information (in-person 5968 of 7410 [80.5%]; 95% CI [79.6, 81.4] vs virtual 11 888 of 15 751 [75.5%]; 95% CI [74.8, 76.1]; P<.001). When asked specifically about their current program, residents from all years reported success in gathering accurate information about their program (Figure 1). When comparing in-person and virtual interviews, a greater proportion of in-person interviewees reported success in gathering information about their current residency program (in-person 5979 of 7410 [80.7%]; 95% CI [79.8, 81.6] vs virtual 12 365 of 15 751 [78.5%]; 95% CI [77.9, 79.1]; P<.001). However, more virtual interviewees reported that their residency program's culture was accurately portrayed during the interview day (in-person 5917 of 7410 [79.9%]; 95% CI [78.9, 80.7] vs virtual 13 076 of 15 751 [83.0%]; 95% CI [82.4, 83.6]; P<.001).
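The 95% CIs reported for each proportion can be reproduced with a standard Wald (normal-approximation) interval. The article does not state which interval method was used, so this is an assumption, but it matches the reported bounds for the in-person satisfaction estimate:

```python
# Hypothetical reconstruction of a reported 95% CI using a Wald interval:
# p ± 1.96 * sqrt(p * (1 - p) / n). The interval method is an assumption;
# the counts (5938 of 7410 in-person residents satisfied) are from this article.
import math

k, n = 5938, 7410
p = k / n
half_width = 1.96 * math.sqrt(p * (1 - p) / n)
lo, hi = 100 * (p - half_width), 100 * (p + half_width)
print(f"{100 * p:.1f}% (95% CI [{lo:.1f}, {hi:.1f}])")  # 80.1% (95% CI [79.2, 81.0])
```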

Figure 1

Responses to 2022 Resident Survey Questions Regarding Satisfaction With, Confidence in, and Quality of Information Gathered During Interview Days, Organized by Virtual vs In-Person Interviews


When analyzed by PGY, respondents in all years reported a high degree of overall satisfaction with and confidence in the information gleaned from residency interviews (Figure 2). When asked about satisfaction with and confidence in information gathered from interviews, both PGY-1 and PGY-2 respondents reported more dissatisfaction than PGY-3 respondents, and PGY-2 respondents were more dissatisfied and less confident than PGY-1 respondents (Figure 2). Conversely, more respondents from the PGY-1 class reported that their current residency program's culture was accurately portrayed on interview day, compared with the PGY-2 and PGY-3 classes.

Figure 2

Responses to 2022 Resident Survey Questions Regarding Satisfaction With, Confidence in, and Quality of Information Gathered During Interview Days, Organized by PGY Year

Abbreviation: PGY, postgraduate year.

Note: Percentages may not equal 100% due to rounding.

a Denotes statistical significance at P<.05 level between the specified group and PGY-3 respondents.

b Denotes statistical significance at P<.05 level between the specified group and PGY-1 respondents.

c Denotes statistical significance at P<.05 level between the specified group and PGY-2 respondents.


Respondents from all class years reported interview day sessions with residents, one-on-one interviews, and conversations with peers, mentors, or advisors as the most important factors in creating an informed rank list (Table 2). Residents who completed in-person interviews were more likely than virtual interviewees to report sessions with residents as their most important factor. Conversely, more virtual interviewees reported introductory materials as their most important factor; the difference was small but statistically significant (Table 2). Additionally, more residents attending virtual interviews reported information from online sources as their most important resource, both sources from residency programs (in-person 126 of 7410 [1.7%]; 95% CI [1.4, 2.0] vs virtual 580 of 15 751 [3.7%]; 95% CI [3.4, 4.0]; P<.001) and other online sources (in-person 126 of 7410 [1.7%]; 95% CI [1.4, 2.0] vs virtual 501 of 15 751 [3.2%]; 95% CI [2.9, 3.5]; P<.001).

Table 2

Participant Responses to the 2022 IM-ITE Question, “Which of the Following Factors Was the Most Important in Helping You Make an Informed Rank List?”


Residents identifying as female reported more satisfaction with information gathered from interview day experiences (female 8400 of 10 454 [80.4%]; 95% CI [79.6, 81.1] vs male 9582 of 12 666 [75.7%]; 95% CI [74.9, 76.4]; P<.001) and more confidence in the information gathered (female 8255 of 10 454 [79.0%]; 95% CI [78.2, 79.7] vs male 9574 of 12 666 [75.6%]; 95% CI [74.8, 76.3]; P<.001). Respondents from non-US medical schools reported more satisfaction than residents who attended US medical schools, though both groups reported a high degree of satisfaction overall (international medical graduate [IMG] 6320 of 7940 [79.6%]; 95% CI [78.7, 80.5] vs US medical graduate 11 687 of 15 219 [76.8%]; 95% CI [76.1, 77.5]; P<.001). Finally, we constructed a multivariable logistic regression model to determine the effect of covariates on interview satisfaction. In an unadjusted analysis, we found a statistically significant difference between virtual and in-person interviews (OR 0.81, P<.001), indicating lower satisfaction with virtual interviews, as above. This difference was unchanged with the addition to the model of all covariates listed in Table 1 (medical school type, self-identified gender, primary language, and program type) (OR for satisfaction in multivariable model 0.81, P<.001). In this analysis, we found that being an IMG (OR 1.12), female gender (OR 1.30), and being a primary English speaker (OR 1.05) were all significantly associated with greater interview day satisfaction (P<.001 for each).
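The unadjusted odds ratio reported above can be recovered directly from the dichotomized satisfaction counts in this article. This is a standard odds-ratio calculation for illustration, not the authors' regression output:

```python
# Unadjusted odds ratio for satisfaction, virtual vs in-person, computed from
# the dichotomized counts in this article (satisfied vs neutral/unsatisfied).
# Illustrative only; the article's OR comes from its logistic regression model.
virtual_sat, virtual_n = 12070, 15751
inperson_sat, inperson_n = 5938, 7410

odds_virtual = virtual_sat / (virtual_n - virtual_sat)
odds_inperson = inperson_sat / (inperson_n - inperson_sat)
odds_ratio = odds_virtual / odds_inperson
print(f"OR = {odds_ratio:.2f}")  # OR = 0.81, matching the unadjusted analysis
```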

This national survey of IM residents found statistically significant differences in residents' perceptions of their interview day experiences, depending on whether respondents completed virtual or in-person interviews. However, we believe these differences are subtle and likely reached statistical significance because of the large sample size. These results are consistent with other recent studies from various GME subspecialties, demonstrating a high degree of satisfaction overall among learners at multiple levels who completed virtual interviews.11,12  Additionally, we found higher rates of satisfaction with and confidence in information gained during interviews among PGY-1 respondents compared with PGY-2 respondents. This may suggest increasing comfort with the virtual interview format among applicants, increasing proficiency in residency programs' ability to communicate information virtually, or both.

While residency programs adopted virtual interviews quickly in response to COVID-19, this interview format has demonstrated durability even as other facets of life returned to pre-pandemic norms.13  GME programs and prospective trainees alike have perceived benefits and drawbacks of virtual interviews, though comparative studies have been small single-institution analyses.11  Previous studies across GME specialties demonstrate the tradeoffs created by virtual interviews, with lower costs counterbalanced by issues related to possible application inflation and inequitable distributions of interview offers.14,15  While virtual interviews may help with some aspects of equity, particularly benefitting applicants without the financial means to travel frequently, the phenomenon of "application hoarding" becomes easier in a virtual interview format, exacerbating other inequities between program applicants.16-18  Additionally, virtual interviews may impede applicants from accurately experiencing each residency's training environment. However, in this study we found that a greater proportion of virtual interviewees agreed that their current program's culture was accurately portrayed, suggesting that many were able to obtain at least some information regarding a program's culture in this format. While these factors should play important roles in determining interview formats in future Match cycles, this study adds to the growing body of evidence that applicants perceive virtual interview formats as an "applicant-centered" approach to the selection process.

Respondents reported using information from their peers, current program residents, and their advisors to make important decisions about how to rank programs. A 2021 survey of applicant perspectives on virtual interviews found that interactions with residents, in particular opportunities to observe program culture, were highly valued even in a virtual format.19  In that study and in our analysis, respondents reported slightly less satisfaction with online resident interactions than with in-person ones. There are many possible reasons for this finding, including fewer one-on-one encounters with residents (such as meals before or during the interview day), lack of opportunity for informal interactions that would have occurred during down time on an in-person interview day, or fewer opportunities to ask in-depth or candid questions in a virtual format. However, a plurality of respondents noted sessions with residents as a critical method for making their rank list, likely as a surrogate for estimating their fit within a particular residency program's culture; thus, many virtual applicants were still able to use information from conversations with residents in a productive manner. While applicants may have used noncurated online resources (eg, discussion boards) in the process of information-gathering during the COVID-19 pandemic, when asked about their most important resource, respondents in our study overwhelmingly favored more traditional information sources, such as discussions with medical school advisors, conversations during interviews, and discussions with peers. This is consistent with prior studies demonstrating the relative lack of informational depth and variable quality associated with many online materials, including program websites and the Fellowship and Residency Electronic Interactive Database (FREIDA).20,21  Future research is needed to further clarify applicant perceptions regarding information sources used to make informed residency program selections.

This study has several limitations. As we asked participants about events from their past, survey responses may be subject to recall bias; reported perceptions for each cohort of residents may have been affected by the length of time between their interview year and our questionnaire. As neither group experienced interviews in the alternative format, it is possible that our results reflect cognitive bias more than informed judgment. The accuracy of residents' perceptions cannot be determined using our methodology; however, in the process of program ranking, perceptions and impressions, whether accurate or not, are crucial when formulating a rank list. We compared different cohorts of residents from different years and acknowledge that there are complex changes in each application year, such as program signaling, that could have affected the results. Not all IM residents are required to sit for the IM-ITE; however, our sample is large and representative of the overall population, with a high response rate.22  Finally, respondents were exclusively in IM residency programs; different specialties may have yielded different results.

This study adds to the growing body of literature on virtual interviewing. In addition to providing applicants with similarly valued information, virtual interviews reduce travel costs, produce fewer carbon emissions, and promote innovation among programs in developing new ways of showcasing their programs. Further research is needed to determine best practices in interview day format, content, and delivery to optimize the experience for both programs and applicants.

In this large cohort study evaluating the interview experiences of IM residents, we found statistically significant differences between residents’ perceptions of in-person and virtual interviews, though these differences were small. All respondents, regardless of format, reported a high degree of satisfaction with and confidence in the information they gathered from residency interviews.

The authors would like to acknowledge Michael Kisielewski, Assistant Director of Surveys and Research at the Alliance for Academic Internal Medicine, for his continual and ongoing contributions to scholarly work produced by the CDIM Survey and Scholarship Committee.

1. Hammoud MM, Standiford TC, Carmody JB. The 2020-2021 residency application cycle: lessons learned and lingering problems. JAMA. 2021;325(22):2249-2250.

2. Association of American Medical Colleges. Interviews in GME: where do we go from here?

3. Accreditation Council for Graduate Medical Education. Coalition for Physician Accountability recommendations on virtual interviews for 2021-2022 recruitment season.

4. Ashlagi I, Love E, Reminick JI, Roth AE. Early vs single match in the transition to residency: analysis using NRMP data from 2014 to 2021. J Grad Med Educ. 2023;15(2):219-227.

5. Association of American Medical Colleges. The cost of interviewing for residency.

6. Donahue LM, Morgan HK, Peterson WJ, Williams JA. The carbon footprint of residency interview travel. J Grad Med Educ. 2021;13(1):89-94.

7. Huppert LA, Hsiao EC, Cho KC, et al. Virtual interviews at graduate medical education training programs: determining evidence-based best practices. Acad Med. 2021;96(8):1137-1145.

8. Wolff M, Burrows H. Planning for virtual interviews: residency recruitment during a pandemic. Acad Pediatr. 2021;21(1):24-31.

9. Artino AR Jr, La Rochelle JS, Dezee KJ, Gehlbach H. Developing questionnaires for educational research: AMEE guide no. 87. Med Teach. 2014;36(6):463-474.

10. Magee C, Rickards G, Byars LA, Artino AR Jr. Tracing the steps of survey design: a graduate medical education research example. J Grad Med Educ. 2013;5(1):1-5.

11. Lee E, Terhaar S, Shakhtour L, et al. Virtual residency interviews during the COVID-19 pandemic: the applicant's perspective. South Med J. 2022;115(9):698-706.

12. Huppert LA, Hsu G, Elnachef N, et al. A single center evaluation of applicant experiences in virtual interviews across eight internal medicine subspecialty fellowship programs. Med Educ Online. 2021;26(1):1946237.

13. Muallem E, Burrows H, Wolff M. Planning for virtual interviews: residency recruitment during a pandemic—an update. Acad Pediatr. 2023;23(7):1309-1311.

14. Shen AH, Shiah E, Sarac BA, et al. Plastic surgery residency applicants' perceptions of a virtual interview cycle. Plast Reconstr Surg. 2022;150(4):930-939.

15. Carmody JB, Rosman IS, Carlson JC. Application fever: reviewing the causes, costs, and cures for residency application inflation. Cureus. 2021;13(3):e13804.

16. Manjunath V, Morrill T. Interview hoarding. Theor Econ. 2023;18(2):503-527.

17. Boyd CJ, Ananthasekar S, Vernon R, King TW, Saadeh PB. Interview hoarding: disparities in the integrated plastic surgery application cycle in the COVID-19 pandemic. Ann Plast Surg. 2021;87(1):1-2.

18. Association of American Medical Colleges. Open letter on residency interviews from Alison Whelan, MD, AAMC Chief Medical Education Officer.

19. Tout W, Oyola S, Sharif Z, VanGompel EW. Applicant evaluation of residency programs in a virtual format: a mixed-methods study. Fam Med. 2022;54(10):804-813.

20. Kirkendoll SD, Carmody JB, Rhone ET. Information quality for residency applicants in Fellowship and Residency Electronic Interactive Database (FREIDA) and program websites. Cureus. 2021;13(3):e13900.

21. Pollock JR, Weyand JA, Reyes AB, et al. Descriptive analysis of components of emergency medicine residency program websites. West J Emerg Med. 2021;22(4):937-942.

22. Association of American Medical Colleges. 2023 Report on Residents, Table B3. Number of Active Residents, by Type of Medical School, GME Specialty, and Gender.

The online supplementary data contains the survey used in the study.

Funding: The authors report no external funding source for this study.

Conflict of interest: The authors declare they have no competing interests.
