Background Program signaling is an innovation that allows applicants to express interest in specific programs while providing programs the opportunity to review genuinely interested applicants during the interview selection process.

Objective To examine the influence of program signaling on “selected to interview” status across specialties in the 2022 Electronic Residency Application Service (ERAS) application cycle.

Methods Dermatology, general surgery-categorical (GS), and internal medicine-categorical (IM-C) programs that participated in the signaling section of the 2022 supplemental ERAS application (SuppApp) were included. Applicant signal data were collected from SuppApp, applicant self-reported characteristics were collected from the MyERAS Application for Residency Applicants, and 2020 program characteristics were collected from the 2020 GME Track Survey. Each applicant’s probability of being selected for interview, determined by the “selected to interview” status in the ERAS Program Director’s WorkStation, was analyzed using logistic regression.

Results Dermatology had a 62% participation rate (73 of 117 programs), GS a 75% participation rate (174 of 232 programs), and IM-C an 86% participation rate (309 of 361 programs). In all 3 specialties examined, on average, signaling increased the likelihood of being selected to interview compared to applicants who did not signal. This finding held across gender and underrepresented in medicine (UIM) groups in all 3 specialties, across applicant types (MDs, DOs, international medical graduates) for GS and IM-C, and after controlling for United States Medical Licensing Examination Step 1 scores.

Conclusions Although there was variability by program, signaling increased likelihood of being selected for interview without negatively affecting any specific gender or UIM group.

A steadily increasing number of applications to US residency programs over the last decade represents a critical challenge to the resident selection system.1  Programs with limited resources to review increasingly large numbers of applications frequently overemphasize academic metrics in the screening process rather than conducting a holistic review of applications to identify a better prepared and more diverse class of residents.2  Program directors seeking to identify prepared applicants who are genuinely interested in their programs lack sufficient tools to meaningfully review the large volume of applications at the screening stage, so program signaling was introduced in 2020 to help address this challenge.

The idea of signaling in the residency application process has been raised by a number of specialties, with some recognizing the potential benefit of identifying applicants who are truly interested in a program, and others acknowledging the potential stress that this type of system may place on an applicant in determining which programs to signal.3-6  The appeal of signaling is that it may be more equitable and transparent than other informal ways to “signal” programs, such as visiting rotations, faculty or mentor connections, or self-advocacy, because it does not have a financial cost and is accessible to all applicants.

Otolaryngology piloted preference signaling during the 2021 residency application cycle.6  Applicants and program directors had positive reactions to the pilot in terms of perceived utility and satisfaction.7  Results showed the rate of receiving an interview offer was significantly higher from signaled programs (58%) compared with non-signaled programs (14%), a significant finding across all competitiveness quartiles.8  Although these initial results are promising for preference signaling, there are 2 main shortcomings. First, otolaryngology is a small, highly competitive specialty with few osteopathic and international applicants,9,10  and therefore there are questions about the generalizability of their findings to other specialties. Second, signals have not yet been evaluated for equity or fairness across different demographic groups.

This study explores the relationship between program signals and likelihood of being selected for interview in 3 specialties: dermatology, general surgery-categorical (GS), and internal medicine-categorical (IM-C). Furthermore, we investigate whether this process is equitable among applicants based on gender, underrepresented in medicine (UIM) status, and type of applicant (ie, allopathic [MD], osteopathic [DO], or international medical graduate [IMG]).

What Is Known

Program signaling holds promise for communicating applicants’ genuine interest in specific programs. Because this innovation has been in use for only the past 2 years, data on outcome measures are still emerging.

What Is New

In the 3 pilot specialties of internal medicine, general surgery, and dermatology, analysis of the Association of American Medical Colleges signaling data demonstrated that signaling was associated with an increased likelihood of being invited to interview without negatively affecting any specific gender or underrepresented in medicine group.

Bottom Line

These findings provide pilot data on signaling, allowing program directors to make better-informed decisions about how to use signaling information. Applicants will also benefit from understanding these data as they plan their application strategies.

Coauthors who were Association of American Medical Colleges (AAMC) employees were able to access and analyze all data as members of the AAMC admissions and selection research and development team. Members of the author team who were not AAMC employees did not have access to data files but were able to review results.

Program Samples

To be included in the final analytic sample, programs had to meet the following inclusion criteria: (1) the program participated in the supplemental Electronic Residency Application Service (ERAS) application (SuppApp) pilot in the 2022 ERAS cycle; (2) one or more applicants signaled the program; and (3) the program reported at least a 7:1 ratio of interview selections in the ERAS Program Director’s WorkStation (PDWS) per available residency position. The 7:1 selected to interview per available residency position inclusion rule was created based on feedback from program directors to reflect realistic average behavior across programs and to mitigate the risk of including programs with incomplete interview selection data in the PDWS. The number of programs and applicants who met inclusion criteria for each sample is shown in the Figure, including the analytic samples for gender, UIM, and applicant type. For these analytic samples, programs were excluded if a logistic regression model could not be fit for that program (based on an error returned when there was an insufficient number of both signaling and non-signaling applicants in each category). The final analytic samples for applicant type included MD, DO, and IMG graduates, except for dermatology, where the number of DO and IMG applicants was too small to analyze.
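For illustration only, the following is a minimal R sketch of how these program-level inclusion rules could be applied; the column names (participated_in_suppapp, n_signals_received, n_selected_to_interview, n_positions) are hypothetical and do not reflect the actual AAMC data structure.

# Minimal sketch of the program inclusion rules (column names are illustrative,
# not the actual AAMC data structure).
library(dplyr)

apply_inclusion_rules <- function(programs) {
  programs %>%
    filter(
      participated_in_suppapp,                     # rule 1: participated in the SuppApp pilot
      n_signals_received >= 1,                     # rule 2: received at least 1 program signal
      n_selected_to_interview >= 7 * n_positions   # rule 3: at least 7 interview selections per position
    )
}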

Figure 1

Inclusion Rule Flowchart for Dermatology, Internal Medicine-Categorical, and General Surgery Analytic Samples

a Applicants were excluded from the final analytic sample if they only sent program signals to programs that did not meet the study’s program inclusion criteria.

Abbreviations: IM-C, internal medicine-categorical; GS, general surgery; SuppApp, supplemental Electronic Residency Application Service application; UIM, underrepresented in medicine.


Predictor Data

All predictor and outcome data were accessed by internal AAMC research data stewards.

Program (Preference) Signals:

Program signals were binary indicators of an applicant’s interest in a program at the time of application. Program signals were collected via the SuppApp for the 2022 application cycle from September 1 through 30, 2021. IM-C and GS applicants had the opportunity to send up to 5 program signals; dermatology applicants could send 3. All applicants were instructed not to send signals to their home program or a program where they completed an in-person away rotation or subinternship.

Applicant Characteristics:

Applicants’ gender, race/ethnicity, applicant type, and most recent United States Medical Licensing Examination (USMLE) Step 1 score were collected from the MyERAS application. Due to small sample sizes for some racial/ethnic groups, race/ethnicity was collapsed into 2 groups for analysis: UIM and non-UIM.11  UIM was defined as anyone in the sample who self-identified as one or more of the following racial and ethnic categories: American Indian or Alaska Native; Black or African American; Hispanic, Latino, or of Spanish Origin; or Native Hawaiian or Other Pacific Islander. Non-UIM was defined as anyone in the sample who self-identified as only White or Asian. Those who self-identified as “Other” alone were not included in these analyses.
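For illustration only, a minimal R sketch of this collapsing rule is shown below; the function and input format are hypothetical, although the category labels follow the study definition.

# Sketch of the UIM/non-UIM collapsing rule applied to an applicant's self-identified
# race/ethnicity categories (function and input format are illustrative).
uim_categories <- c("American Indian or Alaska Native",
                    "Black or African American",
                    "Hispanic, Latino, or of Spanish Origin",
                    "Native Hawaiian or Other Pacific Islander")

classify_uim <- function(selected_categories) {
  # `selected_categories` is a character vector of every category an applicant selected
  if (any(selected_categories %in% uim_categories)) {
    "UIM"
  } else if (all(selected_categories %in% c("White", "Asian"))) {
    "non-UIM"
  } else {
    NA_character_  # e.g., "Other" alone: excluded from these analyses
  }
}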

Program Characteristics:

The number of entering residents in 2020, geographic region, and average Step 1 score were used to describe programs in the sample. These data were collected from the 2020 GME Track Survey.

Outcome Data

Selected to Interview Status:

The chief outcome of interest was “selected to interview” status, a binary indicator provided by each participating program indicating whether the applicant was selected for interview at that program. Selected to interview status was collected in the PDWS and does not include data from programs that use a third-party interview scheduling tool.

Analyses

All analyses were conducted using R (The R Foundation). A series of logistic regression analyses were conducted separately for each program and for each applicant group because programs differed greatly in how signals were incorporated into their selection processes and because the qualifications of each applicant group differed.

Model 1 explored the relationship between applicants’ signal status and whether they were selected to interview for each program. Signal status and selected to interview were treated as binary variables.
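As an illustrative sketch only (hypothetical column names, with the per-program loop and error handling simplified), Model 1 can be expressed as a separate logistic regression fit to each program's applicant pool:

# Sketch of Model 1: per-program logistic regression of interview selection on signal status.
# Column names (program_id, selected_to_interview, signaled) are illustrative.
fit_model1 <- function(applicants) {
  lapply(split(applicants, applicants$program_id), function(d) {
    tryCatch(
      glm(selected_to_interview ~ signaled, data = d, family = binomial()),
      # programs for which the model cannot be fit are dropped, mirroring the exclusion rule
      error = function(e) NULL
    )
  })
}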

Model 2 explored the relationship between applicants’ signal status and whether they were selected to interview, while accounting for the most recent USMLE Step 1 score. For the regression analyses in model 2, USMLE Step 1 scores were treated as a continuous covariate. However, for simplicity of presentation, probability results are displayed for 3 USMLE Step 1 score tercile categories, with each tercile corresponding to a range of scores that divides the applicant pool roughly into the bottom third, middle third, and top third of scores for each specialty.
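A comparable sketch for Model 2, again with hypothetical column names, adds the Step 1 score as a continuous covariate; terciles are computed only for presenting predicted probabilities, not for estimation. (A signal-by-score interaction term could be added to examine the moderation reported in the Results.)

# Sketch of Model 2: adds the most recent USMLE Step 1 score as a continuous covariate.
fit_model2 <- function(d) {
  glm(selected_to_interview ~ signaled + step1_score, data = d, family = binomial())
}

# Terciles are used only for displaying predicted probabilities.
step1_terciles <- function(scores) {
  cut(scores,
      breaks = quantile(scores, probs = c(0, 1/3, 2/3, 1), na.rm = TRUE),
      labels = c("bottom third", "middle third", "top third"),
      include.lowest = TRUE)
}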

Results were aggregated across programs by computing the median probability of receiving an interview, the median 95% confidence interval across programs, and the minimum and maximum predicted probabilities.
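A minimal sketch of this aggregation step, assuming each program's fitted model has already been converted to a predicted probability and 95% CI for a given signal status and applicant group (object and column names are illustrative):

# Sketch: summarize per-program predicted probabilities across programs.
# `program_preds` is assumed to contain one row per program with columns
# prob, ci_lower, and ci_upper.
aggregate_programs <- function(program_preds) {
  with(program_preds, c(
    median_prob     = median(prob),
    median_ci_lower = median(ci_lower),
    median_ci_upper = median(ci_upper),
    min_prob        = min(prob),
    max_prob        = max(prob)
  ))
}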

This study was reviewed by the AAMC Human Subjects Research Protection Program, and the data were approved for publication by the institutional review board of the American Institutes for Research. Participants provided consent for their data to be used in research as part of submitting their applications through ERAS; interview data were collected through the PDWS.

As shown in online supplementary data Table 1, analytic samples were generally representative of each specialty’s program population. Table 1 summarizes the overall analytic sample for each specialty by applicant demographic groups.

Table 1

Demographics, Mean Number of Applications, Signals, and Selected to Interview Status for Applicants in the Overall Analytic Sample of Each Specialty


Table 2 summarizes the mean, standard deviation, and range for key variables in the overall analytic sample for dermatology, IM-C, and GS. In all 3 specialties examined, on average, signaling was associated with a statistically significant increase in the likelihood of being selected to interview compared to applicants who did not signal (Table 3).

Table 2

Descriptive Statistics for Each Specialty Using the Overall Analytic Sample


In all 3 specialties, there was considerable variability in the effect of signaling by program, with the median predicted probabilities ranging from .05 to .80 for dermatology, .06 to .99 for IM-C, and .03 to .70 for GS, suggesting that programs used signals differently, and thus the “value” of a signal differed by program.

Additionally, for all 3 specialties examined, the finding that signaling increased rates of being selected to interview did not vary by gender (see Table 3 and Figure; online supplementary data Figures 2, 5, 9) or by UIM status (see Table 3 and Figure; online supplementary data Figures 3, 6, 10), with no statistically significant differences in the average probabilities for each group except for UIM status in IM-C.

However, the relationship between signaling and being selected to interview did vary by applicant type for IM-C and GS applicants (see Figure; online supplementary data Figures 7, 11).

For all 3 specialties, signaling increased rates of being selected to interview, even after accounting for applicants’ USMLE Step 1 scores. Results also showed that signaling moderated the relationship between USMLE Step 1 scores and being selected to interview, with the association being stronger for applicants who had higher Step 1 scores across all specialties (online supplementary data Table 2 and Figures 12-17).

Table 3

Predicted Probability of Being Selected to Interview for Each Specialty


Program signals are one of several innovations that have been introduced into the residency recruitment process,12-14  and the current study is the largest and first multispecialty study investigating the impact of program signals on the likelihood of being selected to interview. The findings demonstrate that program signals significantly increase the likelihood of an applicant being selected to interview across gender, UIM groups, and applicant type, even after accounting for USMLE Step 1 scores. Although MD applicants who did not signal were more likely to be selected for interview than IMGs who did signal, signaling a program still increased the rate of being selected for interview for IMG applicants.

Program signaling may be an attractive option to reduce ballooning application numbers and cost as well as barriers to holistic review. Signals are not intended, however, to be a sole determinant of an interview invitation or rank order; they should always be used in the context of a holistic, comprehensive review of an applicant’s unique attributes and how they intersect with program values.

As with all studies, some limitations exist. These findings represent results for a single year of selection that need to be replicated in future cycles, and programs can and do use platforms outside of the PDWS for interview invitations. The inclusion rule chosen to mitigate this limitation also may not apply to all programs. However, the majority of programs in the included specialties used the PDWS to indicate applicants selected for interview at least once during the 2022 ERAS cycle (97%, 94%, and 77% of programs in IM-C, GS, and dermatology, respectively), and the characteristics of the programs in the study samples are largely representative of population characteristics, pointing to generalizability. Additionally, this is a retrospective study that makes use of data from a previous cycle. USMLE Step 1 scores were included to provide a complete and accurate picture of the selection process at the time (with residency program directors potentially making use of scores rather than pass/fail status to make selection decisions); inclusion of these data does not endorse the use of USMLE scores for admissions and selection decisions by programs. Finally, the current study does not assess how program characteristics affect an applicant’s signal distribution, which represents an avenue for future research.

Though program signaling increased interview selection rates overall, the 3 specialties vary considerably on a number of factors, including the number of residency positions and corresponding number of applications received, the percentage of matched applicants and programs, and the percentage of applicant types. The same can be said for individual programs within specialties, suggesting that programs may use signals differently in deciding whom to select for interview. The current study does not investigate how individual programs incorporate a program signal into their interview processes, and further investigation may reveal differential trends based on program characteristics, such as application volume, number of signals received, or the importance of other tools in the selection process. Within applicant types, program signaling had the greatest impact for MD applicants, followed by DOs; therefore, more research is needed to understand why signaling confers less value to IMGs and DOs than to MDs, after controlling for USMLE Step 1 scores.

Overall, these findings demonstrate that signaling can increase interview selection rates at signaled programs for applicants, on average, across 3 specialties, building upon previous similar findings for applicants in otolaryngology.10  These findings hold across Step 1 score ranges and across differing gender and ethnic groups, demonstrating that signaling does not appear to function differentially for varying subgroups in the 2022 application cycle samples.

The authors would like to thank Aileen Dowden from the Association of American Medical Colleges Admissions, Selection, Research and Development team for her contributions to analyses, results, tables, and figures, as well as Erin Helbling for her review of formatting, citations, and content. The authors also wish to thank the Alliance for Academic Internal Medicine (AAIM), the Association of Program Directors in Surgery (APDS), and the Association of Professors of Dermatology for their continued engagement and for fostering stakeholder support of program signaling.

1. Association of American Medical Colleges. ERAS statistics 2022. Published 2022.
2. Aibana O, Swails JL, Flores RJ, Love LT. Bridging the gap: holistic review to increase diversity in graduate medical education. Acad Med. 2019;94(8):1137-1141.
3. Salehi PP, Benito D, Michaelides E. A novel approach to the National Resident Matching Program—the star system. JAMA Otolaryngol Head Neck Surg. 2018;144(5):397.
4. Bernstein J. Not the last word: want to match in an orthopedic surgery residency? Send a rose to the program director. Clin Orthop Relat Res. 2017;475(12):2845-2849.
5. Whipple ME, Law AB, Bly RA. A computer simulation model to analyze the application process for competitive residency programs. J Grad Med Educ. 2019;11(1):30-35.
6. Chang CWD, Pletcher SD, Thorne MC, Malekzadeh S. Preference signaling for the otolaryngology interview market. Laryngoscope. 2020;131(3):e744-e745.
7. Chang CWD, Thorne MC, Malekzadeh S, Pletcher SD. Two-year interview and match outcomes of otolaryngology preference signaling. Otolaryngol Head Neck Surg. 2023;168(3):377-383.
8. Pletcher SD, Chang CWD, Thorne MC, Malekzadeh S. The otolaryngology residency program preference signaling experience. Acad Med. 2021;97(5):664-668.
9. Association of American Medical Colleges. ERAS statistics 2023. Published 2023. Accessed September 22, 2023. https://www.aamc.org/data-reports/data/eras-statistics-data
10. National Resident Matching Program. Results and data: 2023 Main Residency Match.
11. Association of American Medical Colleges. Underrepresented in medicine definition.
12. Caretta-Weyer HA. An outcomes-oriented approach to residency selection: implementing novel processes to align residency programs and applicants. Acad Med. 2022;97(5):626-630.
13. Radabaugh CL, Hawkins RE, Welcher CM, et al. Beyond the United States Medical Licensing Examination score. Acad Med. 2019;94(7):983-989.
14. Bird SB, Hern HG, Blomkalns A, et al. Innovation in residency selection: the AAMC standardized video interview. Acad Med. 2019;94(10):1489-1497.

The online version of this article contains further data from the study.

Funding: This project was supported by the Association of American Medical Colleges (AAMC) as part of their routine operating budget.

Conflict of interest: Bobby Naemi, PhD, and Dana Dunleavy, PhD, are employees of the AAMC.

A subset of these data were previously presented at the NRMP Transition to Residency Conference, October 6-8, 2022, San Diego, California.
