During recruitment, program directors (PDs) embody roles that run the gamut from marketing director, webmaster, adviser, and advocate to used-car salesman, interrogator, proud parent, and spin doctor. The residency application process begins immediately after the previous year's Match cycle (figure). Initially, PDs examine the previous Match for (1) matched programs of ranked applicants; (2) characteristics of matched and unmatched ranked applicants; (3) applicant characteristics and timing of interview cancellations; (4) lowest-ranked Match compared with previous years; (5) recruitment communication, marketing, faculty and resident involvement, and interviews; and (6) overall costs. Next, applicants from a PD's home institution schedule appointments with the PD to confirm information provided by student affairs deans, clerkship directors,1  and various websites regarding the application process.

Figure. Recruitment Cycle for a Program Director

By September 15, PDs are inundated with applications far in excess of the time allotted for review. For example, if an anesthesiology PD at our institution uses 10 minutes per application review, and the program routinely receives more than 1000 applications for 8 positions, then at least 167 hours are required to review all applicants. If the PD reviews all applications, without breaks, 8 hours a day for 2 nonclinical days per week, it would take more than 10 weeks to review all applications! Thus, the reality is that some deserving applicants do not receive a comprehensive review; PDs increasingly work during nights, weekends, and clinical time, and associates must assist in the process.
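The arithmetic above can be checked with a short back-of-the-envelope calculation, using the figures from the text (roughly 1000 applications, 10 minutes each, reviewed 8 hours a day on 2 nonclinical days per week):

```python
# Review workload for a program director, using the figures in the text.
applications = 1000
minutes_per_application = 10

total_hours = applications * minutes_per_application / 60
hours_per_week = 8 * 2  # two 8-hour nonclinical days per week

print(f"Total review time: {total_hours:.0f} hours")   # ~167 hours
print(f"Weeks required: {total_hours / hours_per_week:.1f}")  # ~10.4 weeks
```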

The interview process varies greatly among specialties and programs, yet all programs experience the financial and time-related costs of interviewing. Clinical service and educational disruptions punctuate the interview period from October to February.

When applications outnumber positions by 120 to 1, as typically occurs for competitive programs, programs may appear to hold the advantage in the balance between supply and demand. Dividing the process into 2 phases, application and interview, shows how the balance shifts. In the earlier anesthesiology example, a subset of roughly 150 top-tier candidates applies to, and is granted the vast majority of interviews at, the same set of regional peer programs with similar demographics. Once programs reach the interview phase, demand outstrips supply: roughly 350 positions are available at these peer-grouped programs, yet applicants offered interviews may hold their slots until the last minute, preventing other candidates from being interviewed and leaving roughly 200 positions at risk of not filling. Moreover, applicants who are not top tier are unnecessarily shunted to less competitive programs or to a failure to match. Because of interview hoarding by top applicants, some PDs report forgoing interview offers to top applicants who historically have matched elsewhere and instead favoring lower-tier candidates.2 
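The interview-phase imbalance described above reduces to simple arithmetic on the text's anesthesiology example:

```python
# Interview-phase supply and demand, from the text's anesthesiology example.
top_tier_candidates = 150    # candidates holding most interview slots at peer programs
peer_program_positions = 350  # positions available across those peer-grouped programs

positions_at_risk = peer_program_positions - top_tier_candidates
print(positions_at_risk)  # 200 positions at risk of going unfilled
```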

Another unintended consequence of overapplying is the urgency for PDs to offer early interviews. The PDs know that applicants apply to far more programs than they will ultimately visit for an interview. To pitch their program and avoid cancellations, PDs make early interview offers, prior to the release of the Medical Student Performance Evaluation (MSPE), with interview dates as early as October. Early interview offers prevent holistic review, however, because decisions are made based on incomplete applications.

Application creep occurs primarily because “we” tell students to apply to more programs. We, the student affairs deans, look at the prior year application numbers and figuratively push down on the accelerator. We, the accreditation and clinical association bodies, forge an aggressive campaign for Medicare dollars based on the premise of a looming graduate medical education position shortfall. We, the organizations that sponsor the Match and application process, fail to implement effective process improvements to curb the overapplying frenzy despite early warnings that the process is out of control.

Other reasons include the ease of applying, lack of fourth-year curriculum requirements, low-intensity program interviews, and basic game theory economics. The ease of the application process leads to overapplying when adding programs to an application list requires only a few clicks on the common application.3  Further, the personal statement does not push students to engage in thoughtful responses to questions.

Medical schools and residency programs deserve some blame as well. At 1 school in 2014–2015, only 40% of the students took a clinical elective during interview season, and less than 50% of the students took one after Match Day.4  Consequently, some students have no curriculum-related constraints. By fostering a culture of unstructured interviews that require almost no preparation, residency programs do not dissuade applicants who are only modestly interested in a program from interviewing.5 

A game theory concept known as the prisoner's dilemma has been posited by Weissbart et al2  to explain the annual increase in the number of applications that medical students submit to residency programs. Game theory addresses the interactions among individuals during decision making. The prisoner's dilemma holds that all members of a group are worse off when each member acts in his or her own self-interest.6  Weissbart et al2  demonstrated that the probability of matching does not increase with an increasing number of applications. To illustrate the Match dilemma, we suggest an application ideal value that uses 2005 program-specific averages as a baseline, adjusted for the difference between total applicants and total positions offered annually (table).7,8  For this approach to succeed, the vast majority of applicants must adhere to the ideal application strategy rather than the overapply strategy; such cooperation is unlikely in a one-shot game.

Table. The Match Dilemma Payoff Matrix for US Graduates
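The prisoner's-dilemma structure of the Match can be sketched numerically. The match probabilities below are hypothetical illustrations of the dynamic, not the values reported in the table:

```python
# Toy prisoner's-dilemma sketch of the Match (hypothetical probabilities).
# Each key is (applicant A's strategy, everyone else's strategy); each value
# is A's probability of matching under that combination.
payoffs = {
    ("ideal", "ideal"): 0.94,          # everyone restrained: reviews stay thorough
    ("overapply", "ideal"): 0.95,      # A gains a sliver by defecting alone
    ("ideal", "overapply"): 0.80,      # A is buried under everyone else's applications
    ("overapply", "overapply"): 0.94,  # all defect: same odds, far higher cost
}

# Overapplying weakly dominates for the individual applicant...
assert payoffs[("overapply", "ideal")] >= payoffs[("ideal", "ideal")]
assert payoffs[("overapply", "overapply")] >= payoffs[("ideal", "overapply")]
# ...yet the group is no better off when everyone defects, which is why
# voluntary restraint is unlikely in a one-shot game.
assert payoffs[("overapply", "overapply")] <= payoffs[("ideal", "ideal")]
```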

From a PD's perspective, medical school records that accurately depict a struggling student are largely nonexistent. Attempting to save the lowest-performing students from failure reflects not only a medical disposition to heal but also US culture; as Garrison Keillor noted in A Prairie Home Companion, “all the children are above average”9  in the fictional town of Lake Wobegon. For several medical schools, more than 90% of students are graded as honors or high pass for clerkships or are ranked in the top 2 tiers on the MSPE.10  Comments on the MSPE are overwhelmingly positive for even the lowest tier of students. Further, standardized letters of recommendation, used by emergency medicine to improve transparency, reveal ranking distributions for letter writers but also tend toward advocacy as the vast majority of responses are in the top 2 deciles.10  Some letter writers have mentioned that their deans prohibit them from completing standardized letters of recommendation that require student rankings. Hence, medical schools embolden students to apply to reach programs by obscuring their actual ranking.

Because Match participants cannot be counted on to self-regulate applications due to the aforementioned reasons, an outside group, acting on behalf of student applicants, should tackle this mounting problem. Consider the following:

1. Association of American Medical Colleges (AAMC): The AAMC owns and operates the Electronic Residency Application Service. The AAMC does not believe it should limit the number of programs to which an individual may apply (R Overton, MBA, written communication, February 26, 2016). Substantially increasing application costs (eg, raising the cost of applying to more than 31 programs to $500 per program) would unfairly favor applicants who possess more financial resources. Hence, limiting applications, either by decree or by substantial cost differentials, is unlikely to originate with the AAMC.

2. National Resident Matching Program (NRMP): The NRMP was established in 1952 at the request of medical students to provide an orderly and fair mechanism for matching. The organization's mission, “to match health care professionals to graduate medical education and advanced training programs through a process that is fair, efficient, transparent, and reliable,” addresses the mandate for action to improve the efficiency of the Match.11 

Both the AAMC and the NRMP are more likely to provide educational materials and data to support medical students and their advisers in curbing applications than to impose application limits. Because we do not believe these efforts will be sufficient, the NRMP should impose application limits, using adjusted 2005 program-specific averages as a baseline, which would not worsen applicants' Match probability.

A more aggressive strategy would be to replace the current single round of applications, plus the Supplemental Offer and Acceptance Program (SOAP), with a 3-round application format. This would permit up to 5 programs per round before the SOAP process. The third round could allow unlimited applications with tiered pricing as in the current system. Further, this proposal permits applications to 1 or more backup specialties. This strategy has precedent: in the United Kingdom, the Universities and Colleges Admissions Service successfully limits the first round of applications to 5 courses of study at up to 5 institutions and then permits an extra application in a second round.12 

It has been suggested that medical students would stop overapplying and applying to out-of-reach programs if they had an accurate composite of applicants who matched to a particular program.13  This approach, however, could have considerable drawbacks from our perspective: (1) data reported from small programs would be identifiable, or small programs would need to combine several years of data, rendering them outdated; (2) the academic diversity of trainees in a program is surprisingly vast (eg, posting an average United States Medical Licensing Examination [USMLE] Step 1 score of 220 does not capture the fact that 1 trainee failed the boards on the first attempt for a valid reason, while 2 other trainees scored 250 and a fourth scored 203); and (3) such a database might impede a program's recruitment of top candidates despite improvement efforts.

For a holistic review, medical schools need to provide fair and honest evaluations of each student's knowledge, skills, and attributes. Objective data derived from milestones and core entrustable professional activities may be helpful. Student rankings would allow PDs to redirect time currently spent deciphering the MSPE riddle for each school toward holistic consideration of nonscoring metrics. The PDs and their teams will require additional nonclinical time between mid-September and mid-October.

The Accreditation Council for Graduate Medical Education (ACGME) should deemphasize the annual evaluation of board pass rates for program accreditation and instead consider other outcome measures of trainee performance in the workforce, such as legal claims or complication rates.14  Many board examinations are not completed until months or even years after residency training ends, which calls into question whether pass rates reflect a program's preparation of candidates for certification examinations or, instead, relatively brief periods of intense study of review course materials. Because of the ACGME's scrutiny of board pass rates, programs strongly prefer candidates who can independently pass an examination. An ACGME focused review of a program to discuss low board scores sends a strong message that holistic review, often a euphemism for considering intriguing candidates who score poorly on standardized tests, can lead to program citation or worse.

First, the AAMC should reassess the unintended consequences of technological advances that improved the ease of applying at the expense of discretion. One solution would be to introduce mass customization. This concept allows for personalized applications, perhaps including video-based statements of interest, and responses to several program-directed essays in lieu of the current generic personal statement. For example, when hiring an au pair, a host family has the opportunity to select from a wide variety of personalized video profiles of applicants before proceeding with multiple video-chat interviews that allow both parties to be fully satisfied prior to mutual agreement.

Second, the ACGME should consider the board pass rate as part of a larger set of posttraining outcome metrics for accreditation consideration.

Third, medical schools should provide rankings of all students on the MSPE. The quid pro quo for schools providing meaningful data will be for programs to authorize the release of composite data on matched applicants from previous years and to release screening algorithms to the NRMP for applicant and medical school consideration. A single specialty trial over 3 recruitment cycles is warranted to determine if concerns over transparency are founded.

Fourth, PDs should expand on semistructured interviews by incorporating elements of behavioral and situational interview techniques, such as multiple mini interviews,15  for residency recruitment.

Fifth, the NRMP should institute application limits for the first 2 rounds of a new 3-round application process plus SOAP. Applicants could apply to up to 5 programs at up to 5 institutions per round.

Opening a meaningful dialogue among accreditation bodies, Match organizations, medical schools, students, PDs, and other interested parties is the first step toward improving the current application and interview process. A series of regional or specialty-specific forums on overapplying in the residency application process could set the stage for recommending reforms and piloting a more efficient, informative, and transparent process.

1. Chretien KC, Elnicki MD, Levine D, Aiyer M, Steinmann A, Willett LR. What are we telling our students? A national survey of clerkship directors' advice for students applying to internal medicine residency. J Grad Med Educ. 2015;7(3):382–387.
2. Weissbart SJ, Kim SJ, Feinn RS, Stock JA. Relationship between the number of residency applications and the yearly match rate: time to start thinking about an application limit? J Grad Med Educ. 2015;7(1):81–85.
3. Naclerio RM, Pinto JM, Baroody FM. Drowning in applications for residency training: a program's perspective and simple solutions. JAMA Otolaryngol Head Neck Surg. 2014;140(8):695–696.
4. Aagaard EM, Abaza M. The residency application process—burden and consequences. N Engl J Med. 2016;374(4):303–305.
5. Stephenson-Famy A, Houmard BS, Oberio S, Manyak A, Chiang S, Kim S. Use of the interview in resident candidate selection: a review of the literature. J Grad Med Educ. 2015;7(4):539–548.
6. Jeffrey RC. The Logic of Decision. 2nd ed. Chicago, IL: University of Chicago Press; 1965:15.
7. National Resident Matching Program. Impact of length of rank order list on main residency match outcome: 2002–2015. 2016.
8. Mullan F, Salsberg E, Weider K. Why a GME squeeze is unlikely. N Engl J Med. 2015;373(25):2397–2399.
9. A Prairie Home Companion with Garrison Keillor. 2016.
10. Kominsky AH, Bryson PC, Benninger MS, Tierney WS. Variability of ratings in the otolaryngology standardized letter of recommendation. Otolaryngol Head Neck Surg. 2016;154(2):287–293.
11. National Resident Matching Program. What is the Match? About the NRMP. http://www.nrmp.org/about. Accessed April 6, 2016.
12. Universities and Colleges Admissions Service. Undergraduate choosing a course. 2016.
13. Christophel JJ, Levine PA. Too much of a good thing. JAMA Otolaryngol Head Neck Surg. 2014;140(4):291–292.
14. Asch DA, Nicholson S, Srinivas S, Herrin J, Epstein AJ. Evaluating obstetrical residency programs using patient outcomes. JAMA. 2009;302(12):1277–1283.
15. Patterson F, Knight A, Dowell J, Nicholson S, Cousans F, Cleland J. How effective are selection methods in medical education? A systematic review. Med Educ. 2016;50(1):36–60.