“We need less research, better research, and research done for the right reasons.”1 So began a 1994 editorial in the British Medical Journal by statistician Douglas G. Altman. Noting that incentives for career advancement led many physicians to conduct research that was inappropriately designed, incorrectly analyzed, selectively interpreted, or outright fraudulent, Altman argued for abandoning the use of publication quantity as a measure of ability.
One might hope that Altman’s words would have inspired systemic change in the nearly 3 decades following his eloquent editorial. Instead, it seems the publish-or-perish arms race has spread to medical trainees. The Figure shows the dramatic increase in PubMed-indexed research publications with medical student authors in the past 15 years. While these data are limited in their ability to represent all medical student research, they are a sample that points to an alarming trend.
This trend would be worth celebrating if this increase in publications represented flourishing science. Yet an analysis of PubMed-indexed publications found that most medical student articles were reviews or case reports, and the majority (59%) were cited not even once.2 Rather than scientific curiosity, the burgeoning research output by medical students is partially a consequence of the residency selection process.
At competitive residency programs, program directors increasingly use research productivity to discriminate among applicants. In the National Resident Matching Program’s most recent survey, program directors rated involvement and interest in research as being as important as membership in Alpha Omega Alpha or the Gold Humanism Honor Society when deciding whom to interview or rank,3 and 41% of program directors in a different national survey reported that research participation will become more important after the transition to pass/fail scoring for Step 1 of the United States Medical Licensing Examination (USMLE).4 In every specialty, the mean number of research experiences listed in residency applications is higher for graduating MD students who match compared with those who do not.5 Unsurprisingly, medical students report their primary motivation to perform research is professional necessity to advance,6 and they cite the desire to increase their competitiveness for residency applications as the most common reason for taking dedicated research years.7
The result is a research arms race among residency applicants. The Table shows the increase in research abstracts, publications, and presentations among matched US MD students across every specialty. Successful applicants in competitive specialties such as dermatology, neurological surgery, and plastic surgery now report more than 20 research items on average, and even primary care specialties have seen a 2- to 3-fold increase in this metric. In fact, in most specialties, the average unmatched applicant today reports more research items than the average matched applicant did 10 years ago.5,8 Yet there is little—if any—evidence that residents today are any more prepared for residency than those in previous eras. Moreover, evidence suggests that while research in residency may correlate with clinical performance, research in medical school does not.9-11 But this research arms race shows no sign of slowing. In fact, 60% of medical students plan to redirect time previously spent studying for a scored USMLE Step 1 to more research activity.12
A Potential Solution
Permitting residency applicants to list only 3 to 5 research publications, abstracts, and presentations could mitigate this academic inflation. Rather than padding their applications with more low-quality science in the hope of catching a program director’s attention, applicants would have an incentive to pursue more rigorous and higher-quality projects to fill the limited space on their applications. Such a policy would be consistent with the Association of American Medical Colleges’ recent change to limit applicants to listing only 10 experiences (including work, volunteer, and research experiences) starting in the 2023-2024 residency application season.13 Removing the quantitative component of research evaluation could also promote a holistic review more focused on compatibility between applicants and programs.14
But just as the research arms race has emerged as a byproduct of a residency selection process intended to encourage and reward merit, this policy change should be considered through the lens of potential unintended consequences.
Would Limiting Research Hurt Applicants?
For some residency applicants, prodigious research output may counterbalance weaknesses elsewhere in the application. Limiting the number of research items that an applicant could report could prevent these individuals from competing for residency positions in the way they perceive they are best equipped. Yet such applicants could still demonstrate their merit to programs by doing higher-quality research.
Further, when considering research output as a measure of worth for graduate medical education, it must be appreciated that there are wide disparities in applicants’ access to research opportunities. There is scant evidence that unlimited research output “levels the playing field” for applicants from lesser-known or international medical schools. Instead, there are systematic differences in publication count for students by sex, race, and medical school rank.15 Indeed, the fact that more famous and well-resourced institutions provide easier access to mentors and projects may be one of the most important ways in which these institutions reproduce and perpetuate status hierarchies.16
Would Limiting Research Hurt Science?
Advances in medical science continue to improve patient care—and some of the most important discoveries in medicine (including heparin, insulin, penicillin, ether anesthesia, spermatozoa, and the sinoatrial node) were made by or with crucial contributions from medical students.17 Yet biomedical science in the 21st century is a sophisticated endeavor. Exploring its frontiers requires complex techniques and extensive experience. Today’s medical students are unlikely to stumble upon major discoveries unless they are participating in high-quality projects.
Even without making groundbreaking discoveries, participating in research that is adequately resourced and well-mentored may help students develop critical thinking skills and an appreciation for the challenges of producing high-quality research. Limiting the number of items that can be considered during the initial review of the residency application will not negate these benefits.
Would Limiting Research Hurt Physician-Scientists?
Some trainees who initially participate in research to enhance their residency application may be inspired to pursue careers as physician-scientists.18 Yet it is doubtful that a medical student disinclined to pursue a research career after their first 3 to 5 publications would be convinced by another 10.
Would Limiting Research Lower Standards?
Reforms to medical education often prompt concern that standards are being lowered and that patients and the profession will suffer by training physicians who are less grounded in biomedical science.19 Limiting the number of research outputs that could be listed on an application would not remove research as a component of residency selection or the incentive for applicants to pursue it. Instead, it would incentivize quality over quantity and direct applicants’ efforts toward contributing to meaningful science.
Ending the Research Arms Race
So long as the number of residency applicants to some specialties exceeds the number of positions, residency selection will remain a high-stakes and highly competitive process. Program directors will seek measures of merit, and applicants will aspire to demonstrate whatever measures of merit programs value. The natural result is an arms race.
In defining measures of merit, graduate medical education has a responsibility to ensure that applicants compete in ways that benefit patients or the profession rather than just providing relative advantage in the zero-sum selection game. Research is valuable, but its value is not based on quantity. Graduate medical education programs can choose to structure the incentives for trainee research in a way that furthers science and patient care—or in a manner that generates research pollution and fuels a senseless arms race. As Altman concluded his seminal editorial, “As the system encourages poor research it is the system that should be changed… Abandoning using the number of publications as a measure of ability would be a start.”1
Disclaimer: The views expressed in this paper are those of the authors and do not necessarily represent the official position or policy of the Department of Defense or United States Air Force.