We appreciate Elliott and Carmody’s efforts to address the problems in medical education related to scientific publications.1 As current students at an allopathic medical school, we can attest, from a boots-on-the-ground perspective, to the pervasive culture of a “research arms race”1 among our cohort. The irony of authoring a letter that will itself count as a publication is not lost on us. We feel, however, that the proposed solution of limiting the number of publications reportable to residency program directors (PDs) is misguided: it underestimates the strength of the current incentive structures facing students and fails to account for other methods by which PDs can uncover a medical student’s research output with relative ease.

Medical students must have a way to distinguish themselves from their peers if they hope to match into the residency program of their choice. PDs likewise rely on objective metrics to screen the overwhelming number of applications they receive each year. Historically, United States Medical Licensing Examination (USMLE) Step 1 scores served this purpose. With the shift to pass/fail grading, however, medical students and PDs have lost this tool, which has substantially increased the weight placed on USMLE Step 2 scores and research output.

If the authors’ proposed solution were enacted, we agree that medical students would list their most significant publications on their applications. We disagree, however, with the assertion that this would alter medical students’ publishing behavior. Medical students treat research output as an important component of their applications because PDs have clearly signaled that it is a highly influential metric in the selection process, especially within surgical specialties. Perhaps if PDs entirely lost the ability to assess an applicant’s total research output, its importance would decline; this is unlikely to ever occur and certainly would not under the authors’ proposal. Residency applications still typically require applicants to upload a curriculum vitae, on which they are expected to list their research activity. Additionally, a simple PubMed search of an applicant’s name can reveal their full publication record. With the ability to assess research output preserved, PDs are likely to continue placing high value on this metric. Medical students are well aware of this paradigm, so the suggestion that they would respond in the manner Elliott and Carmody propose is dubious.

Rather than solutions that merely limit how a medical student’s data are reported to PDs, we must pursue solutions that address the underlying issue. Medical students respond to incentives; this aspect of medical education will never change. The incentives themselves, however, can. The importance PDs place on research output has clearly incentivized medical students to accumulate publications while in school. If we wish to solve this problem at its source, graduate medical education leadership must radically reduce the weight placed on an applicant’s research output. Until this occurs, the incentive to obtain publications will remain intact, and medical students will respond accordingly.

1. Elliott B, Carmody JB. Publish or perish: the research arms race in residency selection. J Grad Med Educ. 2023;15(5):524-527.