Generative artificial intelligence (GAI) refers to a subset of artificial intelligence (AI) systems trained on large collections of data to produce content that can be indistinguishable from human-generated work. For medical educators involved in residency selection, the widespread availability of text-based GAI tools such as ChatGPT raises novel concerns about accuracy, authenticity, and integrity when evaluating applications.1-4  How should educational societies, residency programs, and program directors view the use of GAI in residency applications? In this article, we explore how applicants might use tools like ChatGPT during the application process and consider the potential implications for educators involved in resident selection. We ultimately argue that GAI tools like ChatGPT are a contemporary reality and that applicants should not be restricted from using them to improve their applications.

The strength of GAI programs lies in their versatility across different writing styles and their extensive knowledge bases. Programs like ChatGPT generate text in a conversational exchange based on patterns they have learned from vast amounts of text-based data and human feedback. As such, GAI can be used in several ways to support medical students during the application cycle. Most evidently, it can assist in crafting and refining students’ personal statements or other text-based components of their applications. Recent studies have found that reviewers cannot distinguish between personal statements written by GAI and those written by human applicants.2,3  For example, one study comparing 5 applicant-written personal statements and 5 ChatGPT-written personal statements for plastic surgery residency found that blinded faculty participants rated them similarly for readability, originality, authenticity, and overall quality.3 

Naturally, however, if students were to submit entirely fabricated essays, these statements would become impersonal and useless in the selection process.4  Short of ghostwriting a statement in totality, GAI could be used in a range of more acceptable ways to assist or enhance personal writing. The United Kingdom’s shared admissions service for higher education, for instance, suggests GAI could be used to brainstorm ideas, help structure essays, and proofread for readability.5  For example, in the early stages, an applicant could input raw anecdotes and thoughts in the form of a “zero draft” and ask ChatGPT to organize the ideas into a statement outline. In later stages of revision, ChatGPT could improve concision, refine grammar, or adjust tone. Because ChatGPT has been known to occasionally “hallucinate” false information,6  writers will still need to verify all output to ensure that their personal beliefs and experiences are represented accurately.

The core challenge for medical educators involved in the resident selection process is the concern that content generated or modified by ChatGPT may lack authenticity or specificity to the applicant it represents. Aside from relying on honor-system attestations, there is not yet an accurate method to detect the use of GAI in text-based responses. Consequently, some educators may worry that the use of ChatGPT will homogenize applications, making it challenging to assess for “fit.”4 

However, it is worth noting that there have always been ways for medical students to enhance their applications beyond their own efforts. For example, students are often advised to seek input from mentors who can revise personal statements and other application components.7  Additionally, students can use grammar software or even hire specialized application consultants, although the latter practice has drawn objections because it disadvantages applicants who cannot afford such services.8  In this context, GAI is no more objectionable than these existing methods through which students refine their application materials. In fact, freely or cheaply accessible GAI could level the playing field, aiding those who lack access to traditional resources. Given these considerations, we believe that medical students should not be prohibited from using GAI such as ChatGPT to improve their applications. At a time when artificial intelligence is poised to streamline a plethora of tasks within medicine, selectively barring its use for residency applications seems out of step with contemporary realities.9 

Moreover, those who fear that ChatGPT will increase the generic nature of students’ answers should acknowledge that even prior to the advent of GAI, most application responses already harbored similar themes. For example, one linguistic analysis of personal statements for an internal medicine program found that 95% could be classified into only 5 themes (memorable patients, research/academic, family inspirations, appeals of the program, and health care as public policy).10  Similarly, a textual analysis of general surgery personal statements found that 74% could be classified into 8 topics.11 

To determine whether GAI should be prohibited from residency applications, it is essential to define what exactly is under scrutiny in each component of the application and selection process.12  For example, what is the purpose of a personal statement? Some may argue that it serves as a proxy for the applicant’s own writing ability. If creative nonfiction writing skills could be shown to predict resident performance (in medical writing or patient communication skills, for example), educators might be motivated to employ timed writing assessments in controlled settings, since presently there is no limit to how many revisions applicants might make with the help of others. If instead the purpose of the personal statement is to communicate detailed information about applicants that may not be found in the remainder of the application, one should consider the potential for GAI to help facilitate this goal.13  The residency application process, with or without the availability of GAI, has always faced challenges in rendering holistic and meaningful portrayals of applicants.12  Each year, innovative instruments emerge to improve the way that we gauge resident interest and aptitude, from standardized interviews14  to technical skill assessments.15  Notably, these novel evaluations may be harder to game with GAI or similar technological aids.

In conclusion, while the use of GAI in residency applications might stir unease among educators involved in resident selection, we advocate against any overarching bans on its use. Instead, we urge applicants and selection committees to responsibly navigate this new era of AI. In the Box, we provide specific recommendations for how each party could approach the generation and evaluation of text-based application materials, considering the widespread availability of GAI. Applicants, for example, can use programs like ChatGPT to inspire, clarify, or reorganize ideas. Educators, in turn, could mitigate malicious uses of GAI by comparing the ideas conveyed in an applicant’s text answers against what is presented in the rest of the application package and interview. Educators may also reflect on the design of different residency application services, as these vary in their reliance on text-based questions relative to other materials.

Box Suggested Dos and Don’ts of Using Generative Artificial Intelligence (GAI) in the Application Process
Applicants5 
  1. Don’t rely on GAI to create your entire personal statement.

  2. Do engage with GAI to brainstorm ideas relevant to your own unique experiences and attributes to assist with initial draft development.

  3. Do use GAI to suggest revisions to your content once a draft exists.

  4. Do utilize GAI to improve the readability of your submission as you near a final draft.

Educators11,17 
  1. Don’t assume that an applicant did not use GAI to craft their application.

  2. Do recognize that the currently available methods for detecting AI-generated text are imperfect.

  3. Do examine other aspects of the application to corroborate the information presented in text-based answers.

  4. Do question the purpose of the personal statement and how it relates to the holistic selection process.

While research advances may one day allow us to accurately detect AI-generated work,16  education researchers are encouraged for now to remain curious about the effects of GAI on the residency application and selection processes. Future research projects could include qualitatively examining text-based application materials in the era of GAI against those submitted in prior years, or anonymously surveying applicants on their use of GAI in preparing applications.

1. van de Ridder JMM, Shoja MM, Rajput V. Finding the place of ChatGPT in medical education. Acad Med. 2023;98(8):867.

2. Johnstone RE, Neely G, Sizemore DC. Artificial intelligence software can generate residency application personal statements that program directors find acceptable and difficult to distinguish from applicant compositions. J Clin Anesth. 2023;89:111185.

3. Patel V, Deleonibus A, Wells MW, Bernard SL, Schwarz GS. The plastic surgery residency application in the era of ChatGPT: a personal statement generated by artificial intelligence to statements from actual applicants. Ann Plast Surg. 2023;91(3):324-325.

4. Woodfin MW. The personal statement in the age of artificial intelligence. Acad Med. 2023;98(8):869.

5. UCAS. A guide to using AI and ChatGPT with your personal statement.

6. Alkaissi H, McFarlane SI. Artificial hallucinations in ChatGPT: implications in scientific writing. Cureus. 2023;15(2):e35179.

7. Jones D, Pittman JR Jr, Manning KD. Ten steps for writing an exceptional personal statement. J Grad Med Educ. 2022;14(5):522-525.

8. Johnstone RE, Vallejo MC, Zakowski M. Improving residency applicant personal statements by decreasing hired contractor involvement. J Grad Med Educ. 2022;14(5):526-528.

9. Association of American Medical Colleges. ERAS Residency Applicants FAQ. Students & Residents.

10. Osman NY, Schonhardt-Bailey C, Walling JL, Katz JT, Alexander EK. Textual analysis of internal medicine residency personal statements: themes and gender differences. Med Educ. 2015;49(1):93-102.

11. Ostapenko L, Schonhardt-Bailey C, Sublette JW, Smink DS, Osman NY. Textual analysis of general surgery residency personal statements: topics and gender differences. J Surg Educ. 2018;75(3):573-581.

12. Bowe SN, Bly RA, Whipple ME, Gray ST. Residency selection in otolaryngology: past, present, & future. Laryngoscope. 2023;133(suppl 11):1-13.

13. White BAA, Sadoski M, Thomas S, Shabahang M. Is the evaluation of the personal statement a reliable component of the general surgery residency application? J Surg Educ. 2012;69(3):340-343.

14. Schenker ML, Baldwin KD, Israelite CL, Levin LS, Mehta S, Ahn J. Selecting the best and brightest: a structured approach to orthopedic resident selection. J Surg Educ. 2016;73(5):879-885.

15. Carlson ML, Archibald DJ, Sorom AJ, Moore EJ. Under the microscope: assessing surgical aptitude of otolaryngology residency applicants. Laryngoscope. 2010;120(6):1109-1113.

16. Verma V, Fleisig E, Tomlin N, Klein D. Ghostbuster: detecting text ghostwritten by large language models. BAIR. Published online November 13, 2023. https://bair.berkeley.edu/blog/2023/11/14/ghostbuster/

17. Association of American Medical Colleges. Holistic review.

Disclosure: The views expressed herein are those of the author(s) and do not necessarily reflect the official policy or position of the Defense Health Agency, the Brooke Army Medical Center, the Department of Defense, or any agencies under the US Government.