Background

The Medical Student Performance Evaluation (MSPE) is an important factor in residency program applications. Many medical schools are incorporating recent recommendations from the Association of American Medical Colleges MSPE Task Force into their letters. To date, there has been no feedback from the graduate medical education community on the impact of this effort.

Objective

We surveyed individuals involved in residency candidate selection for internal medicine programs to understand their perceptions of the new MSPE format.

Methods

A survey was distributed in March and April 2018 using the Association of Program Directors in Internal Medicine listserv, which comprises 4220 individuals from 439 residency programs. Responses were analyzed, and themes were extracted from open-ended questions.

Results

A total of 140 individuals, predominantly program directors and associate program directors, from across the United States completed the survey. Most were aware of the existence of the MSPE Task Force. Respondents read a median of 200 to 299 letters each recruitment season. The majority reported observing evidence of adoption of the new format in more than one quarter of all medical schools. Among respondents, nearly half reported the new format made the MSPE more important in decision-making about a candidate. Within the MSPE, respondents recognized the following areas as most influential: academic progress, summary paragraph, graphic representation of class performance, academic history, and overall adjective of performance indicator (rank).

Conclusions

The internal medicine graduate medical education community finds value in many components of the new MSPE format, while recognizing there are further opportunities for improvement.

One of the primary responsibilities of a residency program director (PD) is to recruit talented residents. This process has become more burdensome as the number of medical students applying for each available residency position has increased dramatically over the last 5 years. The average number of residency applications per graduating medical student was 60.5 in 2018 versus 48.8 in 2014,1  representing a 24% increase. National Resident Matching Program data reveal that for internal medicine, the largest specialty in the Match, categorical programs must rank 7.3 applicants on average to fill every spot.2  The selection process involves reviewing a large volume of quantitative and qualitative assessment material, including transcripts, US Medical Licensing Examination scores, personal statements, letters of recommendation, and the Medical Student Performance Evaluation (MSPE), informally known as the “Dean's letter.”
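As a quick arithmetic check (our verification of the cited averages, not a number reported in the references), the 24% figure follows directly from the two values:

$$\frac{60.5 - 48.8}{48.8} \approx 0.24,$$

that is, roughly a 24% increase in applications per graduating student between 2014 and 2018.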

In 2014, the Association of American Medical Colleges charged an MSPE Task Force with revisiting the MSPE 2002 recommendations and addressing the following issues: (1) inconsistencies in content, language, and terminology; (2) length of letters (too long to be useful yet insufficiently transparent to convey an accurate sense of student performance); and (3) missed opportunities to use the letter to highlight salient experiences and attributes not found elsewhere in the application.3 

In 2016, the Association of American Medical Colleges MSPE Task Force released its recommendations, which addressed 6 principles.4  The revised MSPE should provide:

  • supplemental value to the information already provided in the Electronic Residency Application Service application, transcripts, and letters of recommendation;

  • a level of standardization and transparency that facilitates the residency selection process;

  • comparative information on applicants;

  • information about applicants' standing in the competencies required to be successful in residency;

  • increased opportunity for PDs to assess applicants holistically in the preinterview stage; and

  • qualitative and quantitative assessments of applicants in an easy-to-read format.

Since that time, little has been published about this topic. In 2017, Hook and colleagues5  evaluated MSPEs from 113 of 147 US medical schools (77%) and concluded that the majority had incorporated the 2016 MSPE Task Force recommendations, although not all suggestions were adopted uniformly. For example, while more than 95% of US medical schools had decreased their page count, just under 70% presented school-wide comparative performance data.

Medical schools expend an enormous amount of time and resources writing these letters; despite this, to our knowledge there is no literature describing how the newly formatted MSPE is used in practice. To understand this better, we developed a survey to explore how individuals involved in the internal medicine residency selection process use the MSPE.

From March through April 2018, a survey (provided as online supplemental material) was distributed to the Association of Program Directors in Internal Medicine listserv, which encompasses 439 internal medicine programs.6 The listserv comprises 4220 individuals, including PDs (9%, 363 of 4220), associate PDs (22%, 921 of 4220), and program administrators (22%, 931 of 4220), among others, and is a major source of communication and information dissemination for residency programs across the country.

A total of 140 responses were received (3.3% of individuals on the Association of Program Directors in Internal Medicine listserv). All respondents confirmed that, as part of their responsibilities, they review applicants' MSPEs. Of these respondents, 63% (85 of 134) had 5 or more years of experience. Respondents included representatives from across the country (16% [19 of 119] Midwest, 42% [50 of 119] Northeast, 15% [18 of 119] Southeast, 5% [6 of 119] Southwest, and 22% [26 of 119] West). The majority were PDs (46%, 62 of 134) and associate PDs (33%, 44 of 134); the survey did not ask respondents to identify their programs. Respondents reported reading a median of 200 to 299 MSPEs per recruitment season.

The majority of respondents (81%, 108 of 134) were aware of the existence of the MSPE Task Force, although awareness of changes to the MSPE was higher than awareness of the task force itself. When asked to estimate the percentage of schools that had adopted the new guidelines, based on the MSPEs they had reviewed, respondents' perceptions were mixed (table), although the majority reported that adoption of the new format was evident. When asked how the new format of the MSPE influenced their decision-making about a candidate, 42% (49 of 118) of respondents reported that the new format made the MSPE “more important in terms of decision-making about an applicant” than it had been in the past.

Table

Adoption of New Medical Student Performance Evaluation (MSPE) Format

When asked to consider the influence of each portion of the new MSPE format on their decision-making process, respondents cited the following sections as the most influential: academic progress, summary paragraph, graphic representation of class performance, academic history, and overall adjective of performance indicator (rank; figure).

Figure

Perceived Influence of Revised Medical Student Performance Evaluation Subsections on the Reviewer (N = 123)

Lastly, we asked respondents 2 open-ended questions to better understand how the community of reviewers thought the MSPE could be improved. Responses were tabulated and organized by concept. Themes that emerged included the desire for greater transparency and honesty in the letter, the inclusion of more comparative data, and the continued adoption of the recommendations by all medical schools. Reviewers wanted information about a candidate's areas in need of improvement, as befits a letter of evaluation rather than a letter of recommendation. Finally, MSPE readers sought more standardization in the letters they review, in line with the task force's goals.

Our survey of MSPE users in the internal medicine community reveals that the adoption of the new MSPE format is widespread, though not yet universal, and that readers think the new format is moving toward better representing student applicants' global medical school performance. As a point of reference, in 2018, 81% of program directors nationwide cited the MSPE as one of the most important factors used to select candidates to interview.7 

Many PDs reported reviewing hundreds of applications each recruitment season. The lack of a standard MSPE format adds to this already tremendous burden, a problem the task force recommendations attempt to address. While a little under half of respondents perceived the new MSPE format as more important to their decision-making, they also wanted more comparative data and an explicit discussion of a student's weaknesses and potential for improvement. Our findings also suggest that we must continue to be true to the nature of the MSPE as a performance evaluation and not a letter of recommendation.

This study is limited by a low survey response rate and its singular focus on internal medicine; both factors may limit generalizability to the broader medical education community. In addition, the survey was developed and tested internally rather than with a wider audience. Future use of the survey will incorporate minor modifications based on feedback from this study.

Whether through the MSPE Task Force or another process, the MSPE should continue to be refined to meet the needs of its stakeholders. This will require ongoing dialogue between those advocating for their students in undergraduate medical education and those receiving those students in graduate medical education. While this study represents the first formal inquiry of end users of the new MSPE format, more investigation is needed, beginning with understanding the perceptions of readers in disciplines beyond internal medicine. There is also a need to understand the barriers to full implementation of the MSPE recommendations from the MSPE writer's perspective. Additionally, while end users desire greater transparency, the unintended consequences of that transparency for Match results deserve further study.

The MSPE continues to be an influential component in the residency application process. The internal medicine graduate medical education community, primarily PDs and associate PDs, finds value in many of the components of the new MSPE format, including academic progress, the summary paragraph, and graphic representation of class performance.

References

1. Association of American Medical Colleges. Electronic Residency Application Service (ERAS). 2018. Accessed 2019.

2. National Resident Matching Program. Results and Data: 2018 Main Residency Match. Accessed 2019.

3. Giang D, Jones L, Young G. MSPE changes: what can program directors look for on October 1, 2017. 2017. Accessed 2019.

4. Association of American Medical Colleges. Recommendations for Revising the Medical Student Performance Evaluation (MSPE). 2017. Accessed 2019.

5. Hook L, Salami AC, Diaz T, Friend KE, Fathalizadeh A, Joshi ART. The revised 2017 MSPE: better, but not “outstanding.” J Surg Educ. 2018;75(6):e107-e111.

6. American Board of Internal Medicine. Number of programs and residents. Accessed 2019.

7. National Resident Matching Program. Results of the 2018 NRMP Program Director Survey. Accessed 2019.

Author notes

Editor's Note: The online version of this article contains the Medical School Performance Evaluation 2018 Survey.

Funding: The authors report no external funding source for this study.

Competing Interests

Conflict of interest: The authors declare they have no competing interests.

The authors would like to thank Doreen Olvet, PhD; Jeffrey Bird, MA; Saori Wendy Herman, MLIS, AHIP; Joanne Willey, PhD; and Krista Paxton for their assistance with manuscript preparation.
