Each year, thousands of medical students apply for residency training programs in the United States. Students compile their rank lists based on review of program websites, discussions with mentors, peer interactions, and online sources such as blogs and the Student Doctor Network.1 Impressions shared in these ways can be outdated and biased. In 2014, Doximity, an online social media platform for medical professionals, attempted to address this lack of reliable data to aid in residency program selection with the release of the Residency Navigator, which is described on its website as “a transparent look into US medical residency programs.”2 In this perspective, we address the accuracy and sourcing of the information presented on the Residency Navigator and suggest improvements that would offer more reliable data on residency programs.
The Residency Navigator allows medical students to learn more about residency programs in 28 specialties.3 Research suggests that it is frequently used by medical students and changes their application decisions.4,5 A webpage for each residency program contains both qualitative and quantitative data about the program. Specifically, Doximity administers a “satisfaction survey” of recent alumni and shares those responses as short- and long-form comments, and a “reputation survey” in which physicians list the 5 programs nationwide that provide the best clinical training within their specialty.3
The process of reputation ranking has raised concerns among educators in a variety of fields of training, in part because it favors larger, more established programs with a strong alumni voice. Wilson and colleagues compared reputation rankings to outcomes (board pass rates and alumni publications) for 218 surgical programs and found only a moderate association, cautioning trainees against relying on reputation rankings.6 Ashack and colleagues compared the Doximity rankings for dermatology with those of another website that accounts for scholarly publication and found that the rankings overlapped by only 50%.7 Medical students are aware of the weaknesses of reputation ranking; in one survey, over 50% had doubts about its accuracy. Despite this, 60% of those students stated that the Doximity reputation rankings influenced their applications to residency programs.8 In addition to reputation rankings, the Residency Navigator provides quantitative outcomes for each program, including research output, board pass rates, percentage of alumni who are board certified, and rates of subspecialization.3 However, in our experience, these values are not always accurate.
In July 2017, our program noticed that the board certification rate and the percentage of subspecialist (fellowship-trained) alumni reported for our program on Doximity were both lower than we expected based on our own data. We contacted the company, requested the figures used to generate these values, and began an effort to understand the rates reported for our program. Initial discussions revealed that some physicians on Doximity had been wrongly attributed to our program and that other data had not been updated to reflect the latest publicly available information. Doximity promptly corrected our program's data online when we pointed this out, but would not provide the underlying raw data we requested. Given the inaccuracy of the first set of data we received and our inability to review the underlying data, there is sufficient reason to question the integrity of the quantitative information being publicly shared for all programs, including ours.
As educators, we agree wholeheartedly with the need to increase transparency around residency programs and to provide information to medical students as they make choices about training. We also believe in transparency in research and accountability in data collection, principles to which any medical outcomes researcher would be subject. Many training programs do not share information about program performance that medical students might find useful in the residency selection process. Doximity has presented its Residency Navigator tool to fill that important void. However, in doing so, it should adhere to standards of research integrity and transparency in publishing data. We suggest that programs be allowed to review, annually, the raw data underlying the objective metrics before they are published on the Residency Navigator, to ensure accuracy when describing available training programs. Ultimately, residency programs cannot control what is published on social media, but this process would allow programs to provide up-to-date information and improve the accuracy of the tool.
At the same time, programs should track and publish their own data for these quality metrics and other program-specific metrics, including qualitative feedback from recent graduates, on their residency websites. It can be difficult for programs to quantify all of the important factors that medical students take into account when selecting a program, such as faculty involvement, patient variety, or resident culture.9 However, programs can report their own research output, board pass rates, percentage of alumni who are board certified, and rates of subspecialization.
Ultimately, there is a need for independent scrutiny of programs, such as that provided by Doximity. However, until there is more robust competition in the marketplace for this service, the risk of inaccurate reporting is real. The Association of American Medical Colleges has introduced more robust tools for researching residency programs online.10 Large-scale efforts like this can provide a check on online platforms and serve as a crucial step in providing transparency and accurate data to potential trainees.