Context.—

Acquiring objective, timely, and comprehensive feedback on resident diagnostic performance is notoriously difficult.

Objective.—

To implement a custom software application (Resident Case Tracker) to improve evaluative diagnostic analysis for residency programs.

Design.—

Residents and faculty use a graphical user interface with restricted access to their own cases and evaluations. For each sign-out, residents enter their diagnoses and comments for each case. Faculty are provided a sign-out queue to review the resident diagnosis and select their level of agreement alongside optional comments. After sign-out, residents can review the agreement level and comments for each case, overall sign-out statistics, and organ-specific performance, and they have the option of opening and reviewing groups of cases by agreement status. A sign-out evaluation is automatically generated and stored alongside additional reports. Administrative access allows privileged users to readily review data analytics at both an individual and residency-wide global level.

Results.—

A marked increase in completed evaluations and feedback was noted in the initial 36 months of implementation. During a 3-year academic period, faculty completed individual feedback on 33 685 cases and 1073 overall sign-out evaluations.

Conclusions.—

Resident Case Tracker is an invaluable tool for our residency program and has provided unparalleled feedback and data analytics. Throughout residency, trainees have access to each completed sign-out, with the ability to learn from discrepant cases while also monitoring improvements in diagnostic acumen over time. Faculty are able to assess resident milestones much more effectively while more readily identifying residents who would benefit from targeted study.

For nearly a decade, pathology residency programs have used the Accreditation Council for Graduate Medical Education milestones with the intent to provide trainees with objective documentation of their progression through key knowledge requisites.1  Training leadership, such as program directors (PDs) and Clinical Competency Committee (CCC) members, typically incorporate faculty feedback and resident self-reporting when evaluating a resident's level of progression. Nonetheless, this feedback may lack objectivity and/or timeliness. Many faculty evaluations may not be completed until weeks, if not months, after a resident interaction. By then, memory of a particular encounter may have significantly faded. Predictably, evaluations are filled with ubiquitous comments, such as “good job,” “at appropriate PGY-level,” and “continue to read about your cases.”

Our residency program has historically attempted to obtain more objective and timely data about resident diagnostic acumen through the use of paper faculty-resident discrepancy sheets. The discrepancy sheets contained rows to manually document case numbers with corresponding checkboxes for the level of faculty agreement and an additional area for comments. The paperwork was completed by faculty at sign-out and subsequently returned to program leadership. Unsurprisingly, this methodology resulted in the herculean task of sifting through many sheets of paperwork annually. Furthermore, compliance was suboptimal given the relatively time-consuming and manual nature of the task.

Hence, we sought to streamline and automate this critical feedback through the development of a software application called Resident Case Tracker (RCT). The RCT is used on a daily basis in our department for resident assessment, diagnostic accuracy analysis, and the automatic generation of myriad reports for program leadership.

This project was submitted to the institutional research committee of Brooke Army Medical Center and deemed exempt from review by the Institutional Review Board. The RCT was implemented using the Visual Basic for Applications framework within Microsoft Access (Microsoft Corporation). Each resident and faculty member is provided with a user name and password for system access. Residents and faculty sign in to a password-protected graphical user interface front end with restricted access to their own cases and evaluations. All data are stored in an encrypted and secure back-end database with restricted access. Decryption is accomplished by the front-end graphical user interface upon successful login by an authorized user. Although the database is encrypted, no personally identifiable information is stored within it.

During case preview, residents enter their assigned faculty and range(s) of case numbers to create a new sign-out record (Figure 1 and Supplemental Figure 1 [Supplemental Figures 1 through 6 are found in the supplemental digital content at https://meridian.allenpress.com/aplm in the June 2022 table of contents]). The resident opens each individual case record to input their previewed diagnoses, comments, and potential questions (Figure 2). The case record may be opened through a summary list of all cases in the sign-out (Supplemental Figure 2) or by opening an individual case via embedded search functionality (Figures 1 and 2). The individual record has multiple fields that the resident can edit (Figure 2). In our training program, the resident enters their previewed diagnosis into the laboratory information system (Cerner CoPathPlus) and then copies their diagnosis into the “Resident Diagnosis” field of RCT. A “Faculty Comments” field is locked to resident editing but will display comments from faculty after their review. A “Case Status” field allows upper-level residents to notify faculty if they have ordered levels/stains on the case or are consulting with a subspecialist on the case. Upper-level residents (postgraduate year 4 [PGY-4]) are provided a “Sign-out” button to simulate signing out a case after they have completed their full workup and finalized the case. Once the case is “signed out” the record is locked to any further editing by the PGY-4 resident. A “Histology Quality Assurance” field allows both residents and faculty to comment on slide quality issues. This feedback is automatically compiled by the software and reported to the histology laboratory supervisor.
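The locking behavior described above has two triggers: any faculty edit freezes the record against further resident input, and the PGY-4 “Sign-out” button freezes it entirely. The production RCT implements this in VBA within Microsoft Access; the following Python sketch is illustrative only, and all class and field names are hypothetical.

```python
class CaseRecord:
    """Minimal sketch of RCT-style case-record locking (hypothetical names)."""

    def __init__(self, case_number):
        self.case_number = case_number
        self.resident_diagnosis = ""
        self.faculty_comments = ""
        self.signed_out = False       # set by the PGY-4 "Sign-out" button
        self.faculty_touched = False  # any faculty edit locks resident input

    def resident_edit(self, diagnosis):
        # Residents may edit only before faculty review and before sign-out.
        if self.signed_out or self.faculty_touched:
            raise PermissionError("Record is locked to resident editing")
        self.resident_diagnosis = diagnosis

    def faculty_review(self, agreement, comments=""):
        # Faculty select an agreement level; touching the record locks it
        # against any further resident changes.
        assert agreement in {"Agree", "Partially Agree", "Disagree"}
        self.agreement = agreement
        self.faculty_comments = comments
        self.faculty_touched = True

    def sign_out(self):
        # PGY-4 "Sign-out": the record becomes read-only for the resident.
        self.signed_out = True
```

For example, once a faculty member calls `faculty_review("Agree")` on a record, a subsequent `resident_edit` raises `PermissionError`, mirroring the read-only state described above.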

Figure 1

Resident main screen. The main screen provides access to all portions of the software, including general sign-out/frozen evaluations, sign-out history and organ-specific performance, and the ability to create a new sign-out, as well as the ability to search individual case records.

Figure 2

Individual case review. The resident is able to input their previewed diagnosis and case status (eg, pending levels). Faculty select their level of agreement via a dropdown selection. Users are able to open the next individual record using embedded search functionality.


During sign-out, faculty log in to RCT and are presented with a queue of individual cases requiring assessment (Supplemental Figure 3). Faculty can review each case directly from their pending queue or open each record individually via search functionality. For each case record, faculty evaluate the resident diagnosis and select from a dropdown box whether they “Agree,” “Partially Agree,” or “Disagree,” and they have the ability to input their own comments (Figure 2). Once a faculty member edits any portion of the record, it is locked to any further input or changes by the resident. During typical use, faculty will only open and comment on those cases for which they partially agree or disagree. After all the discrepant cases are recorded, RCT provides the efficient ability to agree with the remainder of the cases in a particular sign-out with a single button.
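The single-button “agree with the remainder” step amounts to a bulk update over every case in the sign-out that faculty have not individually reviewed. A minimal sketch, assuming a hypothetical in-memory data model (the production RCT performs the equivalent update against its Access back end):

```python
def agree_with_remainder(cases):
    """Mark every not-yet-reviewed case in a sign-out as 'Agree'.

    `cases` is a list of dicts with at least an 'agreement' key, where
    None means the faculty member has not reviewed that case.
    Returns the number of cases auto-agreed.
    """
    count = 0
    for case in cases:
        if case["agreement"] is None:
            case["agreement"] = "Agree"
            count += 1
    return count
```

This matches the workflow described above: faculty individually mark only the discrepant cases, then one call (one button press) records agreement on everything else in the sign-out.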

When a new sign-out is created by a resident, RCT automatically generates an evaluation for faculty completion that is stored alongside other archived data (Supplemental Figure 4). Faculty are reminded of the number of pending evaluations needing completion upon login (Supplemental Figure 3). The software also provides discrepancy statistics for the faculty member to review while completing any evaluation(s) (Figure 3). If desired, the faculty member may also open a summarized list of all the discrepant cases in a particular sign-out, including the resident diagnosis, faculty comment, and discrepancy level. Once an evaluation is completed, RCT notifies the resident that an evaluation is available for review and acknowledgment (Figure 1).

Figure 3

Sign-out history. For each sign-out, faculty and residents are able to see discrepancy statistics and open a categorized list of cases based on discrepancy level. Users are also able to review the evaluation for that sign-out, review all “interesting” cases, and add additional cases to the sign-out as needed.


After sign-out, residents can review the agreement level and comments for each case, as well as the overall sign-out agreement percentages, and are provided the option of reviewing groups of cases by agreement status (Figure 3). The RCT autonomously classifies each case into its respective organ system/subspecialty class based on diagnostic keywords. This allows residents to review organ-specific performance for a particular sign-out or for multiple sign-outs covering any date range. The RCT archives all relevant data, and all users have complete access to their entire sign-out records, statistics, and evaluations for any time period (Figure 3).
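The autonomous organ-system classification described above can be sketched as a keyword lookup over the diagnosis text. RCT's actual keyword table is not published; the mapping below is invented purely for illustration.

```python
# Hypothetical keyword table; RCT's actual mapping is not published.
ORGAN_KEYWORDS = {
    "gastrointestinal": ["colon", "gastric", "esophag", "tubular adenoma"],
    "genitourinary": ["prostat", "bladder", "renal"],
    "dermatopathology": ["melanocytic", "basal cell", "seborrheic"],
}

def classify_case(diagnosis_text):
    """Assign a case to an organ system by the first matching keyword."""
    text = diagnosis_text.lower()
    for organ_system, keywords in ORGAN_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return organ_system
    return "unclassified"
```

For example, `classify_case("Tubular adenoma of the colon")` returns `"gastrointestinal"` under this toy table; aggregating such labels over a date range yields the organ-specific performance statistics residents review.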

Frozen section evaluations are also recorded in RCT. After a frozen section is completed, the resident logs in to RCT and generates a real-time frozen section evaluation. The faculty member selects their assessment of the resident via dropdown boxes on both technical performance and microscopic interpretation of the frozen section. The software keeps a record of all frozen sections performed by the resident and evaluations throughout the course of residency.

Additional report-generating functionality is provided to privileged administrative users (eg, PDs and CCC members; Supplemental Figure 5). The administrative portion of RCT allows these particular users to review sign-out statistics/discrepant cases (Figure 4), a variety of evaluation summaries (Figure 5 and Supplemental Figure 6), and organ system performance at both an individual and a residency-wide level for any date range (Figure 6). Evaluative privileged users are also able to efficiently review historical resident performance on frozen sections. A “Faculty Discrepancy Summary” lists each faculty member, how many individual cases they have reviewed, and their percentage of “Agrees,” “Partial Agrees,” and “Disagrees” given over any date range. The administrative section also notifies the PDs of the number of evaluations requiring their review and acknowledgment. Finally, the administrative section provides the ability to add and remove faculty/resident users and reset passwords and provides a list of number of incomplete sign-out and individual case records for each faculty and resident user.

Figure 4

Overall discrepancy report. The discrepancy report lists the total number of cases evaluated for a particular resident as well as the absolute number and percentages of discrepancies for any date range. Lists of the discrepant cases categorized by discrepancy level can be reviewed using the “Open” button.

Figure 5

Administrative individual sign-out review. This functionality provides more granular detail by providing a list of every sign-out for a particular resident for any date range. For each sign-out, faculty comments are displayed, as are the discrepancy statistics for that particular sign-out. If necessary, the individual case lists and records can be reviewed by using the “Open” buttons under the “Signout Statistics.”

Figure 6

Residency-wide subspecialty performance. Administrative users are able to generate sign-out discrepancy performance for each organ system for any date range on either an individual resident or a global residency basis.


A significant increase in the “evaluation capture percentage” (ECP) was observed in the initial 36 months of implementation. The “sign-out ECP” is defined as the number of completed sign-out evaluations divided by the total number of sign-outs. The “case ECP” is defined as the number of completed individual case evaluations divided by the total number of individual cases. Prior to the implementation of RCT (academic years 2014–2017), individual case feedback was provided by faculty on 334 of 102 230 cases (0.3% case ECP) and 710 of 1873 individual sign-outs (37.9% sign-out ECP). After the implementation of RCT (academic years 2017–2020), individual case feedback was provided by faculty on 33 685 of 89 008 cases (37.9% case ECP) and 1073 of 1482 individual sign-outs (72.4% sign-out ECP). This represents a 12 533% (∼126-fold) and 91% increase in the case and sign-out ECPs, respectively.
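As a worked check, the sign-out ECP figures above follow directly from the definition and the reported counts (the case ECP uses the same formula):

```python
def ecp(completed, total):
    """Evaluation capture percentage: completed evaluations / total opportunities."""
    return 100.0 * completed / total

# Reported sign-out counts, before (2014-2017) and after (2017-2020) RCT.
pre = ecp(710, 1873)    # -> 37.9% sign-out ECP
post = ecp(1073, 1482)  # -> 72.4% sign-out ECP
relative_increase = 100 * (post - pre) / pre  # -> ~91%
```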

Timely and consistent feedback is a critical component of resident education. With prompt feedback, evaluators recollect events more clearly and accurately, and learners can implement immediate corrections.2  Nonetheless, multiple factors may hinder feedback efficiency. Hesketh and Laidlaw3  identified a number of barriers that prevent effective evaluations within the medical learning environment. These include “fear of upsetting the trainee” and “generalized feedback lacking specific and precise observations.” “Fear of upsetting the trainee” can be a particularly strong barrier in the military environment because there is a high probability a particular trainee will be a colleague in the near future. Faculty time constraints are also a commonly cited reason for delay in feedback.4 

In our residency program, we have attempted to overcome these barriers with the use of real-time, granular, and objective individual case feedback during sign-out. Historically, this was accomplished with the use of rather cumbersome paper discrepancy sheets, with predictable suboptimal faculty compliance. In our experience, feedback reluctance is lessened and quality of evaluations enhanced when faculty are provided with more specific, objective, and limited evaluative choices on an individual case basis. However, the pursuit of more granular feedback results in a separate set of issues, including copious amounts of paperwork to manually review, lost paper evaluations, and reduced faculty completion. Transitioning evaluative feedback to electronic methods has numerous benefits, including less cost and administrative burden.5  Consequently, RCT was created to provide a venue for granular feedback while taking advantage of the benefits of electronic methodologies.

Implementation of the software has reduced evaluation loss and increased faculty compliance and accountability. After implementation, there was a significant improvement in the amount of captured feedback, as evidenced by the 91% increase in overall sign-out evaluations and an approximately 126-fold increase in individual case evaluations captured in the 3 academic years after RCT was introduced. Traditional electronic evaluation systems require the program coordinator or director to manually generate evaluations based on resident schedules. These systems also require faculty to sign in to a separate evaluation system sometimes weeks or months after the sign-out was completed. In contrast, RCT is integrated into the daily pathologist workflow and automatically generates evaluations for completion prior to the faculty member even meeting with the resident. Thus, faculty are able to readily complete individual case feedback throughout the course of the sign-out and are provided with an overall sign-out evaluation ready for immediate completion after the sign-out has ended.

Additionally, PDs and CCC members are able to review all data with autonomously generated analytic reports (Figures 4 through 6 and Supplemental Figure 6). The increased feedback has provided a multitude of additional data points for PDs and CCC members to consider when evaluating resident progression. Consequently, a single negative evaluation can be interpreted within a more judicious context. Our CCC uses RCT both before and during biannual resident evaluations. If there is concern a trainee is underperforming based on overall evaluation summaries, the CCC will review individual case discrepancies and make recommendations for targeted study and remediation for that particular resident. Residents also directly benefit from increased data collection by having access to every completed sign-out with the ability to learn from discrepant cases while also seeing improvements in diagnostic acumen over time (Figure 3).

Although the amount of data has increased, the effort required to analyze these data has markedly decreased. Time-intensive manual review and statistical calculations of discrepant cases are no longer required because RCT automatically generates reports with statistical calculations for any date range. Also, faculty comments for any date range are conveniently collected into a single report for efficient review by the CCC and PDs. The RCT also provides data on global trends in the holistic performance of the residency program. We are able to efficiently monitor subspecialty diagnostic accuracy through the report-generation capability of RCT (Figure 6). If residents are underperforming in any particular subspecialty, additional lectures and slide sessions focusing on these areas can be implemented to address deficits.

Immediately following residency, a large proportion of military pathology residents will serve as general pathologists. Fellowship training opportunities are limited by individual military service needs, with only a small number of subspecialty training opportunities typically offered each year. Accordingly, military residents must be prepared to function more independently immediately out of residency training than their civilian colleagues, who may complete 1, 2, or more fellowships after graduation before entering practice. Accelerated graduated responsibility is a necessity in this environment. By the PGY-4 level, residents are expected to take full ownership of surgical pathology cases, including ordering levels and special stains and consulting with subspecialists as needed. The RCT provides a simulated “junior faculty experience” wherein a PGY-4 resident retains this ownership of cases and “signs out” the case within the software. Once the case is finalized in RCT, the record is electronically “signed” by the resident, locked to further resident editing, and transferred to the attending faculty for review. This provides a sense of permanence and import to their diagnosis through simulation of the initial sign-out experience of a junior faculty member.

Resident Case Tracker was designed as a software program separate from the laboratory information system (LIS) for several reasons. Although many of the current pathology LIS software options allow for discrepancy tracking of users, this feedback essentially becomes a component of the permanent medical record. Albeit unlikely, this may have medicolegal ramifications if the record is ever parsed during litigation. Additionally, the data analysis functionality for resident education within commercial LIS platforms is at a much less sophisticated level than that provided by RCT. One of the drawbacks of a separate system, however, is the need to have 2 separate programs open while reviewing cases. Nonetheless, the additional time required to use RCT as a separate program is minimal. Residents are able to readily “copy and paste” their diagnoses from the LIS into RCT during previewing.

Faculty time constraints are also taken into account in the design of RCT. Unlike an integrated discrepancy tracker within the LIS, RCT supplies the ability for faculty to provide feedback on discrepant cases with automatic agreement on the remainder of the cases in the sign-out. User feedback on the software has shown that use of RCT only adds about 10 to 15 minutes of additional time to the average sign-out. With an average volume of approximately 30 000 to 35 000 routine surgical pathology cases annually, Brooke Army Medical Center is the largest medical facility in the Department of Defense and receives specimens from more than 90 military treatment facilities throughout the world. Before and after the software evaluation period, 2 surgical pathology staff were assigned to cover the general surgical pathology service each day and were responsible for approximately 40 to 60 cases each. The required time commitment to use the software is based on the individual feedback philosophy of the staff. Some staff pathologists give detailed feedback on each individual case, whereas others give feedback solely via the dropdown boxes for level of agreement. When only the dropdown boxes are used, only a few additional minutes are added per sign-out. Conversely, detailed comments on each discrepant case may add an additional 20 to 30 minutes to the sign-out.

Staff and resident users have described RCT as being “reliable and very user friendly” and “a great tool to provide feedback to residents.” One particular staff member (who provides residents with very detailed feedback) reports that use of the software adds on average “about 20 minutes per signout.” However, they also report that they “would easily spend more time providing feedback without RCT” because they “would still take the time to convey important feedback to residents,” but it would be through less efficient methodologies, such as email. Other feedback is that RCT is a “great tool” and a “huge improvement from what was used to give feedback/assess residents at the institution I trained.” A member of the CCC appreciates the administration and data analytic functionality that “allows select individuals to look up essential data for monitoring progress.” Multiple residents have also provided positive feedback on the ability to have an archived record of all of their sign-outs throughout residency.

Resident Case Tracker was written in the Visual Basic for Applications framework within Microsoft Access. There are several advantages and disadvantages to using this development platform. The overarching rationale in this design decision was governmental software installation restrictions. Microsoft Office and the accompanying Microsoft Access database software are nearly ubiquitous on both governmental and nongovernmental computer systems. This universality is advantageous for deployment on most computers because the purchase of additional software packages to run RCT is not required on a majority of systems. Nonetheless, Microsoft Access is not available on macOS (Apple Inc) or Linux-based operating systems, and this remains a potential disadvantage of the platform.

Resident Case Tracker is not intended as a surrogate for the conventional individual sign-out between resident and faculty within our program. Personal and individual instruction is still critical to learning the art of histologic interpretation. Nonetheless, RCT has been a beneficial tool as a social distancing platform during the current restrictions of the coronavirus disease 2019 (COVID-19) pandemic. During the height of the outbreak, our program suspended one-on-one sign-outs in an effort to maximize social distancing and mitigate the spread of the virus among personnel. The RCT provided the necessary tools for in-depth, individual case feedback to continue even in the absence of a conventional sign-out session.

Resident Case Tracker is a continually evolving software platform, with incorporation of additional features based on faculty and/or resident recommendation. We have recently implemented integrated frozen section functionality, which allows residents to generate a real-time frozen section evaluation for immediate faculty completion. Because feedback is most useful when bidirectional, we plan on implementing anonymous resident-to-faculty feedback functionality for each sign-out. The feedback will be compiled and presented to faculty in a summary report every 3 to 6 months.

In summary, RCT is an invaluable tool for our residency program and has provided unparalleled feedback and data analytics. Accordingly, we would like other residency programs to benefit from this software. In the near future, we will likely offer RCT on a complimentary basis through an open-source license to other training programs desiring to benefit from its ability to enhance faculty communication, objective feedback, and the overall resident experience during training.

1. Naritoku WY, Alexander CB, Bennett BD, et al. The pathology milestones and the next accreditation system. Arch Pathol Lab Med. 2014;138(3):307–315.

2. Ramani S, Krackov SK. Twelve tips for giving feedback effectively in the clinical environment. Med Teach. 2012;34(10):787–791.

3. Hesketh EA, Laidlaw JM. Developing the teaching instinct, 1: feedback. Med Teach. 2002;24(3):245–248.

4. Zehra T, Tariq M, Ali SK, Motiwala A, Boulet J. Challenges of providing timely feedback to residents: faculty perspectives. J Pak Med Assoc. 2015;65(10):1069–1074.

5. Rosenberg ME, Watson K, Paul J, Miller W, Harris I, Valdivia TD. Development and implementation of a web-based evaluation system for an internal medicine residency program. Acad Med. 2001;76(1):92–95.

Author notes

Supplemental digital content is available for this article at https://meridian.allenpress.com/aplm in the June 2022 table of contents.

The author has no relevant financial interest in the products or companies described in this article.

This project was presented via platform at the 2020 United States and Canadian Academy of Pathologists Annual Meeting; March 2, 2020; Los Angeles, California.

Competing Interests

The views expressed herein are those of the author and do not reflect the official policy or position of Brooke Army Medical Center, the US Army Medical Department, the US Army Office of the Surgeon General, the Department of the Army, the Department of the Air Force and Department of Defense, or the US government.