Context.—

Pathology reports are the primary means by which results are communicated to other physicians. For various reasons, the diagnosis may be qualified on a spectrum of uncertainty.

Objective.—

To examine communication of diagnostic uncertainty, an underexamined source of possible medical error. To our knowledge, no prior study has examined pathology reports across multiple institutions. This study seeks to identify commonly used phrases of diagnostic uncertainty and their interpreted meanings among the surgical pathologists who write reports and the clinicians who read them.

Design.—

Anonymous surveys were completed at 3 major US academic institutions by 18 practicing staff pathologists, 12 pathology residents, 53 staff clinicians, and 50 resident/allied health professional clinicians at 5 standard tumor boards. All participants rated the percentage certainty associated with 7 diagnostic terms. Pathologists answered 2 questions on the extent to which a comment can clarify a diagnosis and their comfort in wording pathology reports. Clinicians answered questions on how often they read a pathology report comment, whether they found the comment helpful, and how comfortable they were reading pathology reports.

Results.—

A wide range in percentage certainty was found for each of the 7 diagnostic phrases. Staff and resident pathologists and clinicians alike showed wide variability in interpreting the phrases. Twenty-five of 53 staff clinicians (47%) were very comfortable reading a pathology report, whereas only 4 of 50 resident clinicians (8%) were very comfortable doing so. Thirty-four of 53 staff clinicians (64%) reported always reading the comment, yet only 20 of 53 (38%) always found the comment helpful. The phrases “diagnostic of” and “consistent with” had the strongest agreement in meaning. The phrases with the weakest agreement were “suspicious for” and “compatible with.”

Conclusions.—

Efforts to standardize diagnostic terms may improve communication.

A surgical pathologist follows a basic stepwise method for rendering a diagnosis. Tissue is taken from the patient, processed into histopathology slides, and then examined for such features as tissue architecture and cellular morphology using a light microscope. The results of the pathologist's examination of histopathology slides are then reported to the clinicians in a written surgical pathology report. The primary goal of such reports is to communicate a diagnosis. For various reasons, the diagnosis may be qualified on a spectrum of uncertainty.

To communicate uncertainty regarding a diagnosis, a pathologist can insert qualifying comments into different portions of the surgical pathology report. For example, the line diagnosis can read “most consistent with adenocarcinoma.” This immediately communicates some level of uncertainty in the diagnosis by the surgical pathologist to the reader of the report. A free-text comment field on a surgical pathology report may give the pathologist the opportunity to further discuss case-specific factors. Several factors could contribute to a pathologist's being unable to render an unequivocal diagnosis, including unusual histomorphology, ambiguous immunohistochemical staining results, lack of clinical information, uncertain diagnostic criteria in the medical literature, lack of experience, or a desire to avoid incurring legal liability from a medical error.1 Because of their scientific complexity and the immense heterogeneity of circumstances, pathology reports are notoriously difficult to read.2

Further complicating communication of ambiguity is the structure of an academic training environment. Trainee pathologists often generate reports that will be edited and signed by staff pathologists. In turn, trainee clinicians will read the generated reports, but often do so separately from staff clinicians overseeing them. Thus, variation in experience, clinical context, and specialty can subtly influence the interpretation of the surgical pathology report.

Communication of uncertainty is an underexamined source of possible medical error. Although studies in radiology have examined the terminology used to convey diagnostic certainty in radiologists' reports,3–8 to date, few studies have examined surgical pathologists' objectives and attitudes in terms of the percentage certainty implied by specific phrases that serve as modifiers or adjuncts to common line surgical pathology diagnoses.1,9,10 Building upon prior studies, especially that of Bracamonte et al,9 this study seeks to identify commonly used phrases of diagnostic uncertainty and their interpreted meanings among the surgical pathologists who write and the clinicians who read reports at multiple academic institutions across diverse geographic regions.

Three separate institutional review board approvals were obtained at the University of Utah in Salt Lake City, Utah; Duke University in Durham, North Carolina; and Stanford University in Palo Alto, California. Authors working at each site were responsible for distribution and collection of surveys.

In phase 1 of our study, an anonymous questionnaire was given to practicing attending pathologists and pathology residents during departmental meetings. Participation was voluntary and without compensation. Each survey had 10 identical questions and asked participants to rate the degree of certainty associated with 7 commonly used diagnostic terms identified in previous studies. The 7 diagnostic phrases queried are listed in Table 1.

Table 1

Diagnostic Phrases


Two additional questions were asked. The first question was, “To what extent does a comment allow you to clarify a line diagnosis that is not pathognomonic?” The responses for this question were rated on a qualitative scale. The choices were “not at all,” “somewhat,” “well,” and “very well.” The second question was, “How comfortable are you wording pathology reports?” The responses for this question were rated on a qualitative scale. The choices were “not at all comfortable,” “somewhat uncomfortable,” “neutral,” “somewhat comfortable,” “comfortable,” and “very comfortable.” Eighteen staff pathologists and 12 resident pathologists completed the surveys.

In phase 2 of the study, an anonymous questionnaire was given to staff and nonstaff clinicians at 5 standardized interdisciplinary tumor boards or conferences. The disease or organ focuses of the conferences were genitourinary, melanoma, gynecology, gastrointestinal, and breast. Participation was voluntary and without compensation. Each survey had identical questions that included the same 7 diagnostic phrases used previously in the pathologists' questionnaires. Additionally, 5 scenario questions asked respondents to rate how well a tumor board/interdisciplinary team, phone call with a pathologist, face-to-face meeting with a pathologist, email with a pathologist, or text message with a pathologist could clarify a diagnosis. The responses for these questions were rated on a qualitative scale. The choices were “not at all,” “somewhat,” “well,” and “very well.” Three additional questions were asked pertaining to reading a pathology report. Question 1 queried, “How often do you read the comment portion of a pathology report?” Question 2 asked, “Do you find the comment helpful in clarifying the diagnosis?” The responses for these questions were rated on a qualitative scale. The choices were “never,” “rarely,” “sometimes,” and “most of the time.” Finally, question 3 asked, “How comfortable are you with reading pathology reports?” The responses were rated on a qualitative scale. The choices were “not at all comfortable,” “somewhat uncomfortable,” “neutral,” “somewhat comfortable,” “comfortable,” and “very comfortable.” All questions had to be answered to complete the survey. Fifty-three staff clinicians and 50 nonstaff (ie, resident or allied health professional) clinicians completed the questionnaire. Descriptive statistics, including mean, median, minimum, maximum, and standard deviation, were calculated using Microsoft Excel.
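The descriptive statistics named above are straightforward to reproduce outside of Excel. The sketch below uses hypothetical response values, not the study's data, and computes the same summary measures in Python; note that `statistics.stdev` is the sample standard deviation, matching Excel's STDEV function.

```python
# Hedged sketch: reproduces the study's descriptive statistics (mean, median,
# minimum, maximum, standard deviation) for one phrase and one respondent
# group. The response values below are hypothetical, for illustration only.
from statistics import mean, median, stdev

# Hypothetical percentage-certainty ratings (0-100) for a single phrase
responses = [60, 75, 80, 55, 90, 70, 65, 85]

summary = {
    "mean": mean(responses),
    "median": median(responses),
    "min": min(responses),
    "max": max(responses),
    "sd": stdev(responses),  # sample SD, as computed by Excel's STDEV
}
```

The same calculation repeated per phrase and per group (staff pathologists, resident pathologists, staff clinicians, nonstaff clinicians) yields the values summarized in Table 2 and Figures 1 through 4.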

Responses demonstrated a wide range of percentage certainty for each of the 7 common free-text descriptive phrases across all 4 categories of respondents (Figures 1 through 4). This supports the hypothesis that there is little consensus among staff pathologists, nonstaff (resident) pathologists, staff clinicians, and nonstaff clinicians (trainees and allied health professionals) as to the percentage certainty communicated by common phrases used in free-text comment areas of surgical pathology reports. For phrases that tend to convey the most certainty, such as “diagnostic of,” there was a narrower range of disagreement as to their meaning. Words such as “favor,” or phrases such as “suspicious for,” that communicate a greater level of uncertainty were more variable in their implications for different individuals. The standard deviation of the percentage certainty the writer attempted to communicate varied greatly (Table 2). The phrase “compatible with” could mean as low as 1% certainty for one staff pathologist and as high as 100% certainty for another (Figure 1). Staff clinicians and nonstaff clinicians (ie, residents or allied health professionals) (Figures 3 and 4) demonstrated a wide range of interpretations for most common free-text phrases. Twenty-three of 53 staff clinicians (43%) felt that email could clarify a diagnosis very well. In contrast, only 14 of 50 nonstaff resident clinicians (28%) felt email could clarify a diagnosis very well. For some residents, texting was a “not at all” solution (Figures 5 and 6).

Figure 1

Staff pathologists—percentage certainty. Black vertical lines represent range of responses; blue circles, means; y-axis, percentage certainty; x-axis, phrase used in line diagnosis.

Figure 2. Trainee pathologists—percentage certainty. Black vertical lines represent range of responses; blue circles, means; y-axis, percentage certainty; x-axis, phrase used in line diagnosis.

Figure 3

Staff clinicians—percentage certainty. Black vertical lines represent range of responses; blue circles, means; y-axis, percentage certainty; x-axis, phrase used in line diagnosis.

Figure 4. Nonstaff clinicians—percentage certainty. Black vertical lines represent range of responses; blue circles, means; y-axis, percentage certainty; x-axis, phrase used in line diagnosis.

Table 2

Standard Deviations of Each Group by Percentage

Figure 5

Staff clinicians—preferred modes for communication. Numbered bars are numbers of respondents.

Figure 6. Nonstaff clinicians—preferred modes for communication. Numbered bars are numbers of respondents.


It is noteworthy that 25 of 53 staff clinicians (47%) felt very comfortable reading the comment portion of a pathology report (Figure 7). In contrast, only 4 of 50 nonstaff clinicians (8%) felt very comfortable (Figure 7). Of the attending pathologists surveyed, 9 of 18 (50%) felt very comfortable writing a pathology report (Figure 8). Only 2 of 12 resident pathologists (17%) felt the same (Figure 8). In terms of using the comment portion of a report, 34 of 53 staff clinicians (64%) always read the comment, with 20 of 53 (38%) finding the comment always helpful (Figure 9). In contrast, fewer nonstaff clinicians always read the comment (22 of 50; 44%), and fewer (8 of 50; 16%) always found the comment helpful (Figure 10). Staff pathologists placed value on the comment's ability to clarify a diagnosis (Figure 11). Resident pathologists echoed this sentiment in roughly the same numbers (Figure 12). Of note, percentages were rounded to the nearest whole number.

Figure 7

Staff and nonstaff clinicians—comfort reading report. Numbered bars are numbers of respondents.

Figure 8. Staff and resident pathologists—comfort reading report. Numbered bars are numbers of respondents.

Figure 9

Staff pathologists—comfort writing report. Numbered bars are numbers of respondents.

Figure 10. Resident pathologists—comfort writing report. Numbered bars are numbers of respondents.

Figure 11

Staff clinicians—how often do you read the comment of a pathology report? Numbered bars are numbers of respondents.

Figure 12. Nonstaff clinicians—how often do you read the comment of a pathology report? Numbered bars are numbers of respondents.


Pathologists use a variety of phrases to communicate diagnostic certainty, but the understood level of certainty of those phrases varies widely among pathologists and clinicians. A recent study1 demonstrated not only wide variability in the interpretation of diagnostic phrases across levels of training and specialty, but also the number of “waffle words” used by pathologists by age group. A previous British study11 showed variability in the intended level of certainty, rated on a 1-to-5 scale, for the 13 most common diagnostic phrases used at a British hospital. Another study12 found wide variability in the percentage certainty of 7 diagnostic phrases among clinicians, pathology attendings, pathology residents, and medical students. Pathologists' individual preferences for specific diagnostic phrases were examined in a study of veterinary clinical pathologists,13 which found 79 unique diagnostic phrases in use. This variability was associated with such factors as the implications of the diagnosis and pathologists' prior experience in diagnostic pathology. Generally, there is variability in the average perceived percentage certainty for nearly any given phrase communicated in a medical report.1 A possible confounding factor is variability in clinicians' recall of the content of reports. In one study, 30% of surgeons and surgery residents answered incorrectly when presented with an open-book examination–style questionnaire about the contents of anatomic pathology reports they had recently seen.14 Our current study demonstrates that a high degree of variability exists in both the uses of diagnostic phrases in surgical pathology reports and the interpretation of the report's meaning by the reader.

This study is, to our knowledge, the first to show that wide variation in the interpretation of diagnostic phrases persists across broad geographic and practice regions at large tertiary academic medical centers. The understood level of certainty for each phrase varies widely among staff pathologists, staff clinicians, nonstaff clinicians (ie, resident clinicians and allied health professionals), and pathology residents. Our survey results suggest that a gap can exist between the percentage certainty a pathologist intends to communicate via free-text comments and a clinician's interpretation of those comments in a surgical pathology report. Although the means for each diagnostic phrase do not differ greatly between groups, the degree of variation is striking and potentially problematic. Phrases such as “suspicious for” or “cannot rule out” introduce some of the greatest ambiguity and variation in meaning precisely when the clearest communication is necessary for a difficult case. Similar to previous studies, our results suggest pathologists' and clinicians' interpretations of phrases can vary widely, even when the descriptive terms are clearly communicated by free-text comments.1,9,12

The average level of certainty for staff pathologists and pathology residents for each phrase is approximately the same for many free-text comment words and phrases. For example, for the phrase “suggestive of,” staff pathologists average 59% certainty and pathology residents average 67% certainty. However, differences in ranges of percentage certainty were revealing. For example, pathology residents felt “suggestive of” could mean that the surgical pathologist's level of certainty fell in a 48% to 80% range. In contrast, staff pathologists indicated “suggestive of” could mean as low as 1% certainty or as high as 90% certainty. The ranges of responses for staff pathologists tended to be broader for all of the terms examined in this study (Figures 1 and 2). This supports the hypothesis that the ranges of variability for percentage certainty for surgical pathology diagnoses expressed by residents have the potential to be dramatically altered during training as pathology residents learn which diagnostic phrases to use, in what context, and additional nuances of the uses of the terms. Studies of larger numbers of residents, in each year of training, are needed to determine if residents' percentage certainty is actually influenced by years of training.

Just as pathology residents and staff differed in the ranges of percentage certainty they assigned to common phrases used in free-text comments of surgical pathology reports, clinical staff and clinical nonstaff showed analogous differences in the ranges of their responses when queried on their interpretations of selected phrases (Figures 3 and 4).

Clinicians appeared to have clear preferences in the way they communicate with a pathologist (Figures 5 and 6). Staff clinicians and residents generally agreed that tumor boards/interdisciplinary conferences, face-to-face meetings, and phone calls were the best ways to clarify a diagnosis. A sizeable portion of nonstaff clinician respondents reported email or text message to be somewhat or not at all beneficial at clarifying a diagnosis. Interestingly, this opinion was not shared by staff clinicians, the majority of whom maintained that email or text message could clarify a diagnosis well or very well.

Our data show that staff clinicians are far more comfortable than nonstaff clinicians in reading a pathology report (Figure 7). This gap identifies the need for more formal training in reading and writing reports. As in previous studies,9 clinicians appeared to not always read the comment portion of a pathology report (Figures 9 and 10). This has potentially serious implications, as many pathologists rely on the comment portion of the report to communicate important diagnostic information. However, as previous work has shown, a comment contextualizing the diagnosis, its level of certainty, and other differential diagnostic possibilities may be insufficient to overcome recall inaccuracy, let alone variability in the uses of diagnostic phraseology.14 Staff and resident pathologists continue to report high confidence in the ability of the diagnostic comment to clarify a diagnosis.9 Interestingly, although the surveys had no free-response field, numerous clinicians wrote comments in the margins. For example, one clinician wrote: “I find the use of ‘suggestive of,' ‘suspicious for,' ‘compatible with' and ‘cannot rule out' problematic and upsetting to patients and difficult to explain.” This is a poignant example of the challenges clinicians face not only in interpreting pathology reports but also in the subsequent explanations to patients and their families.

There are limitations to this study, which we acknowledge. Tumor boards at the 3 institutions are not homogeneously structured, nor do they occur at the same interval. Tumor boards generally include the most complex cases, which may be more ambiguous than those in average pathology reports. Each of the 3 institutions is a large tertiary academic medical center, so these results may differ in private hospital systems. It is plausible, as found in one study13 but not replicated in another,14 that experience may influence both the mean percentage certainty and the variability in responses expressed among peers. No effort was made in this study to account for the age or expertise of respondents; level of training (staff versus nonstaff) was instead used as a surrogate marker. Furthermore, nonstaff clinicians included a mix of trainee clinicians and allied health professionals. This was done for pragmatic reasons, but the 2 groups had large differences in their training backgrounds; on the other hand, both were likely equally unexposed to pathology reports in the clinical setting. The sample of pathologists was much smaller than that of clinicians, which could mean additional perspectives were missed; one explanation is that fewer pathologists than clinicians attend tumor boards. Subgroup analysis by institution or clinician specialty could yield further insight into the specific discrepancies between pathologist and clinician interpretation.

The differences found among pathologists and clinicians do not imply that one group is right or that resident understanding is lacking. The College of American Pathologists and other advisory bodies have yet to standardize diagnostic phrasing. An individual's training, personality, proficiency in the English language, location, and coworkers are all factors that might affect one's understanding and use of diagnostic phrases.

Interestingly, few respondents felt that one could be 100% certain of a pathologic diagnosis. Although this is an intriguing philosophical point, there is a practical legal aspect to this belief. The fear of the legal ramifications of overcommunicating or undercommunicating a diagnosis may affect a pathologist's interpretation and use of diagnostic phrasing. Indeed, expressing uncertainty out of extreme caution when none is warranted dilutes the value of the diagnostic phrase.

Surveys can demonstrate the wide range in understanding of each diagnostic phrase, as confirmed in this study. This problem might be addressed by writing a key conveying the relative level of certainty each phrase carries (for example, “diagnostic of” > “compatible with” > “suspicious for”). This way, the examining pathologist and the interpreting clinician could quickly reference what each diagnostic phrase means in terms of the approximate certainty it conveys. On the other hand, if additional physician-pathologist communication is often needed to clarify the pathologist's intended meaning in free-text comments, perhaps more descriptive free-text comments should become the standard of care.
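As a rough illustration of how such a key might be encoded in a structured reporting system, the sketch below maps each phrase to a certainty range. The ordering follows the example above, but the specific percentage ranges are illustrative assumptions, not recommendations from this study or any advisory body.

```python
# Hedged sketch of a hypothetical, locally agreed certainty key.
# The percentage ranges are invented for illustration; a real key would
# require consensus from a national organization or local department.
CERTAINTY_KEY = {
    "diagnostic of": (95, 100),
    "consistent with": (80, 95),
    "compatible with": (60, 80),
    "favor": (50, 70),
    "suggestive of": (40, 60),
    "suspicious for": (25, 50),
    "cannot rule out": (5, 25),
}

def certainty_range(phrase: str) -> tuple:
    """Look up the agreed certainty range (percent) for a phrase, case-insensitively."""
    return CERTAINTY_KEY[phrase.lower()]
```

With such a table published alongside reports, a clinician reading “suspicious for” could look up the intended range rather than guessing, and a pathologist could choose the phrase whose range matches the intended certainty.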

To establish such a key, national organizations could reach a consensus on the percentage certainty a given diagnostic phrase denotes. This could help standardize pathology reports and make the words used in line diagnoses more uniform and more clearly understood by the reader of the report. Local departments of pathology, or private practices, could agree on local standards of phrasing and educate clinical colleagues as to the intended meanings of phrases. Further work can also be done to consider the patient as a reader of the report, with efforts undertaken to aid the patient's comprehension. More formal training on writing pathology reports would help pathology residents become more adept in their phraseology and use more specific words appropriate to specific situations. This would not only increase their accuracy in communicating results but also create educational opportunities to discuss why a specific histopathology meets, or fails to meet, the thresholds needed for a definitive diagnosis. Along these lines, formal training efforts could be established to educate nonstaff clinicians on best practices for reading a pathology report. Taken together, these efforts could greatly enhance the accuracy and use of pathology reports in modern medicine.

1. Lindley SW, Gillies EM, Hassell LA. Communicating diagnostic uncertainty in surgical pathology reports: disparities between sender and receiver. Pathol Res Pract. 2014;210(10):628–633.
2. Prabhu AV, Kim C, Crihalmeanu T, et al. An online readability analysis of pathology-related patient education articles: an opportunity for pathologists to educate patients. Hum Pathol. 2017;65:15–20.
3. Khorasani R, Bates DW, Teeger S, Rothschild JM, Adams DF, Seltzer SE. Is terminology used effectively to convey diagnostic certainty in radiology reports? Acad Radiol. 2003;10(6):685–688.
4. Clinger NJ, Hunter TB, Hillman BJ. Radiology reporting: attitudes of referring physicians. Radiology. 1988;169(3):825–826.
5. Sobel JL, Pearson ML, Gross K, et al. Information content and clarity of radiologists' reports for chest radiography. Acad Radiol. 1996;3(9):709–717.
6. Bastuji-Garin S, Schaeffer A, Wolkenstein P, et al. Pulmonary embolism: lung scanning interpretation: about words. Chest. 1998;114(6):1551–1555.
7. Wyatt RJ, Julian BA, Galla JH. Properdin deficiency with IgA nephropathy. N Engl J Med. 1981;305(18):1097.
8. Toogood JH. What do we mean by “usually”? Lancet. 1980;1(8177):1094.
9. Bracamonte E, Gibson BA, Klein R, Krupinski EA, Weinstein RS. Communicating uncertainty in surgical pathology reports: a survey of staff physicians and residents at an academic medical center. Acad Pathol. 2016;3:2374289516659079.
10. Ducatman BS, Hashmi M, Darrow M, Flanagan MB, Courtney P, Ducatman AM. Use of pathology data to improve high-value treatment of cervical neoplasia. Acad Pathol. 2016;3:2374289516679849.
11. Attanoos RL, Bull AD, Douglas-Jones AG, Fligelstone LJ, Semararo D. Phraseology in pathology reports: a comparative study of interpretation among pathologists and surgeons. J Clin Pathol. 1996;49(1):79–81.
12. Galloway M, Taiyeb T. The interpretation of phrases used to describe uncertainty in pathology reports. Pathol Res Int. 2011;2011:656079.
13. Christopher MM, Hotz CS. Cytologic diagnosis: expression of probability by clinical pathologists. Vet Clin Pathol. 2004;33(2):84–95.
14. Powsner SM, Costa J, Homer RJ. Clinicians are from Mars and pathologists are from Venus. Arch Pathol Lab Med. 2000;124(7):1040–1046.

Author notes

The authors have no relevant financial interest in the products or companies described in this article.

Presented at the United States and Canadian Academy of Pathology Annual Meeting; March 19, 2018; Vancouver, British Columbia, Canada.