Context.—Cancer Care Ontario implemented synoptic pathology reporting across Ontario, affecting the practice of pathologists, surgeons, and medical and radiation oncologists. Compared with narrative reports, standardized synoptic pathology reports offer enhanced completeness and improved consistency; reported challenges include increased workload and longer report turnaround time.

Objective.—To determine the impact of synoptic pathology reporting on physician satisfaction specific to practice and process.

Design.—A descriptive, cross-sectional design was utilized involving 970 clinicians across 27 hospitals. An 11-item survey was developed to obtain information regarding timeliness, completeness, clarity, and usability. Open-ended questions were also employed to obtain qualitative comments.

Results.—A 51% response rate was obtained, and descriptive statistics indicated that physicians perceived synoptic reports as significantly better than narrative reports. Correlation analysis revealed a moderately strong, positive relationship between respondents' perceptions of overall satisfaction with the level of information provided and their perceptions of completeness for clinical decision making (r = 0.750, P < .001) and ease of finding information for clinical decision making (r = 0.663, P < .001). Dependent t tests showed a statistically significant difference between the satisfaction scores of pathologists and treating physicians (t(169) = 3.044, P = .003). Qualitative comments revealed technology-related issues as the most frequently cited factor affecting timeliness of report completion.

Conclusion.—This study provides evidence of strong physician satisfaction with synoptic cancer pathology reporting as a clinical decision support tool in the diagnosis, prognosis, and treatment of cancer patients.

In 2004, Cancer Care Ontario initiated a pathology reporting project aimed at improving the quality of cancer pathology by standardizing the content, format, and transmission of reports to the Ontario Cancer Registry. This large-scale, change-management project involved more than 400 Ontario pathologists and 100 hospitals producing more than 70 000 cancer pathology reports annually from a population of 12.9 million. Structured pathology reporting was implemented based on the nationally and provincially endorsed College of American Pathologists cancer pathology checklists, utilizing innovative electronic tools in hospital laboratory information systems linked by an electronic pathology network.

Phase 1 of the project focused on the implementation of synoptic pathology reporting for resections pertaining to the 5 main disease site groups: lung, colorectal, prostate, endometrium, and breast. This accounted for approximately 70% of all surgical pathology reports and involved 33 hospitals across Ontario.

High-quality, complete cancer pathology reports describing diagnostic, prognostic, and predictive elements are required for contemporary oncologic practice. Several studies have documented the benefits of structured synoptic cancer pathology reports, including the elimination of missing information,1–4 increased completeness and accessibility of information,5–9 improved information to support clinical decision making and research,3,10–13 and increased clinician satisfaction.14–17

The purpose of this study was to determine the impact of standardized synoptic pathology reporting on physician satisfaction regarding process (eg, timeliness and completeness) and practice (eg, clinical decision making). This paper will report the results of a physician satisfaction survey conducted after phase 1 implementation of synoptic pathology reporting.

A quasi-experimental, cross-sectional, descriptive design was used. The focus of the program evaluation was to determine the impact of implementation of standardized synoptic pathology reporting on physician satisfaction in areas such as process (eg, timeliness of reports, completeness, need for follow-up) and practice (eg, information to support clinical decision making).

Sample

The inclusion criterion for this evaluation study was implementation of synoptic pathology reporting by March 31, 2010, which yielded a total of 33 hospitals available for inclusion. The primary target population included pathologists, as those responsible for generating the pathology reports, and treating physicians (ie, surgeons, medical oncologists, and radiation oncologists), as the end users of the reports. Because the initial phase of synoptic reporting focused on 5 of the College of American Pathologists checklists (ie, lung, breast, colorectal, prostate, and endometrium), physicians involved in treating these 5 cancers were the primary focus. Contact information (e-mail addresses and/or fax numbers) for the identified stakeholder groups was requested and obtained from 27 of the 33 identified hospitals. This resulted in a final sample of 970 physicians: pathologists (n = 252), surgeons (n = 462), medical oncologists (n = 128), and radiation oncologists (n = 128).

Survey Design

The survey items were developed to obtain information on physician perceptions of timeliness, completeness, usability, and accuracy of the reports, as well as overall user satisfaction. Two surveys were developed, one targeted to pathologists and the other to treating physicians (ie, surgeons, medical oncologists, and radiation oncologists). The items were identical on both versions, with the exception of one item on each version that was specific to the respective stakeholder group. The initial items were created by members of the evaluation team and were then reviewed for face and content validity by 8 content experts, including members of the pathology reporting team, radiation oncologists, medical oncologists, and surgeons. The resulting 11-item survey asked participants to compare standardized synoptic reporting with narrative reporting using a 5-point Likert scale (1 = significantly less than narrative reports, 3 = the same as narrative reports, 5 = significantly better than narrative reports). The survey was available in a Web-based or hard-copy format. To enhance response rates, all participants who completed the survey were given the option to enter their name in 1 of 4 cash drawings of $1000 Canadian. A systematic review of methods for improving physician response rates18 identified survey length, mode (Web-based, mail, or phone), and frequency of reminders as key factors to consider when developing a survey process involving physicians; the use of financial incentives has also been shown to increase response rates.19 Because this evaluation was undertaken for the purposes of program review and quality assurance, ethics review was not required, as stipulated under Article 2.5 of the Canadian Tri-Council Policy Statement on the Ethical Conduct for Research Involving Humans.20 Confidentiality of individual responses was assured. Participants had the option of providing their name only if they wished to be included in the cash drawing and/or if they did not wish to receive future reminders or communications regarding the survey.

Data Collection

Data collection followed the Tailored Design Method as described by Dillman.21 For physicians with e-mail contact information, all communication regarding the survey was sent via e-mail and included a hyperlink to the Web-based survey. For those without e-mail, all communications, including the survey, were sent by fax to the clinicians' offices, and a secure fax number was provided for the return of completed paper surveys. Reminder notices were sent every 2 weeks only to those who had not yet returned the survey (based on the names provided on returned surveys, as described above). Survey results were entered into SPSS version 16.0 (SPSS Inc, Chicago, Illinois) for data analysis.

Respondent Demographics

A total of 498 surveys were completed, representing a 51% response rate. The response rate by specialty ranged from 39% (surgeons) to 68% (pathologists), with response rates of 45% and 55% from medical oncologists and radiation oncologists, respectively. The respondents were from a variety of practice settings, with the largest proportion (41%) indicating a teaching hospital affiliated with a regional cancer center as their primary practice setting. Forty-three percent indicated they had 10 to 20 years of experience in practice, followed by those with 21 to 30 years (27%) and those with less than 10 years (25%). When asked about the average number of cancer resection pathology reports completed each month, the largest proportion of pathologist respondents (43%) indicated they completed between 10 and 25 reports per month, on average; a similar proportion of treating physicians (43%) reported reviewing a comparable number of reports each month. Prior to the implementation of standardized synoptic reporting, the most common method of pathology reporting was narrative reporting (74%), with 19% indicating that some form of electronic or synoptic-like pathology reporting was already in place. See Table 1 for respondent demographics.

Table 1. Respondent Demographics

Descriptive statistics showed that, on a 5-point Likert scale with 5 indicating that synoptic reports are significantly better than narrative reports, the vast majority of responding physicians rated the standardized synoptic pathology reports as better than narrative reports on all items, with mean scores ranging from 3.84 to 4.77. Table 2 provides the mean and standard deviation scores for the survey items for pathologists and treating physicians (ie, surgeons, medical oncologists, and radiation oncologists); “na” indicates questions that were not included on the survey for that stakeholder group and therefore have no responses.
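For illustration only, the following is a minimal sketch (in Python with pandas, not the SPSS workflow used in the study) of how per-item mean and standard deviation scores could be summarized; the item names and responses are hypothetical placeholders.

```python
# Minimal sketch: per-item mean and SD for 5-point Likert responses.
# Item names and values are hypothetical, not the study data.
import pandas as pd

items = pd.DataFrame({
    "completeness": [5, 4, 5, 4, 3],
    "clarity":      [4, 5, 5, 4, 4],
    "timeliness":   [3, 4, 3, 5, 4],
})

summary = items.agg(["mean", "std"]).T.rename(columns={"std": "sd"})
print(summary.round(2))
```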

Table 2. Mean (SD) Scores for Survey Items

Regarding the amount of time to produce and receive pathology reports, participants were again provided with a 5-point Likert scale (1 = significantly less than narrative reports, 3 = about the same, and 5 = significantly more than narrative reports). Pathologists reported that the time to produce the reports was greater than that for narrative reports (mean, 3.51; SD = 1.43), and treating physicians similarly reported a slight increase in the time to receive the reports (mean, 3.41; SD = 1.16). The survey then provided the option of indicating the approximate percentage of additional (or reduced) time required. To further understand the impact of synoptic reporting on perceptions of work flow, responses were recoded into a dichotomous (more/less) variable. Results indicated that pathologists who reported synoptic pathology reporting required more time indicated it took 25% to 50% longer to complete.
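As a minimal sketch only (Python with pandas rather than the SPSS procedure used in the study), the recoding of the 5-point time item into a dichotomous more/less variable might look like the following; the column name and cut points are assumptions for illustration.

```python
# Minimal sketch: recode 5-point "time versus narrative reports" responses
# into a dichotomous more/less variable. Values are hypothetical.
import pandas as pd

responses = pd.DataFrame({"time_vs_narrative": [1, 2, 3, 4, 5, 4, 2, 3]})

# 1-2 -> "less", 4-5 -> "more"; "about the same" (3) is excluded from the dichotomy
responses["time_dichotomous"] = responses["time_vs_narrative"].map(
    lambda x: "less" if x <= 2 else ("more" if x >= 4 else pd.NA)
)
print(responses["time_dichotomous"].value_counts())
```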

Qualitative comments accompanying these responses indicated that the major contributor to the increased time was technology-related factors, such as software glitches, upgrades, and information system–related issues. Among the treating physician group, the majority (60%) reported that the amount of time required to obtain the final pathology report was “about the same as” for narrative reports. Thus, although pathologists reported that synoptic reports took longer to produce than narrative reports, the end users of the reports did not perceive a difference in the time required to obtain them.

In terms of overall satisfaction, both groups reported synoptic pathology reports as being better than narrative reports, with treating physicians reporting higher levels of satisfaction with the overall process (mean, 4.52; SD = 0.991) and the level of information provided (mean, 4.85; SD = 0.901) as compared with pathologists' satisfaction with the process (mean, 4.08; SD = 1.34) and level of information provided (mean, 4.08; SD = 1.44).

To determine the factors that may have contributed to satisfaction, a correlation analysis was conducted. Results showed a moderately strong, positive relationship between respondents' perceptions of overall satisfaction with the level of information provided in synoptic reports and their perceptions of the completeness of the reports for clinical decision making (r = 0.750, n = 313, P < .001), comparison with accepted content standards (r = 0.692, n = 313, P < .001), ease of finding information for clinical decision making (clinicians: r = 0.663, n = 314, P < .001; pathologists: r = 0.510, n = 171, P < .001), and the reports' ability to facilitate a consistent approach to diagnostic and prognostic factors (clinicians: r = 0.717, n = 312, P < .001; pathologists: r = 0.638, n = 168, P < .001) (see Table 3).
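As a minimal, hypothetical sketch (Python with SciPy rather than the SPSS analysis actually used), Pearson correlations of this kind could be computed as follows; the column names and values are illustrative assumptions, and pairwise deletion of missing responses would account for differing n values across items.

```python
# Minimal sketch: Pearson correlations between overall satisfaction and other
# survey items. Column names and responses are hypothetical, not the study data.
import pandas as pd
from scipy import stats

df = pd.DataFrame({
    "overall_satisfaction":       [5, 4, 5, 3, 4, 5, 2, 4],
    "completeness_for_decisions": [5, 4, 4, 3, 4, 5, 2, 4],
    "ease_of_finding_info":       [4, 4, 5, 3, 3, 5, 2, 4],
})

for item in ["completeness_for_decisions", "ease_of_finding_info"]:
    pairs = df[["overall_satisfaction", item]].dropna()  # pairwise deletion of missing responses
    r, p = stats.pearsonr(pairs["overall_satisfaction"], pairs[item])
    print(f"{item}: r = {r:.3f}, n = {len(pairs)}, P = {p:.3f}")
```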

Table 3. Correlation Analysis of Overall Satisfaction With the Information Provided in Standardized Synoptic Pathology Reports

Dependent t tests were conducted to compare pathologists' and treating physicians' mean overall satisfaction scores and indicated a statistically significant difference in overall satisfaction with the synoptic reporting process (t(169) = 3.044, P = .003). This result is consistent with the frequency distribution of pathologists' responses, which showed greater variation than that of their treating physician counterparts. Because the implementation of synoptic pathology reporting had a direct impact on pathologists' practice and work flow, it is not surprising that their overall level of satisfaction would differ from that of their colleagues.
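A minimal sketch of a dependent (paired) t test with SciPy is shown below; the scores and the pairing of pathologist and treating physician responses are hypothetical and do not reproduce the study's actual pairing of records.

```python
# Minimal sketch: dependent (paired) t test. Scores and pairing are hypothetical.
import numpy as np
from scipy import stats

pathologist_scores = np.array([4, 3, 5, 2, 4, 5, 3, 4])   # hypothetical overall satisfaction
treating_md_scores = np.array([5, 4, 5, 4, 5, 5, 4, 4])   # hypothetical paired scores

t_stat, p_value = stats.ttest_rel(pathologist_scores, treating_md_scores)
dof = len(pathologist_scores) - 1  # degrees of freedom, reported in the text as t(169)
print(f"t({dof}) = {t_stat:.3f}, P = {p_value:.3f}")
```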

Analysis of variance was conducted to compare mean satisfaction scores for pathologists, surgeons, and medical oncologists based on demographic variables such as practice setting, years of experience, number of pathology reports, and previous method of pathology reporting prior to implementation of synoptic reporting. No statistically significant differences in overall satisfaction scores were found for any of the demographic variables.
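As an illustrative sketch only (one-way ANOVA with SciPy rather than the study's SPSS procedure), a comparison of mean satisfaction scores across a demographic grouping such as practice setting might look like the following; the group labels and scores are assumptions.

```python
# Minimal sketch: one-way ANOVA of overall satisfaction across practice settings.
# Group labels and scores are hypothetical, not the study data.
import pandas as pd
from scipy import stats

df = pd.DataFrame({
    "practice_setting":     ["teaching", "teaching", "community", "community", "regional", "regional"],
    "overall_satisfaction": [5, 4, 4, 3, 5, 4],
})

groups = [g["overall_satisfaction"].to_numpy() for _, g in df.groupby("practice_setting")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.3f}, P = {p_value:.3f}")
```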

Qualitative Analysis

The qualitative data were thematically summarized using conventional content analysis methodology.22  Upon review of the qualitative comments provided, 2 main themes emerged: practice-related and process-related.

Practice-related comments referred to the improvement of the information available (eg, “allows me to find the information I need quickly and efficiently”; “by having a common language, this improves the efficiency of patient management”) and error reduction (eg, “synoptic format reduces the chance for error or forgetting to include a specific parameter that is significant for a cancer case”). Practice-related comments also raised concern about the assumption that completeness equates to accuracy (eg, “the main issue with synoptic reporting is that completeness can mask accuracy”; “synoptic reports are a form of presentation but do not necessarily mean that the data is more accurate”).

Process-related comments centered on the technology and the usability of the reports, given software requirements and restrictions (eg, “less convenient mostly due to software formatting limitations”; “the rigidity of the choices do not necessarily reflect the complexity of the case we examine”).

The adoption of a common electronic reporting standard by pathologists and hospitals in Ontario is unprecedented for a jurisdiction of this size and complexity. The achievement of 92% of hospitals reporting at level 6, together with a 94% completeness rate against the College of American Pathologists cancer checklist standard, is notable when compared with a recent study of College of American Pathologists Q-Probes data, in which the completeness rate across the 86 American institutions included was 68%.1 As the results of this study show, physician satisfaction with standardized pathology reports is high when the information required to support diagnostic and prognostic decision making is readily available, relevant, and timely. Technologic considerations in the design and usability of standardized reports are vital to enabling access to information of importance to clinicians and to minimizing the impact on work flow and workload.

The authors would like to acknowledge Jody Whitfield, research assistant, for her invaluable assistance with the data collection phase of this evaluation study.

1. Idowu M, Bekeris L, Raab S, Ruby S, Nakhleh R. Adequacy of surgical pathology reporting of cancer: a College of American Pathologists Q-Probes study of 86 institutions. Arch Pathol Lab Med. 2010;134(7):969–974.
2. Hammond E, Flinner R. Clinically relevant breast cancer reporting: using process measures to improve anatomic pathology reporting. Arch Pathol Lab Med. 1997;121(11):1171–1175.
3. Urquhart R, Grunfeld E, Porter G. Synoptic reporting and quality of cancer care. Oncol Exch. 2009;8(1):28–31.
4. Harvey A, Zhang H, Nixon J, Brown C. Comparison of data extraction from standardized versus traditional narrative operative reports for database related research and quality control. Surgery. 2007;141(6):708–714.
5. Srigley J, McGowan T, MacClean A, et al. Standardized synoptic cancer pathology reporting: a population based approach. J Surg Oncol. 2009;99(8):517–524.
6. Gill A, Johns A, Eckstein R, et al. Synoptic reporting improves histopathological assessment of pancreatic resection specimens. R Coll Pathol Australas. 2009;41(2):161–167.
7. Karim R, van den Berg K, Colman M, McCarthy S, Thompson J. The advantages of using a synoptic pathology report format for cutaneous melanoma. Histopathology. 2008;52(2):130–138.
8. Cross SS, Feeley KM, Angel CA. The effect of four interventions on the informational content of histopathology reports of resected colorectal carcinomas. J Clin Pathol. 1998;51(6):481–482.
9. Hatzidis WT, Solomon MJ, Schnitzler M, Cartmill J, Loder P, Chapuis P. Does the caseload of the pathologist influence the minimum and extended data set of pathology variables reported in rectal adenocarcinoma? Colorectal Dis. 2000;2(1):26–30.
10. Zarbo R. Interinstitutional assessment of colorectal carcinoma surgical pathology report adequacy: a College of American Pathologists Q-Probes study of practice patterns from 532 laboratories and 15 940 reports. Arch Pathol Lab Med. 1992;116(11):1113–1119.
11. Schmidt RA. Synopses, systems, and synergism. Am J Clin Pathol. 2007;127(6):845–847.
12. Kang HP, Devine LJ, Picolli AL, Seethala RR, Amin W, Parwani AV. Usefulness of a synoptic data tool for reporting of head and neck neoplasms based on the College of American Pathologists cancer checklists. Am J Clin Pathol. 2009;132(4):521–530.
13. Hemmings C, Jeffrey M, Frizelle F. Changes in the pathology reporting of rectal cancer: is it time to adopt synoptic reporting? N Z Med J. 2003;116(1178):1–3.
14. Zarbo R. Determining customer satisfaction in anatomic pathology. Arch Pathol Lab Med. 2006;130(6):645–649.
15. Zarbo R, Nakhleh R, Walsh M. Customer satisfaction in anatomic pathology: a College of American Pathologists Q-Probes study of 3065 physician surveys from 94 laboratories. Arch Pathol Lab Med. 2003;127(1):23–29.
16. Yunker WK, Matthews TW, Dort JC. Making the most of your pathology: standardized histopathology reporting in head and neck cancer. J Otolaryngol Head Neck Surg. 2008;37(1):48–55.
17. Nakhleh R, Souers R, Ruby S. Physician satisfaction with surgical pathology reports: a 2-year College of American Pathologists Q-Tracks study. Arch Pathol Lab Med. 2008;132(11):1719–1722.
18. VanGeest J, Johnson T, Welch V. Methodologies for improving response rates in surveys of physicians: a systematic review. Eval Health Prof. 2007;30(4):303–321.
19. Keating N, Zaslavsky A, Goldstein J, Ayanian J. Randomized trial of $20 versus $50 incentives to increase physician survey responses. Med Care. 2008;46(8):878–881.
20. Tri-Council Policy Statement: Ethical Conduct for Research Involving Humans. Published 2009. Accessed March 31, 2010.
21. Dillman D. Mail and Internet Surveys: The Tailored Design Method. 2nd ed. Hoboken, New Jersey: John Wiley & Sons Inc; 2007.
22. Hsieh H-F, Shannon S. Three approaches to qualitative content analysis. Qual Health Res. 2005;15(9):1277–1288.

Author notes

The authors have no relevant financial interest in the products or companies described in this article.