In the United States, review of digital whole slide images (WSIs) using specific systems is approved for primary diagnosis but has not been implemented for intraoperative consultation.


To evaluate the safety of review of WSIs and compare the efficiency of review of WSIs and glass slides (GSs) for intraoperative consultation.


Ninety-one cases previously submitted for frozen section evaluation were randomly selected from 8 anatomic pathology subspecialties. GSs from these cases were scanned on a Leica Aperio AT2 scanner at ×20 magnification (0.25 μm/pixel). The slides were deidentified, and a short relevant clinical history was provided for each slide. Nine board-certified general pathologists who do not routinely establish primary diagnoses using WSIs reviewed the WSIs using Leica Aperio ImageScope viewing software. After a washout period of 2 to 3 weeks, the pathologists reviewed the corresponding GSs using a light microscope (Olympus BX43). The pathologists recorded each diagnosis and the time taken to reach it. Intraobserver concordance, time to diagnosis, and sensitivity and specificity compared with the original diagnoses were evaluated.


The rate of intraobserver concordance between GS results and WSI results was 93.7%. Mean time to diagnosis was 1.25 minutes for GSs and 1.76 minutes for WSIs (P < .001). Specificity was 91% for GSs and 90% for WSIs; sensitivity was 92% for both GSs and WSIs.


Time to diagnosis was longer with WSIs than with GSs, and additional time is required to scan GSs and upload the images to whole slide imaging systems. Nevertheless, review of WSIs appears to be a safe alternative to review of GSs. Use of WSIs allows reporting from a remote site during a public health emergency such as the COVID-19 pandemic and facilitates subspecialty histopathology services.


Author notes

Aung contributed equally to this manuscript.

Alqaidy is currently located in the Department of Pathology at King Abdulaziz University in Jeddah, Saudi Arabia.

Aung is supported by funding in computational oncology by the Joint Center in Computational Oncology, led by the Oden Institute for Computational Engineering and Sciences, MD Anderson Cancer Center, and the Texas Advanced Computing Center.

Competing Interests

The authors have no relevant financial interest in the products or companies described in this article.

Portions of the findings of this study were presented as a poster at the Annual Meeting of the College of American Pathologists; October 10, 2022; New Orleans, Louisiana.