To the Editor.—We read with interest the latest publication from the College of American Pathologists (CAP) on its Electronic Cancer Checklists (eCC), “Exploring the College of American Pathologists Electronic Cancer Checklists: What they are and what they can do for you.”1 This follows similar previous publications from the CAP.2 Because there are alternatives to the eCC,3,4 what exactly can the eCC do for me?
This article summarizes multiple features of benefit to pathologists creating synoptic reports, which vendors may or may not choose to implement in their versions of the eCC. Vendors are not required to implement any of these features, their incentives for doing so are unclear, and not all choose to do so. Some features may require considerable additional investment, and some features the authors describe are not currently offered as part of the eCC. As a result, this appears to be a list of what the CAP hopes the eCC will someday become rather than what is available today.
Fortunately, there are other ways to implement a synoptic report product than the approach currently used for the eCC. The use of a Web site to create synoptic reports has been extensively tested and validated in the literature,3,4 and virtually all of the “advanced” features the current article describes have already been field validated and used routinely in practice since 2017. From this experience, we have learned several things that are not highlighted in this review. First, although we initially implemented “input validation” for numeric values, we subsequently removed it because pathologists did not want to be limited to numeric responses. Indeed, subsequent experience has shown that the nonnumeric portion of a response can substantially improve the accuracy of the reported result.5,6 In addition, long-term collection of amendment data did not show any significant increase in amendments after this input validation was removed (Renshaw, unpublished data, June 2020). This suggests that the actual benefit of input validation of numeric values may be limited, especially when pathologists believe they can improve the accuracy of their responses with an input that does not fit the validation scheme. It also shows the need for actual field validation, rather than technical software validation alone, to determine the benefits of proposed “improvements.”
Second, a number of other features can benefit pathologists, including custom questions, immunohistochemical support,3 and ancillary testing. Why these are not discussed in the article is unclear. As an example, there was a considerable delay between publication of the most recent International Federation of Gynecology and Obstetrics (FIGO) cervical cancer staging criteria and their incorporation into the CAP cancer checklists. To bridge this gap, we implemented a custom question that allowed our clinicians to use the most up-to-date FIGO staging system without having to wait for the CAP.
We believe the successful implementation of a synoptic report product requires substantial data collection, including not only software validation but also field validation, to ensure that improvements to the product actually work as intended. To achieve the vision the CAP has provided for synoptic reporting and the eCC, more active engagement by the CAP with the eCC vendors is needed to ensure the collection of those data.
The authors have no relevant financial interest in the products or companies described in this article.