Context.—Gynecologic cytopathology is a heavily regulated field, with the Clinical Laboratory Improvement Amendments of 1988 mandating the collection of many quality metrics. There is a lack of consensus regarding methods to collect, monitor, and benchmark these data and how these data should be used in a quality assurance program. Furthermore, the introduction of human papillomavirus testing and proficiency testing has provided more data to monitor.
Objective.—To determine good laboratory practices in quality assurance of gynecologic cytopathology.
Data Sources.—Data were collected through a written survey consisting of 98 questions submitted to 1245 Clinical Laboratory Improvement Amendments–licensed or Department of Defense laboratories. There were 541 usable responses. Additional input was sought through a Web posting of results and questions on the College of American Pathologists Web site. Four senior authors, who wrote the survey, and 28 cytopathologists and cytotechnologists were assigned to 5 working groups to analyze the data and present statements on good laboratory practices in gynecologic cytopathology at the College of American Pathologists Gynecologic Cytopathology Quality Consensus Conference. Ninety-eight conference attendees discussed and voted on the good laboratory practice statements to obtain consensus.
Conclusions.—This paper describes the rationale, background, process, and strengths and limitations of a series of papers that summarize good laboratory practice statements in quality assurance in gynecologic cytopathology.
The Clinical Laboratory Improvement Amendments of 1988 regulations mandated that a variety of quality metrics be collected within gynecologic cytology laboratories.1 Twenty years after publication of the final rule, there is still no agreement on which quality metrics can or should be used to identify deviations in quality, whether and which benchmarks should be used to track quality metrics, and what actions should be taken once a problem is identified. Since the implementation of the Clinical Laboratory Improvement Amendments of 1988, cytology practice has changed dramatically. Proficiency testing of individuals has been enacted to measure the ability of laboratory professionals to correctly diagnose Papanicolaou (Pap) tests.2 The replacement of conventional Pap tests with liquid-based specimens has facilitated testing for high-risk human papillomavirus infection, which in recent years has become widely accepted as a means to triage atypical cytopathology cases.3–14 Both proficiency testing and human papillomavirus testing potentially bring many more data points to a cytopathology quality assurance (QA) program. However, it is not known how these metrics are being used in cytopathology as a part of a laboratory's overall QA program.
The College of American Pathologists (CAP) was awarded a cooperative agreement by the Centers for Disease Control and Prevention to develop an inventory of current practices in gynecologic cytology laboratories to attempt to standardize procedures for quality improvement. This was done through a multistep process to determine consensus practices in QA for gynecologic cytology. The goal was to identify what quality metrics are collected, how metrics are analyzed, what benchmarks are used to determine variance in performance, and what actions are taken to address performance issues.
MATERIALS AND METHODS
In brief, the steps consisted of (1) the development of a survey of QA practices sent to all cytology laboratories in the United States that participate in gynecologic cytopathology proficiency testing; (2) the formation of working groups to analyze responses; (3) the posting of additional questions raised by each work group's analysis of the data on the CAP Web site, which was open to comments from the cytopathology community; and (4) convening a consensus conference, also open to the cytopathology community, to determine consensus practice patterns.
SURVEY
The authors, 3 of whom are cytopathologists with expertise in cytology QA programs (J.A.T., M.R.H., B.A.J.), developed the survey. The survey was 29 pages long and consisted of 99 questions. All responses were delinked or deidentified and the data were analyzed in aggregate to maintain anonymity. In addition to general laboratory demographics and information, the survey queried laboratories about QA practices and policies across 9 broad categories: (1) monitoring of diagnostic rates; (2) rescreen of cases initially screened as negative for intraepithelial lesion or malignancy prior to reporting; (3) monitoring of retrospective look backs of negative for intraepithelial lesion or malignancy Pap tests prompted by a current Pap test diagnosed as high-grade squamous intraepithelial lesion or higher; (4) proficiency testing; (5) monitoring of Pap test and cervical biopsy correlation; (6) monitoring concurrence of cytotechnologist and pathologist diagnoses at time of initial sign-out by pathologist; (7) monitoring human papillomavirus rates; (8) turnaround time; and (9) general quality.
The survey was piloted to 10 laboratories. The results of this pilot study were presented in aggregate form at the 58th annual scientific meeting of the American Society of Cytopathology, November 12–16, 2010. Stakeholder organizations were provided an opportunity to comment on the survey prior to finalizing it. These organizations included the CAP Cytopathology Resource Committee, the American Society for Cytopathology, the American Society for Cytotechnology, and the American Society of Clinical Pathology, as well as Centers for Disease Control and Prevention colleagues from the Division of Cancer Prevention and Control and the Laboratory Science, Policy and Practice Program Office. The survey was submitted to 1245 laboratories. Of these laboratories, 1201 were identified from a Centers for Disease Control and Prevention list of Clinical Laboratory Improvement Amendments–registered cytopathology laboratories, and 44 were identified from enrollment in the CAP gynecologic proficiency testing program, as they were Department of Defense–affiliated laboratories. Of the 596 laboratories that responded, data could be used from only 541, as 55 laboratories had incomplete survey responses. This represents a usable-data response rate of 43%. The largest proportion (44.1%) of the 541 responding laboratories comprised voluntary, nonprofit, hospital-based laboratories (Table 1). The median number of cytotechnologists per laboratory was 3, and the median number of pathologists per laboratory was 4 (Table 2). The median number of Pap tests per laboratory was 10 904, with a range of 2 to 1 306 014. ThinPrep (Hologic, Inc, Bedford, Massachusetts) was the most common liquid-based Pap test (416 respondents), and 340 respondents interpreted conventional Pap tests (Table 3). Computer-assisted screening was used by 175 responders, 174 of which provided workload volumes. For computer-assisted screening, the median screening rate was 93.1% and the median volume was 31 857 (Table 4). The most common imaging system was the ThinPrep Imaging System.
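The usable-data response rate quoted above follows from the reported counts; as a minimal arithmetic sketch (all figures are taken directly from the text, and the variable names are ours):

```python
# Response-rate arithmetic for the QA practices survey.
surveyed = 1201 + 44   # CLIA-registered plus Department of Defense laboratories
responded = 596        # laboratories returning any response
usable = responded - 55  # 55 responses were incomplete

usable_rate = usable / surveyed
print(f"{usable:d} usable responses -> {usable_rate:.0%}")  # 541 usable responses -> 43%
```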
WORKING GROUPS
Five working groups, consisting of cytopathologists and cytotechnologists, were organized. Members came from private practice, the Veterans Administration, and university practices (Table 5). The working groups were charged with analyzing the data, composing additional questions for a Web site posting, and integrating the information into draft consensus good laboratory practice statements for gynecologic cytology to be presented and voted upon at the CAP Gynecologic Cytopathology Quality Consensus Conference (GCQC2). Each working group attributed the rationale behind each statement to the survey, Web site feedback, literature review, professional opinion, or a combination of these, and this rationale was clearly presented for every statement at the GCQC2. The survey topics were divided among the 5 working groups (Table 6). Working group 1 was charged with interpretive rates, concurrence of cytotechnologist and pathologist diagnoses at the time of initial sign-out by the pathologist, and turnaround time. Working group 2 covered rescreen of negative for intraepithelial lesion or malignancy Pap tests prior to reporting and retrospective look backs of negative for intraepithelial lesion or malignancy Pap tests prompted by a diagnosis of high-grade squamous intraepithelial lesion or greater. Working group 3 was responsible for proficiency testing, general quality, and workload. Working group 4 was charged with cytologic-histologic correlations, and working group 5 with human papillomavirus rates and testing.
WEB SITE
The CAP Web site was used to post information and data for the gynecologic collaborative project, and the Web site was open to all members of the pathology community. Results of greatest interest, and data that required further clarification through new questions, were posted on the Web site. This process was used to draw out trends or controversies noted by the working groups in their analyses. This was done mainly through questions with fixed response choices, but many working groups also asked open-ended questions. The ability to solicit open-ended comments complemented the formal written survey, in which many responses were codified to make data analysis possible. Because of limitations of the Web site software, comments were not interactive, but results were summarized and subsequently posted on the Web site for other participants to comment on. New questions, some in response to comments, and additional data were presented during an 8-week period. All responses and comments were anonymous. Also because of the limitations of the Web site software, there was nothing to prevent a participant from responding multiple times to the same questions.
GYNECOLOGIC CYTOPATHOLOGY QUALITY CONSENSUS CONFERENCE
The GCQC2 was held on June 4, 2011, in Rosemont, Illinois, and was open to the public and all members of the cytopathology community. The GCQC2 was attended by 98 individuals (Table 5). All 5 working groups presented summary findings and proposed consensus good laboratory practice statements to be voted on by attendees. The conference was interactive, with the audience providing opinions, facts, and comments on quality practices during each presentation. Interactive voting was accomplished through use of an Audience Response System: each voting attendee was given a Turning Point ResponseCard (Turning Technologies, Youngstown, Ohio) keypad, and the system tallied the votes and generated charts of the responses. Results of each vote were presented in real time to the attendees. Voting was recorded, and consensus was considered achieved when greater than 50% agreement was obtained among the attendees. The interactive style of presentation, comments, and voting resulted in the writing of additional voting statements on new issues raised at the GCQC2, or the rewording of statements when consensus was not obtained on a statement formulated by the responsible working group. After the initial working group presentations were complete, a final round of voting was conducted on the new and reworded statements. After the consensus conference, the distribution of responses to the proposed good laboratory practice statements was analyzed, and a more formal ranking of the degree of consensus was devised based on the percentage of agreement obtained: 99% to 100% was considered nearly complete agreement; 90% to 98%, strong agreement; 80% to 89%, moderately strong agreement; and 70% to 79%, agreement.
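The post hoc consensus ranking amounts to a simple banding of the agreement percentage. The sketch below illustrates the bands as stated; the function name is ours, and the label for the 51% to 69% band (above the >50% consensus threshold but below the named ranks) is an assumption, since the text does not name it:

```python
def consensus_rank(agreement_pct: float) -> str:
    """Map a voting agreement percentage to the degree-of-consensus rank
    devised after the conference. Bands follow the text; the label for
    51%-69% is an assumption (consensus reached, but below named ranks)."""
    if agreement_pct >= 99:
        return "nearly complete agreement"
    if agreement_pct >= 90:
        return "strong agreement"
    if agreement_pct >= 80:
        return "moderately strong agreement"
    if agreement_pct >= 70:
        return "agreement"
    if agreement_pct > 50:
        return "consensus (below named ranks)"   # assumed label
    return "no consensus"
```

For example, a statement receiving 85% agreement would be reported as moderately strong agreement, while one receiving 50% or less would fail to reach consensus.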
WORKLOAD
No questions were asked concerning cytotechnologist workload requirements during the survey or on the Web site. Preliminary recommendations from the American Society of Cytopathology Productivity and Quality Assurance in the Era of Automated Screening Task Force were presented and voted on at the consensus conference. Its recommendations are presented separately.
COMMENT
There are several strengths and limitations inherent in the process described herein. The strengths of this process include the sheer number of responding laboratories: 541 responders provide a comprehensive inventory of QA practices currently used in gynecologic cytopathology in the United States. Many of the consensus good laboratory practice statements developed by the working groups are directly derived from these data and were supported by the literature when available. Furthermore, the consensus good laboratory practice statements were vetted at the consensus conference.
At the conference and within the working groups, there was a great deal of sensitivity to the potential impact of these consensus good laboratory practice statements on the resources of cytopathology laboratories and to possible regulations arising from the statements. These concerns may have led to the watering down of many of the statements and may reduce their future adoption by laboratories.
Reflecting the strengths and limitations of this process, there was an evolution in the development of the nomenclature regarding the term consensus good laboratory practice statements. The term represents a compromise between the term consensus guidelines at one extreme and the term consensus opinions at the other. The former is possibly an overreach given the survey data, which by the nature of the survey were not strictly evidence based, whereas the latter diminishes the massive collection of data.
Although the reasoning behind each of the consensus good laboratory practice statements was presented at the consensus conference, many statements lacked clear-cut support in the literature. Where literature data were available, they were not graded for strength of evidence. Another potential concern was that many consensus conference attendees who were not members of a particular working group may not have been as familiar with the data as the presenters, potentially resulting in less well-informed votes.
Another potential problem with the analysis was the issue of laboratories with relatively small volumes of Pap tests (fewer than 500 annually). It was unclear how the consensus good laboratory practices should be put into practice by these laboratories. Many of the suggestions offered in this series of papers for such laboratories are based on educated guesses, and they represent an area ripe for future inquiry.
Notwithstanding these issues, we believe that the good laboratory practices presented in the following series of papers represent a resource for laboratories to use to formally establish a QA program or to fine-tune their QA program in gynecologic cytopathology.
References
Author notes
This report was supported in part by a cooperative agreement (GS-10F-0261K) funded by the Centers for Disease Control and Prevention/Agency for Toxic Substances and Disease Registry.
The authors have no relevant financial interest in the products or companies described in this article.