Context.—Errors entering orders for send-out laboratory tests into computer systems waste health care resources and can delay patient evaluation and management.
Objectives.—To determine (1) the accuracy of send-out test order entry under “real world” conditions and (2) whether any of several practices are associated with improved order accuracy.
Design.—Representatives from 97 clinical laboratories provided information about the processes they use to send tests to reference facilities and their order entry and specimen routing error rates.
Results.—In aggregate, 98% of send-out tests were correctly ordered and 99.4% of send-out tests were routed to the proper reference laboratory. There was wide variation among laboratories in the rate of send-out test order entry errors. In the bottom fourth of laboratories, more than 5% of send-out tests were ordered incorrectly, while in the top fourth of laboratories fewer than 0.3% of tests were ordered incorrectly. Order entry errors were less frequent when a miscellaneous test code was used than when a specific test code was used (3.9% vs 5.6%; P = .003).
Conclusions.—Computer order entry errors for send-out tests occur approximately twice as frequently as order entry errors for other types of tests. Filing more specific test codes in a referring institution's information system is unlikely to reduce order entry errors and may make error rates worse.
Errors entering laboratory test orders into computer systems waste health care resources and can delay patient evaluation and management.
Previous Q-Probes studies conducted by the College of American Pathologists (CAP) have described error rates for both inpatient and outpatient computer order entry. In a study of inpatients served by 577 clinical laboratories, study participants at the median facility could not identify written requests for 0.7% of tests that had been ordered in the laboratory computer system, and 1.9% of written orders were never entered into the computer.1 In the outpatient environment, order entry accuracy rates were similar; in a study of outpatient orders received by 660 laboratories, participants at the median facility could not identify written requests for 1.0% of tests ordered in the laboratory computer system, while 1.0% of written orders were never entered into the computer.2
Accurate computer order entry of tests that are being referred to a reference facility (“send-out tests”) presents several special challenges above and beyond those encountered with routine test order entry: (1) Send-out tests performed by reference laboratories are typically more expensive than tests performed in primary clinical laboratories, which means that incorrect order entry of send-out tests is generally more wasteful than incorrect order entry of “routine” tests. (2) When a laboratory uses more than one reference facility, incorrect test order entry can cause specimens to be routed to the wrong laboratory, which results in longer delays and more rework than a typical order entry error. (3) In most facilities the menu of tests that are sent out is much larger than the menu of tests performed in-house, increasing the potential for mix-ups. (4) Finally, clinicians may be less familiar with infrequently ordered send-out tests than with tests performed in local laboratories, leading to ambiguity when orders are initially written.
This study was designed to evaluate the fidelity with which physicians' orders for send-out tests are entered into computer systems and whether send-out tests are routed to correct reference laboratories. The study did not address the accuracy with which the results of send-out tests are entered into laboratory information systems (which is often done manually and can also be problematic), the clinical appropriateness of orders for send-out tests, or the clinical interpretation of send-out test results. The specific aims of this study were 2-fold: (1) to describe the accuracy of send-out test order entry under “real world” (field) conditions in a relatively large group of laboratories, and (2) to determine whether any of several order entry practices were associated with improved order accuracy. The order entry practices we examined are defined in “Materials and Methods,” and included the use of specific versus miscellaneous test codes to describe send-out tests, the presence of an interface between the primary and reference laboratory computer systems, creation of a separate send-out section within the laboratory, and other variables believed to improve the fidelity of the order entry process.
MATERIALS AND METHODS
The study was conducted according to the Q-Probes study format previously described, which relies on a convenience sample of clinical laboratories that subscribe to the CAP Q-Probes benchmarking program.3 After refinement of a standardized data collection instrument, CAP Q-Probes subscribers were mailed data collection instructions in late 2006.
Participants retrospectively reviewed send-out test orders until 5 order entry errors were identified or 300 laboratory requisitions had been reviewed. A minimum of 50 send-out test orders were reviewed, and a maximum of 5 send-out test orders per day could be included in the study. At the median facility, 176 send-out test orders were reviewed (range, 50–300 orders). For each send-out test order included in the study, the laboratory requisition and the corresponding report were obtained (see “Definitions,” below). Participants determined whether the test performed and reported by the reference facility matched the send-out test ordered on the requisition, whether the test was sent to the correct reference laboratory, and whether a miscellaneous or specific test code was used to describe the test in the referring laboratory's information system. No particular type of send-out test was excluded from the study.
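The review stopping rule described above can be sketched in code. This is a hypothetical illustration only (the study's actual review was a manual chart audit, and the function name and data layout here are invented), under one plausible reading in which the 50-order minimum applies before either stopping criterion takes effect:

```python
# Hypothetical sketch of the per-institution review stopping rule:
# review send-out orders until 5 order entry errors are identified or
# 300 requisitions have been reviewed, counting at most 5 orders per
# day and reviewing at least 50 orders overall.
def review_send_out_orders(daily_orders):
    """daily_orders: list of per-day lists of booleans (True = order
    entry error), in chronological order. Returns (reviewed, errors)."""
    reviewed = errors = 0
    for day in daily_orders:
        for is_error in day[:5]:  # at most 5 orders counted per day
            reviewed += 1
            errors += is_error
            # Stop once the 50-order minimum is met and either
            # stopping criterion (5 errors or 300 orders) is reached.
            if reviewed >= 50 and (errors >= 5 or reviewed >= 300):
                return reviewed, errors
    return reviewed, errors
```

With error-free orders the review runs to the 300-requisition cap; with frequent errors it stops as soon as both the 50-order minimum and the 5-error threshold are satisfied.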
Participants were queried about several institutional characteristics: occupied bed size, teaching status, pathology resident training status, government affiliation, institution location, institution type, CAP inspection status, and inspection status by the Joint Commission on Accreditation of Healthcare Organizations. They were also queried about practices used to handle send-out tests, as described below.
All but 1 (99.0%) of the 97 participating institutions were located in the United States, with the remaining institution located in Australia. Approximately 34% of participating institutions were teaching hospitals, and approximately 19% had a pathology residency program. Within the past 2 years, the CAP inspected more than 85% of the laboratories. Tables 1 and 2 list characteristics of participating institutions.
Definitions
To ensure uniformity of responses, several definitions were adopted:
Correct Send-Out Laboratory
The correct send-out laboratory was the facility, capable of performing the send-out test, that the referring laboratory or the ordering physician had designated as the reference laboratory for the particular test that had been ordered. If ordering physicians were not permitted to specify where a test should be directed, the correct send-out laboratory was the facility to which the referring laboratory normally sent specimens for the test in question. If all send-out tests were directed to a core laboratory and then, in turn, sent on to an external reference laboratory, the correct send-out laboratory represented the testing facility at which the specimen was ultimately to be tested, not the core laboratory.
Laboratory Requisition
A piece of paper sent to the laboratory that was used to specify test orders. The laboratory requisition could contain a preprinted “test menu,” be a copy of a handwritten order sheet from the medical record, or be a printout from a computer that was not interfaced with the laboratory computer, such as a printout from a physician's office electronic medical record.
Miscellaneous Test Code
A single test code filed in a laboratory or hospital information system that was used to order all tests for which specific test codes have not been built (see “Specific Test Code”). Typically, low-frequency tests are all ordered using the same miscellaneous or generic test code. The name of the desired test is then entered into a free text field in the laboratory or hospital computer or handwritten on a piece of paper that accompanies the electronic order for the miscellaneous test code.
Order Sheet
The page or pages in a patient's medical record (or a single record in an electronic medical record) where a physician writes orders for laboratory tests.
Ordering Physician
Any individual recognized as having test ordering privileges. This typically included licensed doctors and dentists and individuals acting under a physician's direct supervision (eg, nurse practitioners and physician's assistants).
Send-Out Laboratory
Any laboratory that received send-out tests from a participating study laboratory and that did not use the same computer system as the referring laboratory. Send-out laboratories are also called reference laboratories. The computer systems of the send-out laboratory and referring laboratory may have been interfaced, but they must have been separate systems that were run on separate hardware and were maintained by a separate team of individuals (eg, the 2 computer systems could not consist of interfaced central processing units that were housed in the same computer room and maintained by a common set of computer operations staff).
Send-Out Test
Any test sent to another laboratory licensed under the Clinical Laboratory Improvement Amendments of 1988 that was not using the same computer system as the referring laboratory. This might be a third-party reference laboratory with which the participating institution had a contractual relationship or a core laboratory owned by the same health care system as the participant (but which operated on a different computer system from the referring laboratory).
Specific Test Code
A test code filed in a laboratory or hospital information system that described a specific test, such as “sodium” or “IgE.” Two different tests never share the same specific test code. A separate specific test code is filed for all commonly ordered tests.
Statistical Analysis
Prior to performing statistical tests for association, values were screened for outliers. Staff at several participating institutions did not answer all of the questions on the questionnaire about demographic characteristics or institutional practices. These institutions were excluded only from tabulations and analyses that required the missing data elements. All statistical analyses were performed using SAS v9.1 (SAS Institute Inc, Cary, NC).
The key quality variables—the rate of accurate send-out test orders and the rate of send-out tests routed to the proper reference laboratory—were tested for associations with the institutions' demographic and practice variable information in Tables 1 and 2 using the nonparametric Kruskal-Wallis test; P < .05 was considered statistically significant.
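The study itself used SAS; as an illustration only, the same nonparametric comparison can be sketched with `scipy.stats.kruskal`. The per-institution error rates and the grouping variable below are hypothetical, not the study's data:

```python
# Illustrative Kruskal-Wallis comparison of per-institution send-out
# order entry error rates (%) across levels of a practice variable
# (here, a hypothetical "dedicated send-out area" grouping).
from scipy.stats import kruskal

dedicated_area = [0.0, 0.3, 1.2, 1.7, 2.5, 4.0]      # hypothetical rates
no_dedicated_area = [0.2, 0.9, 1.7, 3.1, 5.6, 10.0]  # hypothetical rates

statistic, p_value = kruskal(dedicated_area, no_dedicated_area)
print(f"H = {statistic:.3f}, P = {p_value:.3f}")

# The study treated P < .05 as statistically significant.
significant = p_value < 0.05
```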
RESULTS
Participants from 97 institutions reviewed a total of 17 904 send-out tests. Of these, 17 546 (98.0%) send-out test requests were correctly entered into the laboratory information system. At 95 institutions, specimens were routed to the proper send-out laboratory 99.4% of the time (17 292 of 17 404 reviewed send-out tests were correctly routed). Information about specimen routing was not available for 2 study institutions.
The distributions of the 2 quality indicators are listed in Table 3. At the median institution, 1.7% of send-out test requests were incorrectly ordered. At one fourth of institutions at least 5.6% of send-out test requests were incorrectly ordered, and at one tenth of institutions at least 10% of tests were incorrectly ordered. Almost all specimens were routed to the correct send-out laboratory. Participants at the median institution reported no instances of specimen misrouting.
We hypothesized that the use of miscellaneous test codes by the referring laboratory would be associated with higher order entry error rates. Somewhat surprisingly, the fraction of order entry errors tended to be lower when a miscellaneous test code was used than when a specific test code was used (Table 4, 3.9% vs 5.6%; P = .003 for all institutions combined).
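The article does not report the raw counts underlying this comparison. Using hypothetical counts chosen only to match the reported 3.9% and 5.6% error rates, the comparison takes the form of a chi-square test on a 2×2 table:

```python
# Hypothetical counts (NOT the study's actual data), chosen to match
# the reported error rates of 3.9% (miscellaneous code) and ~5.6%
# (specific code); compared with a chi-square test of a 2x2 table.
from scipy.stats import chi2_contingency

misc_errors, misc_total = 351, 9_000          # 351/9000  = 3.9%
specific_errors, specific_total = 498, 8_900  # 498/8900 ~ 5.6%

table = [
    [misc_errors, misc_total - misc_errors],
    [specific_errors, specific_total - specific_errors],
]
chi2, p_value, dof, _expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, df = {dof}, P = {p_value:.2g}")
```

With counts of this magnitude, a rate difference of this size is highly significant, consistent in direction with the P = .003 reported in the text.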
When the demographic and practice variables in Tables 1 and 2 were tested for association with the rate of accurately ordered and accurately referred send-out tests, no statistically significant associations were found. Specifically, we could detect no relationship between ordering accuracy and institutional demographic variables, send-out test volume or percentage, use of a core laboratory, reference laboratory interfaces, use of a dedicated send-out area, the practice of allowing physicians to specify the reference laboratory where testing was to occur, the presence of specific training in the send-out area, monitoring of order entry rates, distribution of a test catalogue that included send-out tests, or use of ordering mnemonics to speed the order entry process.
COMMENT
In this study of computer order entry of send-out tests at 97 laboratories, 98% of send-out tests were correctly ordered. The order entry error rate observed in this study was approximately twice the order entry error rate reported for routine inpatient and outpatient tests in previous investigations.1,2 This difference did not surprise us, given the hundreds of different send-out tests that can be ordered and the relative infrequency with which many individual send-out assays are requested.
There was wide variation among laboratories in the rate of send-out test order entry errors. Order entry staff at the bottom fourth of laboratories made order entry errors on more than 5% of send-out tests, while in the top fourth of laboratories order entry errors occurred with fewer than 0.3% of tests—a 10-fold difference. Given the high average cost of send-out tests and the low reimbursement from many payors, the financial implications of this difference are not trivial.4 This study was not designed to assess the clinical implications or patient inconvenience associated with delays in obtaining the results of the send-out tests that clinicians had originally requested.
We examined a number of practices that had the potential to reduce send-out order entry errors but could detect no impact of any of the practices on order entry error rates. Specifically, we observed no reduction in order entry errors when all send-out tests were released from a core laboratory or a specialized send-out area, send-out staff were dedicated to the send-out task or received special training, reference laboratory computers were electronically interfaced with the computer systems of the referring laboratory, send-out order accuracy was regularly monitored, requisitions included ordering mnemonics for common send-out tests, or the laboratory provided a test catalogue/directory that listed send-out tests. Our study was not powered to detect a small impact that one of these practices might have had on ordering accuracy, but large differences would most likely have been detected if they were present.
We had hypothesized that the use of specific test codes would decrease the frequency of order entry errors, compared with the use of a “miscellaneous” (generic) test code in which the name of the requested test is added as “free text.” Yet this did not appear to be the case. Sites that filed more specific test codes in their laboratory computer system for send-out tests did not have lower error rates than sites that relied more heavily on a miscellaneous test code. In fact, in a test-centered analysis, requests that had been ordered using a miscellaneous test code were more likely to be correctly performed (96.1%) than examinations ordered using a specific test code (94.4%). We cannot explain this finding, which runs counter to what we believe is accepted wisdom in the laboratory industry. We speculate that when specific test codes are used, “front-end” (first line) laboratory order entry staff are forced to interpret ambiguous written orders and select the appropriate specific send-out test code, whereas when miscellaneous codes are used, staff in reference laboratories select the specific test to ultimately order. Individuals who work in reference laboratories are more familiar with esoteric test menus and may make fewer order entry errors. Finn et al5 reported that experienced laboratory staff may improve upon physicians' initial orders; in the case of thyroid testing, staff tended to modify orders to make them more clinically appropriate. This observation may be pertinent to send-out tests, although Wong and Nelson6 pointed out that the names of the various thyroid function tests confuse many clinicians and that studies of thyroid test orders may not be applicable to other tests. Clearly, more research is needed to settle this question.
In aggregate, 99.4% of send-out tests were routed to the proper reference laboratory. Staff at most laboratories routed all of their send-out tests to the proper facility, even though the median laboratory used 5 different reference laboratories. Laboratories at which physicians were allowed to specify a send-out laboratory for their testing did not have higher rates of misrouting than other laboratories. Our data suggest that misrouting of send-out specimens is not a significant problem in most laboratories.
Several limitations of this study should be acknowledged, many of which are also applicable to other Q-Probes studies: (1) First, data from study participants were self-reported and we could not independently validate all of the data submitted. (2) Second, in the outpatient/outreach arena in the United States, approximately 30% of testing is performed by commercial laboratories that were not represented in this study. (3) Third, in most study institutions fewer than 200 send-out test orders were examined, and the precision with which an institution's individual error rate was ascertained was limited by this sample size. With a sample size of 10 errors in 200 requisitions, the 95% confidence interval on the order entry error rate is between 2% and 8%. While aggregate order entry rates reported in this article are likely to be accurate, because of the large total number of orders included in the study, estimates of order entry error rates for individual institutions were subject to larger sampling error. (4) Finally, the 97 laboratories in this study may not be representative of hospital-based laboratories. A decision by laboratory management to participate in this study may have reflected increased concern about local order entry accuracy or an unusually strong commitment to ensuring the fidelity of preanalytic processing.
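The sampling-error figure quoted above can be reproduced with a normal approximation to the binomial confidence interval for 10 errors in 200 requisitions:

```python
# Normal approximation to the 95% binomial confidence interval for an
# observed error rate of 10 errors in 200 requisitions, as cited in
# the limitations discussion.
import math

errors, n = 10, 200
p = errors / n                      # observed error rate: 0.05
se = math.sqrt(p * (1 - p) / n)     # standard error of the proportion
lower, upper = p - 1.96 * se, p + 1.96 * se

print(f"95% CI: {lower:.1%} to {upper:.1%}")  # roughly 2.0% to 8.0%
```

This matches the "between 2% and 8%" interval stated in the text; an exact (Clopper-Pearson) interval would be slightly wider and asymmetric.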
It is difficult to make specific evidence-based recommendations to laboratory managers interested in improving the accuracy of send-out test ordering because of the limitations listed above and our inability to identify practice variables statistically associated with lower order entry error rates. Laboratories at which send-out test order entry error rates are greater than 3% perform worse than the median facility. Management at these institutions should consider double-checking all orders for send-out tests since this practice has been associated with reduced order entry errors in other studies.1,2 Our data also suggest that caution be exercised before assuming that the mere filing of more specific test codes will reduce order entry errors, because doing so may force ill-equipped front-end order entry staff to select the right examination from a lengthy list. Use of a miscellaneous test code (plus a free text test description) allows this decision to be made by more experienced order entry staff located at reference facilities. This conclusion has important practical ramifications and deserves further study. Finally, it should not be assumed that the introduction of direct physician order entry will reduce the frequency of test order entry errors for send-out tests. Introduction of direct physician order entry has increased medication errors in some studies.7 If physicians are less adept than referring laboratory or reference laboratory order entry staff with computer order entry of esoteric send-out tests, direct physician order entry may increase order entry error rates.
The authors have no relevant financial interest in the products or companies described in this article.
Reprints: Paul N. Valenstein, MD, Department of Pathology, St Joseph Mercy Hospital, 5301 E Huron River Dr, Ann Arbor, MI 48106-0995 (firstname.lastname@example.org)