Context: Delta checks serve as a patient-based quality control tool to detect testing problems.
Objective: To evaluate delta check practices and outcomes.
Design: Q-Probes participants provided information about delta check policies and procedures. Information about investigations, problems, and corrective actions was prospectively collected for up to 100 testing episodes involving delta check alerts.
Results: Among 4505 testing episodes involving 6541 delta check alerts, the median frequencies of actions taken among 49 laboratories were clinical review, 38.0%; retest (25.0%) or recheck (20.2%) the current specimen; nothing, 15.4%; analytical check, 5.0%; other, 2%; and retest or check the previous specimen, 0%. Rates of any action taken by analyte ranged from 84 of 179 (46.9%) for glucose to 748 of 868 (86.2%) for hemoglobin and potassium. Among 4505 testing episodes, nontesting problems included physiologic causes (1472; 32.7%), treatment causes (1318; 19.2%), and transfusion causes (846; 9.9%). Testing problems included interference (77; 1.7%), contamination (62; 1.4%), clotting (51; 1.1%), other (27; 0.6%), mislabeling (12; 0.3%), and analytical errors (5; 0.1%). Testing problems by analyte ranged from 13 of 457 (2.8%) for blood urea nitrogen to 12 of 46 (26.1%) for mean corpuscular hemoglobin concentration. Using more delta check analytes was associated with detecting more testing problems (P = .04). More delta check alerts per testing episode were associated with more actions taken (P = .001) and more problems identified (P < .001). The most common outcome among 4500 testing episodes was reporting results without modifications or comments, in 2512 (55.8%); results were not reported in 136 (3.0%).
Conclusions: Actions taken in response to delta check alerts varied widely, and most testing problems detected were preanalytical. Using a higher number of different analytes and evaluating previous specimens may improve delta check practices.
The delta check is a measurement of the difference between a patient's sequential test results. A larger-than-expected interval change may indicate a testing problem associated with either the former or the current specimen and prompt an investigation before results are reported. Delta checks are widely used in clinical laboratories as a patient-based quality assessment tool to detect errors associated with specimen collection, analysis, or reporting, and they provide a safety net for identifying testing errors that might otherwise go unnoticed.1,2 Delta checks are also an important component of autoverification procedures that improve laboratory efficiency.3 However, delta checks have some limitations. A narrow group of analytes with low physiological variation is typically selected to reduce the proportion of false alerts. Delta checks are also primarily restricted to tests that tend to be repeated frequently during relatively short periods of time. As a result, delta checks are principally confined to hospital or other settings (eg, dialysis) in which patients are repeatedly tested.
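To make the mechanics concrete, the following is a minimal sketch of a delta check in Python. The analyte, thresholds, and time window are illustrative assumptions, not values drawn from this study.

```python
from datetime import datetime, timedelta

# Illustrative parameters (hypothetical, not study data): alert when a
# potassium result changes by more than 1.5 mmol/L from the previous result,
# provided the previous result is no more than 3 days old.
MAX_INTERVAL = timedelta(days=3)
ABSOLUTE_LIMIT = 1.5  # mmol/L

def delta_check(prev_value, prev_time, curr_value, curr_time):
    """Return True if the interval change exceeds the delta check limit."""
    if curr_time - prev_time > MAX_INTERVAL:
        return False  # previous result too old; no delta check performed
    return abs(curr_value - prev_value) > ABSOLUTE_LIMIT

# Example: a drop from 4.8 to 2.9 mmol/L within 24 hours triggers an alert.
alert = delta_check(4.8, datetime(2014, 3, 1, 8), 2.9, datetime(2014, 3, 2, 8))
print(alert)  # True -> hold the result and investigate before reporting
```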
Investigation of potential testing errors triggered by a delta check alert requires extra work for laboratory staff and may delay reporting results.4 Because a large proportion of delta check alerts are not associated with any identified error,1,2,5 it is important to select analytes and set parameters that are as effective as possible for detecting testing problems. Although several studies have investigated the use of delta checks for identifying mislabeled specimens,1,4,6 information is lacking about routine delta check practices such as the types of tests commonly used, calculation methods, and sensitivity for detecting problems other than labeling errors. This Q-Probes study provides new information for understanding delta check practices, the effectiveness of delta check alerts for detecting problems, and the effects of delta check alerts on testing outcomes among a diverse group of clinical laboratories.
METHODS
This observational study was conducted in 2014 according to the Q-Probes format as previously described, which involves laboratories that subscribe to and participate in the program.7 Instructions and data collection materials were provided to participants, and completed data were returned to the College of American Pathologists by a specified time. After data analysis, subscribers received an individual detailed report about their performance on quality indicators benchmarked against other participants. The range of performance among participants, a summary analysis of significant associations with demographics and practice variables, and a critique report were sent to each participating institution.
Participants were instructed to prospectively collect delta check alerts for up to 60 days or 100 testing episodes, whichever occurred first. Delta checks on results from point-of-care tests or from body fluid specimens were excluded. Delta checks used for purposes other than quality control or troubleshooting, such as reflex testing or duplicate test cancellations, were also excluded. Assessment of discordant results within the same specimen (eg, discrepant hemoglobin/hematocrit or blood urea nitrogen/creatinine values) that triggered troubleshooting or corrective actions was not included in this study.
Participants were asked to provide information about analytes that were being used by their laboratories for delta checks. This information included the maximum time interval used for calculations, the type of calculation (absolute, percentage, or rate of change), the amount of change needed to trigger an alert, and whether all or a limited range of results were used for calculations. Participants used a worksheet (Table 1) to record information about specific investigations, problems identified, and corrective measures taken in response to consecutive testing episodes that involved one or more delta check alerts.
Worksheet Categories for Actions, Problems, and Outcomes Used by Participants for Collecting Information About Delta Check Alerts

Abbreviations: IV, intravenous; LIS, laboratory information system.
Participants provided general information about delta check practices by completing a questionnaire. This included methods for developing delta check parameters, role of the laboratory director, use of delta checks in the laboratory's system for quality assurance, frequency of reviewing delta check criteria, general procedures for handling delta check alerts, use in autoverification processes, and overall opinion about the value of delta checks. If a question was left unanswered, the participant was excluded from the analysis for that question. Finally, participants provided institutional demographic information that included occupied bed size, government affiliation, location, and type of institution.
Data were tabulated and analyzed by the biostatistics department of the College of American Pathologists. Statistical analyses were performed to determine factors significantly associated with demographics, delta check actions, testing problems, and outcomes. Associations were analyzed using Kruskal-Wallis tests for discrete-valued independent variables and regression analysis for the continuous independent variable. Variables with significant associations (P < .1) were then included in a forward-selection multivariate regression model that also controlled for the number of analytes with delta checks. A significance level of P < .05 was used for this final model. Action-specific rates were also tested for association with use of a checklist and having written criteria for handling delta checks using the Wilcoxon rank sum test. Analysis of actions taken and problems identified per number of delta checks involved with a single testing episode was performed by Mantel-Haenszel χ2 test. All analyses were performed with SAS 9.2 (SAS Institute, Cary, North Carolina).
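The study's analyses were performed in SAS, but the same tests are available in open-source tools. The sketch below, using fabricated placeholder numbers, shows how the named tests map onto SciPy and statsmodels; the ordinal (linear-by-linear) association test is used here as an analogue of the Mantel-Haenszel χ2 trend test.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.contingency_tables import Table

# Fabricated placeholder data, for illustration only.

# Kruskal-Wallis test: compare action rates across groups of laboratories.
group_a = [0.78, 0.85, 0.90, 0.66]
group_b = [0.41, 0.55, 0.60, 0.72]
group_c = [0.88, 0.92, 0.79, 0.95]
print(stats.kruskal(group_a, group_b, group_c))

# Wilcoxon rank sum test: problem detection rates, checklist vs no checklist.
with_checklist = [0.031, 0.042, 0.037, 0.050]
without_checklist = [0.040, 0.028, 0.055, 0.044, 0.036]
print(stats.ranksums(with_checklist, without_checklist))

# Trend across ordered categories (alerts per episode: 1, 2, 3, 4+) versus
# whether any action was taken (columns: action taken / no action).
counts = np.array([[2100, 900],
                   [700, 185],
                   [220, 35],
                   [140, 25]])
print(Table(counts).test_ordinal_association())
```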
RESULTS
A total of 49 facilities participated in this study. A summary of institutional demographics for the 47 reporting facilities is shown in Table 2. Results from the general questionnaire are shown in Table 3. Among 46 participants, one or more methods used for determining delta check limits were reported, including the literature (n = 37; 80.4%), laboratory director (n = 34; 73.9%), technical experience (n = 22; 47.8%), medical staff consultation (n = 19; 41.3%), simulation or review of historical laboratory data (n = 15; 32.6%), biological variation (n = 11; 23.9%), reference range (n = 7; 15.2%), and other (n = 3; 6.5%). When asked whether delta checks were less useful now than in the past, 18 of 46 study participants (39.1%) agreed with respect to improvements in laboratory instruments, and 9 (19.6%) agreed with respect to improvements in specimen identification procedures.
Analytes and Calculations Used for Delta Checks
Among 46 participants, the median number of different analytes used for delta checks was 15 and ranged from 6 to 32 for laboratories within the 10th to 90th percentile range. Table 4 shows the distribution of analytes and delta check calculations if reported by 10 or more participants. Other analytes used for delta checks included osmolality (n = 9 participants); red blood cell relative distribution width, anion gap, and lactate (n = 8); creatine kinase MB isoenzyme and prostate-specific antigen (n = 7); free calcium (n = 6); and D-dimer, hemoglobin A1c, high-density lipoprotein cholesterol, and reticulocyte count (n ≤ 5). Absolute differences and percentage change were used for delta check calculations; no participant used rate-of-change calculations for any analyte (a minimal sketch of these calculation types follows Table 4). Use of delta checks for some analytes was restricted to a specific range of results by a small number of participants (Table 4).
Delta Check Analytes and Criteria Used by Study Participants (N = 46)

Abbreviations: AST, aspartate aminotransferase; BUN, blood urea nitrogen; CO2, carbon dioxide; INR, international normalized ratio; MCH, mean corpuscular hemoglobin; MCHC, mean corpuscular hemoglobin concentration; MCV, mean corpuscular volume; WBC, white blood cell.
SI conversion factors: To convert BUN to millimoles per liter, multiply by 0.357; calcium to millimoles per liter, multiply by 0.25; cholesterol to millimoles per liter, multiply by 0.0259; glucose to millimoles per liter, multiply by 0.0555; magnesium to millimoles per liter, multiply by 0.4114; phosphorus to millimoles per liter, multiply by 0.323; bilirubin to micromoles per liter, multiply by 17.104; and creatinine to micromoles per liter, multiply by 88.4.
Delta check calculation restricted to a specific range of results.
Median change not reported for fewer than 6 reporting facilities.
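The sketch below illustrates the two calculation types participants reported (absolute difference and percentage change), along with an optional restricted result range as in Table 4. All thresholds and the example analyte are hypothetical.

```python
def absolute_delta(prev, curr, limit):
    """Absolute-difference delta check."""
    return abs(curr - prev) > limit

def percent_delta(prev, curr, limit_pct):
    """Percentage-change delta check."""
    if prev == 0:
        return False  # percentage change undefined from zero
    return abs(curr - prev) / abs(prev) * 100 > limit_pct

def in_restricted_range(value, low=None, high=None):
    """Apply the delta check only when the result falls inside the range."""
    return (low is None or value >= low) and (high is None or value <= high)

# Example: a hypothetical MCV rule using absolute change (fL), applied only
# to results between 60 and 120 fL.
prev_mcv, curr_mcv = 88.0, 96.5
if in_restricted_range(curr_mcv, 60, 120) and absolute_delta(prev_mcv, curr_mcv, 7.0):
    print("MCV delta check alert")
```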
Actions Taken in Response to Delta Checks
A total of 4514 testing episodes were reported that involved 6541 delta check alerts. Most of these involved a single analyte, although 885 (19.6%) involved 2 analytes, 255 (5.6%) involved 3 analytes, 102 (2.3%) involved 4 analytes, and 63 (1.4%) involved 5 or more analytes. Among 4508 testing episodes involving one or more delta check alerts, at least one action was taken in 3411 (75.7%). Table 5 shows the types of actions by specific analytes for which at least 20 evaluations, in aggregate, were undertaken in response to a delta check alert. Overall, a total of 5892 different investigations were reported by participants, of which the most common was clinical review, followed by inspection and repeat testing of the current specimen. In contrast, inspection and repeat testing of the initial specimen involved with a delta check alert was much less common (Table 5). Distribution of actions for all analytes involving delta check alerts within each laboratory is shown in Table 6.
Aggregate Actions Taken in Response to Delta Check Alerts by Analyte

Abbreviations: AST, aspartate aminotransferase; BUN, blood urea nitrogen; CO2, carbon dioxide; INR, international normalized ratio; MCHC, mean corpuscular hemoglobin concentration; MCV, mean corpuscular volume; PT, prothrombin time; RDW, red blood cell relative distribution width; WBC, white blood cell.
Distribution of Specific Actions Taken Within Each Laboratory as Percentage of All Delta Check Alerts

Percentage of actions taken per testing episode involving one or more delta checks.
A median of 15 analytes were used for delta checks among 46 laboratories. Participants (n = 17) who used a relatively higher number of different analytes (>20) for delta checks reported taking one or more actions more frequently (median = 87.0%; 10th–90th percentile range = 41.0%–100%) in response to a delta check alert compared with a median of 78.8% (10th–90th percentile range = 19.6%–100%) for the 29 laboratories that used 20 or fewer analytes. This difference did not reach statistical significance (P = .18, Kruskal-Wallis test). However, there was a significant association between the number of delta check alerts triggered per testing episode and the number of different actions taken (P = .001, Mantel-Haenszel χ2 test) (Figure 1).
Figure 1. Association between number of delta check alerts per testing episode and actions taken.
The median rate of any action taken in response to a delta check alert was 92.4% in laboratories (n = 8) that used a checklist compared with 84.3% in those (n = 38) that did not (P = .52). Similarly, the median rate was 92.4% in laboratories (n = 34) that had written criteria for handling delta check alerts compared with 78.0% in those (n = 11) that did not (P = .09, Kruskal-Wallis test).
Problems Identified by Delta Checks
The distribution of rates of testing problems per testing event by facility is shown in Table 7. One or more testing problems were identified in 217 of 4505 testing episodes involving delta check alerts (4.8%). Overall, 240 separate testing problems were reported by participants, including 183 (76.3%) and 57 (23.8%) involving the current and previous specimens, respectively (Table 8). Other potential problems associated with delta check alerts listed on the study worksheet that were not reported for any testing event included clerical errors (not related to data entry), interface/data transmission errors, specimen transport delays, mislabeled aliquots, and wrong specimen containers. Nearly all testing problems involved specimen collection issues (eg, hemolysis, contamination with intravenous fluids, or clotting). Only 12 mislabeled specimens were identified by delta check alerts. One case involved labeling errors on both the previous and current specimens. Among 18 different delta check alerts involving mislabeling errors, the most frequent analyte was hemoglobin (n = 6), followed by mean corpuscular volume (MCV; n = 4) and sodium (n = 3). The frequency of analytical errors detected by delta check alerts was 1.1 per 1000 testing episodes.
Distribution of the Percentage of Delta Check Alerts That Involved a Testing Problem Among Each Facility (N = 49)

Aggregate Testing Problems Associated With Delta Check Alerts

Abbreviation: IV, intravenous.
Table 9 shows the frequency of testing problems identified by individual analytes for which there were at least 25 delta check alerts. In a few cases, problems were identified in both previous and current specimens. Participants (n = 17) who used a relatively higher number of different analytes (>20) for delta checks uncovered significantly more testing problems (median = 5.0%; 10th–90th percentile range = 1.0%–16.1%) compared with a median of 3.1% (10th–90th percentile range = 0.0%–8.1%) for the 29 laboratories that used 20 or fewer analytes (P = .04, Kruskal-Wallis test). In addition, the likelihood of identifying a testing problem was significantly associated with a higher number of delta check alerts per testing episode (P < .001, Mantel-Haenszel χ2 test) (Figure 2).
Aggregate Testing Problems Suspected or Identified With Delta Check Alerts by Analyte Among All Facilities

Abbreviations: AST, aspartate aminotransferase; BUN, blood urea nitrogen; INR, international normalized ratio; MCHC, mean corpuscular hemoglobin concentration; MCV, mean corpuscular volume; PT, prothrombin time; RDW, red blood cell relative distribution width; WBC, white blood cells.
Some testing problems were identified in both previous and current specimens.
Figure 2. Association between number of delta check alerts per testing episode and testing problems identified.
The median rate of detecting testing problems in response to a delta check alert was 3.7% in facilities (n = 8) that used a checklist compared with 4.0% in those (n = 38) that did not (P = .52). Further, the median rate was 3.1% in facilities (n = 34) that had written criteria for handling delta check alerts compared with 5.0% in those (n = 11) that did not (P = .46, Kruskal-Wallis test).
Outcomes Associated With Delta Checks
Among 4500 testing episodes involving one or more delta check alerts, participants reported 4652 outcomes (Table 10). The most common outcome was reporting results without modification. The next most frequent outcome, occurring in about one-third of testing episodes, was taking no additional action other than adding a comment to the laboratory report. In addition, 369 results involving delta check alerts were reported as expected outcomes based on various patient conditions. The remaining 265 outcomes (5.9%) involved various corrective actions in which a testing problem was identified. In 136 cases (3.0%), results were not reported. One or more delta check alerts persisted in 50 of 72 recollected specimens (69.4%). Table 11 shows the frequency of outcomes involving corrective actions taken in response to testing problems by the specific analyte involved with delta check alerts.
Aggregate Outcomes Resulting From Delta Check Alerts Among All Facilities

Includes multiple responses per testing episode.
Delta Check Alerts Leading to Corrective Action for Suspected or Identified Testing Problems by Specific Analyte

Abbreviations: AST, aspartate aminotransferase; BUN, blood urea nitrogen; INR, international normalized ratio; MCHC, mean corpuscular hemoglobin concentration; MCV, mean corpuscular volume; PT, prothrombin time; RDW, red blood cell relative distribution width; WBC, white blood cell.
DISCUSSION
The use of delta check rules in laboratory medicine as a patient-based quality control method was introduced by Lindberg in 1967 as a new concept related to emerging technology in laboratory informatics.8 Application of this tool became practical in the late 1970s with the development of laboratory information systems that were capable of handling delta check calculations and alerting testing personnel about exceptions before results were reported.9–11 Besides quality assessment, delta checks have also been designed to detect conditions that may require immediate medical attention, such as a rapid rise in serum creatinine with acute kidney injury.12 However, use of delta checks for this purpose was not examined in this study.
Delta checks are now widely used by laboratories as a quality control technique. For example, among 37 study participants, 34 (91.9%) reported using delta checks as a part of their autoverification procedures. However, there is limited information about the effectiveness of delta check procedures for detecting problems other than specimen identification errors. Furthermore, the value of this quality control tool has been questioned.5 For example, among 46 participants responding to the study questionnaire, 18 (39%) believed delta checks were less useful than in the past because of improvements in laboratory instrumentation, although only 9 (20%) thought that delta checks had become less useful for detecting misidentified specimens. Concerns about the use of delta checks may be warranted because guidance is limited on selecting the most suitable analytes, setting actionable delta check thresholds, and choosing the procedures most effective for detecting problems.13 Studies primarily involve recommendations based on theoretical assessments and statistical models using estimates of biological variation and reference ranges, or simulations involving probabilities of detecting specimen mislabeling or mix-up errors.14–18 Despite the sparse information available about selecting delta check parameters, the most common method participants reported for setting delta check limits was the literature.
This Q-Probes study was undertaken to help participants better understand delta check practices and identify potential improvements. Results from this study also have wider applicability and further expand knowledge about the use of delta checks, including the types of analytes and calculations used in routine practice, the frequency and types of investigations performed, and types of problems identified in response to delta check alerts.
Analytes and Calculations Used for Delta Checks
The median facility used 15 different analytes for delta checks, but this number varied widely among facilities. An important finding observed in this study was that the use of a higher number of different delta check analytes detected significantly more problems but did not affect the frequency of actions taken. This may have been caused, in part, by a single problem triggering delta checks on multiple tests, which would have involved the same amount of time and effort to investigate as a single delta check alert. About 30% of testing episodes triggering a delta check involved multiple analytes. A higher number of different analytes used for delta checks would be expected to increase the sensitivity of detecting problems, although the quantity of false alerts might also be higher.
As expected, the most common analytes used for delta checks involved tests that were used for monitoring and that were frequently repeated, such as those in electrolyte and hematology panels. Other analytes such as cholesterol, prostate-specific antigen, or hemoglobin A1c, which are infrequently retested during short (<7 days) periods of time, have less application for delta checks and were understandably used by only a few participants.
Although the use of delta check calculations involving rate of change or multivariate methods has been described,6,19,20 all study participants used only absolute and percentage change calculations. The median changes reported by participants for absolute and percentage calculations were similar to the criteria recommended by Kim et al,21 except that those investigators suggested using rate of change for enzymes and renal function tests (creatinine and blood urea nitrogen). Recommended maximum time intervals for calculating delta checks range from 2 to 5 days21 to 5 to 7 days,22 which is comparable to the median range of times (3–7 days) used by study participants for various analytes. More complex delta check calculations may not have been used by participants because of insufficient support by laboratory information systems, added difficulty, or unproven value. Lacking guidelines or strong supportive evidence, variation in preferences for delta check calculations was expected.19
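For illustration, a rate-of-change calculation of the kind described in the literature, but used by no participant, normalizes the delta by the elapsed time between results. The sketch below assumes a hypothetical creatinine threshold; it is not a rule from this study or from the cited recommendations.

```python
from datetime import datetime

# Hypothetical rule: flag a creatinine change faster than 0.5 mg/dL per
# 24 hours (illustrative threshold only).
RATE_LIMIT = 0.5  # mg/dL per day

def rate_of_change_alert(prev, prev_time, curr, curr_time):
    """Return True if the per-day rate of change exceeds the limit."""
    elapsed_days = (curr_time - prev_time).total_seconds() / 86400
    if elapsed_days <= 0:
        return False
    return abs(curr - prev) / elapsed_days > RATE_LIMIT

print(rate_of_change_alert(1.0, datetime(2014, 5, 1, 6),
                           2.1, datetime(2014, 5, 2, 6)))  # True
```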
Actions
The likelihood of uncovering a testing problem or another reason for a delta check alert depends, in part, on the thoroughness of the investigation, which may require a significant amount of time and effort and delays reporting of results. A worksheet for documenting evaluations was provided to study participants, which may have biased results toward more rigorous evaluations. Nevertheless, there was substantial variation in the extent and types of actions taken in response to a delta check alert. Surprisingly, nearly a quarter of all testing episodes involving a delta check alert led to no investigation at all, although this was highly variable among laboratories, which ranged from taking some action nearly all the time to checking fewer than 20% of testing events. Use of standardized procedures, such as a checklist or a flow chart, would be expected to improve consistency and promote thoroughness of investigations. Some participants used a checklist and many reported having written procedures for handling delta check alerts, but neither was associated with the frequency of investigations or problems identified. This could be due to study sample size, compliance with procedures, or their effectiveness. These observations suggest that additional guidance and standardization are needed for investigations of testing errors associated with delta check alerts.23
The quantity of different analytes that triggered delta check alerts during any testing episode was significantly associated with the total number of actions taken and testing problems identified (Figures 1 and 2). These results support using higher numbers of different analytes for delta checks. This finding is also consistent with previous studies that show added value from multivariate delta check methods.6 However, such calculations were not used by any participant in this study and may therefore have less practical value than expected, perhaps because of the limited capabilities of current laboratory information systems. Nevertheless, innovative methods using complex formulas on multiple analytes show promise for improving the value of delta checks, especially for detection of mislabeling problems.24
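As a simple illustration of this idea, an episode-level rule can escalate the investigation when several analytes alert at once, consistent with the trend in Figures 1 and 2. This sketch is hypothetical and is not a method any participant reported; published multivariate approaches use more sophisticated combined calculations.

```python
# Hypothetical escalation rule: the more analytes that alert within a single
# testing episode, the more likely a real testing problem, so multi-alert
# episodes are prioritized and both specimens are checked.
def triage(alerted_analytes):
    n = len(alerted_analytes)
    if n >= 3:
        return "high priority: check current AND previous specimens"
    if n == 2:
        return "medium priority: check current specimen, consider previous"
    return "routine review"

print(triage(["hemoglobin", "MCV", "sodium"]))  # high priority
```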
The most frequent action taken in response to a delta check alert involved clinical review. This finding was expected because the most common problems associated with delta check alerts involved clinical conditions rather than laboratory testing errors. The second most frequent action was retesting the current specimen. Studies involving repeat testing to confirm critical value results show that this method rarely identifies analytical problems. For example, the likelihood of a significantly different result upon retesting for a critical serum potassium value is 1.9 per 1000 tests.25 Similarly, repeat testing to detect analytical errors in response to a delta check alert also has low productivity, as shown in another study in which the analytical error rate was 2.8 per 1000 repeat chemistry tests involving delta check alerts.4 This is similar to the analytical error rate of 1.1 per 1000 testing events (not tests) reported in this study under less controlled conditions. However, analytical errors were probably underestimated because current specimens were retested only 25% of the time and previous specimens much less often. Relative to other types of evaluations, retesting specimens detected very few testing errors. Checking the specimen for labeling or other problems (eg, hemolysis) was the third most common action in response to delta check alerts and had more impact on detecting errors because these were more common problems than analytical measurement errors, which are primarily identified by retesting.
Because a problem might be equally likely in either specimen involved with a delta check alert, both should be evaluated whenever possible. Nevertheless, the current specimen was retested 42 times more often than the prior specimen and was checked for labeling or other problems 31 times more often. This occurred despite a worksheet provided to study participants that included a separate area to document actions taken on the previous specimen. The reasons why previous specimens were less often evaluated for errors are unknown. This could have been due, in part, to known problems with the reliability of retesting related to specimen stability, storage, or retention conditions, especially for certain analytes such as carbon dioxide. Other factors, such as the inconvenience of retrieving archived specimens or the lack of standardized protocols for investigating delta check alerts, may have also played a role. In spite of the disproportionate attention given to evaluating only the most current specimen involved with a delta check alert, 23.8% of all testing problems were associated with the prior specimen. In a study4 involving 9831 specimens retested because of delta check alerts, only the current specimen was evaluated for analytical errors. It is therefore likely that testing problems detected by delta check alerts are underestimated by this and previous studies. These results also suggest that, in practice, some testing errors go undetected because the initial specimen is not evaluated, which lessens the effectiveness of delta checks. This observation indicates an opportunity to improve laboratory practices by developing protocols that include, whenever possible, assessment of both specimens involved with a delta check alert.
Problem Detection
Previous studies have shown that a testing problem is uncovered in about 0.4% to 6.0% of testing episodes involving a delta check alert.9,21 This is comparable to the aggregate (4.8%) and median (4.0%) laboratory rates observed in this study and falls within the 80th central percentile range (0.0%–9.0%) reported by all facilities. Some variation is likely related to laboratory practices; for example, about one-quarter of participants did not have written criteria for handling delta check alerts. Problem detection rates also varied widely among analytes used for delta checks. Among chemistry tests, certain electrolytes (sodium, potassium, and magnesium) showed the highest problem detection rates. In contrast, tests of renal function (creatinine, blood urea nitrogen) were relatively insensitive, presumably because of the higher relative frequency of treatment effects (eg, dialysis) on these tests. Among the hematology analytes, delta check alerts involving red blood cell relative distribution width, mean corpuscular hemoglobin concentration, and mean corpuscular hemoglobin were more effective than those involving hemoglobin or platelet counts. Calculated mean corpuscular hemoglobin and mean corpuscular hemoglobin concentration indices on automated cell counters can be falsely elevated, and thus trigger a delta check alert, because of interference in hemoglobin measurements caused by lipemia or falsely low red blood cell counts caused by cold agglutinins.13 Among coagulation tests, an interesting finding was that problem detection rates associated with delta check alerts were much higher for the international normalized ratio than for the prothrombin time, from which the international normalized ratio is derived. International normalized ratio results are especially prone to false elevation because of specimen collection errors, which would more likely trigger delta checks.26 Further, the international normalized ratio was the only delta check analyte associated with more identified testing problems in previously collected specimens than in current ones, even though current specimens were checked at a much higher rate.
Several factors might contribute to the variation among analytes in sensitivity for detecting testing errors by delta checks. These include differences in the time intervals and cutoff thresholds used, the intensity of investigations, inherent properties (biological stability, analytical precision), and the relative frequency of nontesting causes (eg, treatment). This information has practical value for selecting analytes for delta checks because some were observed to be more effective than others for identifying testing problems.
Nearly all testing errors detected by delta check alerts in this study were in the preanalytical phase of testing, including interference by hemolysis, lipemia, or icterus; contamination by intravenous fluids; and effects of clotting. Only 5 of 240 testing problems (2.1%), or 0.1% of all 4514 testing episodes that triggered a delta check alert, involved an analytical testing problem. Furthermore, only 2 of 240 testing problems (0.8%) involved a postanalytical calculation or data entry error. Previous studies have shown that the majority (53%–75%) of testing errors identified by any method occur in the preanalytical stage of testing.27 However, the relative frequencies of the remaining analytical (16%–31%) and postanalytical (9%–30%) causes of testing errors are much higher than those observed for the types of problems detected by delta checks.
The problems most frequently identified by delta check alerts were patient-related issues that were not related to testing. These included physiological changes or treatment (blood transfusion or dialysis), which may be due, in part, to the population (inpatients) for which delta checks are most applicable. Large physiological changes that are unrelated to the testing process could nevertheless affect interpretation of test results if the cause is not recognized.
Use of delta checks is considered an important method for identifying specimen mislabeling errors.28 In this study, misidentification errors were detected by delta check alerts in 12 of 4505 testing episodes (0.3%). This is within the range estimated from studies involving simulated specimen mix-ups in an inpatient population.15,18 Another study29 using label image recognition technology detected labeling errors in only 0.01% of cases in an outpatient population, for which delta checks are not beneficial. Detection of misidentified specimens is highly dependent on the prevalence of mislabeling and the number and type of delta checks in use, as well as the extent of investigations. Nevertheless, labeling issues are a well-documented cause (10.1%–16.3%) of laboratory errors as measured by incident reports,30,31 and some may be found only by use of delta checks.
Outcomes and Quality Management
The most common outcome resulting from a delta check alert was issuing a standard report without modifications. The next most frequent outcome was placing an additional comment on the test report even though no testing problem was identified. Although information about the content of these comments was not solicited from participants, providing specific information about the probable cause of a delta check alert (eg, blood transfusion, dialysis) may be helpful for interpreting results and should be considered a best practice.
Fewer but more clinically significant outcomes occurred when delta check alerts identified testing errors that led to specimen recollection and prevented reporting of possibly incorrect results. Very few cases entailed issuing a corrected report, although more would likely have occurred if previously tested specimens had been evaluated more often. Of note, a large proportion of delta check alerts persisted upon retesting. This could have been due to an unrecognized problem with the initial test that was not investigated, an unexplained preanalytical event, or an unidentified change in the patient's condition that affected test results.
Only 19 of 46 participants (41.3%) reported monitoring problems identified by delta check alerts as a quality management tool. Although various other quality systems may be used to track specimen collection errors, some problems, such as contamination by intravenous fluids, might be detected only by a delta check alert. Monitoring the trends and causes of delta check alerts in laboratory quality assessment programs would be expected to strengthen performance improvement, especially for assessing preanalytical errors that occur in the inpatient setting. This study also showed that in 8 of 45 participating laboratories (17.8%), the laboratory director was not involved in selecting or approving changes to delta check parameters. This might be explained if delta checks are not viewed as part of the laboratory's quality management program, or if oversight had been delegated by the laboratory director, who was then reported as not involved. Laboratory directors should be engaged with how delta checks are selected and used because of their impact on laboratory operations and quality.
Summary
In summary, this study revealed several important and practical findings that could improve the use of delta checks. First, using a larger number of different analytes for delta checks appears to have more value than using only a few. Results from this study may also assist laboratories in reviewing or revising the analytes and criteria used in their delta check procedures. In addition, standardized and documented methods for investigating delta check alerts, such as checklists customized by analyte to prioritize actions, are recommended. These improvements are expected to increase the number of problems detected while streamlining the evaluation process. Current practices should be changed to include, whenever possible, examination of the previous specimen involved with a delta check for potential problems. Because of the low rate of analytical errors detected by delta check alerts, the value of automatically retesting specimens, as commonly performed by study participants, should be reexamined, especially for chemistry tests, and discontinued if warranted. Finally, consideration should be given to monitoring the causes and outcomes of delta check alerts as part of the laboratory's overall performance improvement program.
References
Author notes
The authors have no relevant financial interest in the products or companies described in this article.