Abstract
Objectives.—To describe longitudinal trends in the efficiency, labor productivity, and utilization of clinical laboratories in the United States.
Methods.—Financial and activity data were prospectively collected from 73 clinical laboratories continuously enrolled in the College of American Pathologists Laboratory Management Index Program from 1994 through 1999. Each laboratory reported quarterly on its costs, labor inputs, and test activity using uniform data definitions.
Results.—During the 6-year study period, there was a significant increase in laboratory labor productivity (2.1% more tests/full-time equivalent/y; P < .001). Productivity increases were offset by increasing labor expense (1.7%/full-time equivalent/y; P < .001), consumable expense (1.7%/on-site test/y; P = .005), and blood expense, which comprised more than 10% of laboratory expenses by 1999 (4.4% increase/y; P < .001). As a result, overall expense per test showed no significant change in non–inflation-adjusted dollars. Reference laboratory expense per test did not change significantly during the study period; the proportion of tests sent to reference laboratories grew slightly (0.06% increase/y; P < .001). Test volume of the median laboratory grew by 21 768 tests per year (2.3% annual increase; P < .001), while the proportion of testing from inpatients declined by 1.7% per year (P < .001). Inpatient test utilization declined on a discharge basis (annual decline of 1.2 tests/inpatient discharge; P < .001) and on a per diem basis (annual decline of 0.08 tests/inpatient day; P = .002). Inpatient laboratory expense declined on a discharge basis (annual decline of $2.40 or 1.3% per discharge; P < .001), but did not change significantly per inpatient day. Most of the reduction in expense per discharge occurred during 1994–1996.
Conclusions.—Between 1994 and 1999, clinical laboratories in the United States experienced significant changes in the cost of operations, utilization, and labor productivity. Laboratory administrators who compare local institutional performance with that of peers are advised to use current or forward-trended peer data. Quarter-to-quarter improvement in many measures of laboratory financial activity may not signal a superior operation, as performance of the whole industry appears to be improving.
Financial management of clinical laboratory operations requires accurate and timely information about laboratory test activity, staffing, and expense. Most commonly, laboratory financial performance is compared with a budget or with past performance. Many laboratory managers also compare their laboratory's performance with that of other institutions.
Both internal trending and external comparisons are informed by an understanding of how the clinical laboratory industry is changing. To the extent that productivity of the laboratory industry is improving, laboratory managers should not be satisfied with maintaining constant performance in their own operations. In addition, external comparisons that rely on dated peer data may be misleading, creating a false sense that an institution's productivity relative to peers is better than it actually is.
Little information about secular trends in the expense, productivity, and utilization of clinical laboratories has been published. Some relevant knowledge is possessed by consultants and business advisors who have worked with individual laboratories to measure and improve productivity, but accumulated information is often considered proprietary and not presented in peer-reviewed publications. Furthermore, the set of clinical laboratories that have engaged consultants may not be representative of the industry as a whole.
Most of the published literature relating to trends in laboratory productivity concerns individual institutions. One consolidated laboratory reported on a 5-year period in which outpatient activity grew by 10% per year, inpatient by 2% per year, and cost per test declined by 1.6% per year.1 A consultancy reported that the average cost per test for hospitals with fewer than 300 beds rose by 1% per year, while the increase was 3% per year at larger institutions.2 Another consultancy reported that outpatient test activity from physician offices increased at an annual rate of 10%, while volume from acute care hospital laboratories rose by 4% to 7% per year.3 Three fourths of 28 surveyed laboratories in Minneapolis-St Paul, Minn, reported that 1995 workload had increased over the previous 5 years.4
This investigation was designed to describe trends in the expense, productivity, and utilization of a large group of clinical laboratories in the United States. Data were derived from the College of American Pathologists (CAP) Laboratory Management Index Program (LMIP), which is a financial benchmarking tool that permits laboratory managers and directors to track the financial performance of their facilities over time and to compare their performance with that of appropriate peer institutions. The LMIP is a voluntary subscription service, and facilities have joined and departed from the program since its inception. Because the addition or departure of laboratories could obscure secular trends in laboratory efficiency and utilization, we chose to analyze data from the subset of LMIP institutions that participated continuously in the program from the first quarter of 1994 through the fourth quarter of 1999. Seventy-three laboratories met this criterion.
METHODS
Study Population and Data Collection
Data were collected using a previously described format.5 Briefly, participants in the LMIP program annually completed a questionnaire describing their institution and the patient population they served. On a quarterly basis, participants reported their clinical laboratory costs, labor inputs, test activity, and hospital discharge and census activity. Enrollment in the LMIP program is voluntary; participants initially commit to the program for 1 year and reenroll the following year if they believe the program continues to serve a useful purpose. Furthermore, some participants who enroll in the program do not submit data every quarter. From the LMIP subscriber base, 73 participants were identified who submitted quarterly data continuously from the first quarter of 1994 through the fourth quarter of 1999. Data from these laboratories formed the basis of this investigation. The identity of individual laboratories was not revealed to the outside study investigators (P.N.V. and R.B.L.), in conformance with CAP policies regarding data confidentiality.
Definitions
To ensure comparability of participant responses, LMIP established standard definitions and procedures for recording labor inputs (paid and worked hours), nonlabor expenses (consumables, blood, reference laboratory costs, equipment maintenance, and depreciation), hospital activity (days and discharges), and test counts (LMIP Standardized Billable Test). Definitions of LMIP terms are cited elsewhere5 and were used throughout this study. The term test in this article corresponds to the LMIP Standardized Billable Test. All definitions remained constant throughout the study period, with the exception of the methodology for counting tests. Prior to 1997, billable test counts were used for counting tests. Beginning in 1997, tests were counted using the LMIP Standardized Billable Test definition. The principal characteristic of the LMIP Standardized Billable Test is that chemistry profile tests are “unbundled” and their reportable elements tabulated individually. Billable test counts from 1994 through 1996 were converted to the LMIP Standardized Billable Test counts prior to analysis, using a conversion formula developed by the CAP and validated for each study laboratory. To ensure that the conversion of 1994–1996 billable test counts to LMIP Standardized Billable Tests did not bias results, the statistical analyses reported in this study were repeated on a second set of 129 LMIP participants who continuously submitted quarterly data from 1997 through 1999, a period in which the LMIP Standardized Billable Test was used exclusively. All statistically significant changes involving test counts that were identified in the 1994–1999 study population were also identified in the second data set.
The US Bureau of Labor Statistics medical care services series (CUUR0000SAM2) was used to estimate the rate of medical services inflation, while the all-items consumer price index for urban consumers (series CUUR0000SA0) was used to estimate general inflation. During the 6-year study period, medical services inflation averaged 4.62% per year, while general inflation averaged 2.57% per year. When trends in inflation-adjusted dollars were calculated, nominal expense was divided by the inflation series coefficient for the middle month of the quarter before regression analysis.
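The deflation step described above amounts to dividing each quarter's nominal figure by the price-index value for the quarter's middle month. A minimal sketch, using invented index and expense values (not actual BLS or LMIP data) and normalizing the index to the first quarter:

```python
# Deflate nominal quarterly expense per test by a price index
# (all values below are illustrative, not actual series data).
index_mid_month = [100.0, 101.1, 102.3, 103.5]  # hypothetical index levels
nominal_expense = [7.00, 7.00, 7.00, 7.00]      # nominal dollars per test

# Normalize the index to the first quarter, then divide.
real_expense = [
    exp / (idx / index_mid_month[0])
    for exp, idx in zip(nominal_expense, index_mid_month)
]
```

Under this convention, expense that is flat in nominal dollars appears as a decline in deflated dollars, which is the pattern reported in the Results.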
Statistical Analysis
Participants' quarterly data submissions were prospectively subjected to several internal validity checks incorporated into the LMIP program. Submitted activity counts that did not sum to separately reported totals were rejected. Input values that deviated by more than 25% from the previous quarter's submission, or that were identified as unusual by a proprietary multivariate validity check, were flagged. Participants were required to resubmit rejected data and were instructed to reexamine flagged data and submit corrected values if errors were identified.
Quarterly medians for selected activity, expense, utilization, and productivity metrics were calculated and fitted with a simple linear regression model using time (quarter) as the predictor variable. When a change in performance was expressed as a percentage, the average quarterly performance of the median laboratory during 1999 was used in the denominator.
Several LMIP participants did not answer all of the complexity questions designed to characterize their laboratory and the patient population the laboratory served. These institutions were excluded only from tabulations that required the missing data element. χ2 (categorical variables) and Kruskal-Wallis χ2 (continuous variables) were used to compare LMIP study institutions with other LMIP participants.
RESULTS
Participant Characteristics
Characteristics of the 73 study institutions are shown in Table 1. Eighty-two percent were nonteaching and 86% were nongovernmental. The median institution processed 953 023 tests per year and had 10 128 discharges. Compared to other LMIP participants, study institutions were significantly larger, had more discharges, and had a higher LMIP complexity index, which is a rating system used to measure the presence of complex hospital or laboratory services that are associated with increased laboratory cost.
Because institutions enter and leave the LMIP program, the percentile rank of the median study laboratory among all LMIP subscribers was tracked to determine how study participants compared with the broader subscriber base. The median study laboratory's labor productivity (tests/full-time equivalent [FTE]) was at the 56th percentile of all LMIP subscribers in the first quarter of 1994 (slightly more productive than other subscribers), but dropped to the 36th percentile by the fourth quarter of 1999. Its on-site cost per on-site test was at the 43rd percentile in the first quarter of 1994 (less expensive than other subscribers), but rose to the 63rd percentile by the fourth quarter of 1999. Both changes in rank order were statistically significant (P < .001).
Laboratory Activity
During the 6-year study period, the test volume of the median laboratory grew by 21 768 tests per year (2.3% annual increase; P < .001), while the proportion of testing from inpatients declined by 1.7% per year (P < .001; Table 2 and Figure 1).
Overall Laboratory Productivity and Expense
Overall expense per test, a summary measure of laboratory efficiency, showed no significant change for the median laboratory in non–inflation-adjusted dollars. After adjustment for medical services inflation, expense per test declined by $0.64 per test per year, or 3.97% per year (P < .001; Table 2 and Figure 2).
Comparable trends were seen for on-site tests. On-site laboratory expense per on-site test showed no significant change in non–inflation-adjusted dollars. After adjustment for medical services inflation, on-site expense per on-site test declined by $0.56 per test per year, or 3.47% per year (P < .001).
Labor Productivity and Expense
During the 6-year study period, labor productivity showed a statistically significant annual increase of 2.06% (330 more on-site tests/FTE/y; P < .001). Labor productivity increases were partially offset by increases in the median laboratory's average wage. The average wage increased 1.68% per year ($664/employee/y; P < .001). The increasing average wage reflects a combination of wage inflation and change in mix of laboratory employees with different wages (Table 2 and Figure 3).
Consumable and Blood Expense
Consumable expense per on-site test increased by 1.7% per year (P = .005). Blood expense increased by 4.4% per year (P < .001) and accounted for more than 10% of total laboratory expense by 1999 (Table 2).
Referred Testing
Reference laboratory expense per test did not change significantly during the study period (Table 2). The proportion of tests sent to reference institutions increased during the study period (Figure 5), but the median facility still sent less than 2% of its test activity to reference facilities at the conclusion of the study.
Inpatient Utilization and Expense
Inpatient test activity declined on a discharge basis (annual decline of 1.2 tests/inpatient discharge; P < .001) and on a per diem basis (annual decline of 0.08 tests/inpatient day; P = .002; Table 2 and Figure 6). Inpatient laboratory expense declined on a discharge basis (annual decline of $2.40, or 1.3%, per discharge; P < .001), but did not change significantly per inpatient day (Figure 7). Most of the decline in inpatient expense per discharge occurred during the first 3 years of the study period. In an analysis of 129 participants who were continuously enrolled in LMIP between 1997 and 1999, no significant change in expense per discharge was observed (P = .54).
COMMENT
Workload, expense, and productivity trends reported for the laboratory industry have varied widely, creating demand for a comprehensive summary of industry trends using a set of uniform data definitions. To our knowledge, this is the first national, multi-institutional, longitudinal study of clinical laboratory productivity and utilization reported in the peer-reviewed literature during the past decade.
The principal finding of this study was a substantial and statistically significant increase in labor productivity observed during the 6-year study period. Because LMIP defines staff (FTE) on an hourly basis (1 FTE = 2080 paid h/y), the increase in labor productivity can be attributed to increased throughput per hour employed, and not to staff working longer hours. We were unable to determine whether the increased staff throughput per hour was accomplished as a result of improved automation, or whether laboratory staff were expending more effort each hour they worked.
Labor productivity increases were offset by increases in the average laboratory wage, increasing cost of consumables, and increasing cost of blood and blood products. Overall, the cost per on-site test, which represents the cumulative expense attributable to labor, consumables, blood, and equipment depreciation, remained constant in nominal dollars during the study period. When expressed in medical services inflation-adjusted dollars, cost per on-site test declined 3.5% per year. These efficiency gains are somewhat higher than those reported elsewhere,1,6 but are lower than one reported figure from 19957 and a second from 1996.8
Although the average wage increased during the study period at a rate of 1.7% per year, the rate of increase was less than inflation for medical care services during the same period, and was also less than increases in the consumer price index. This may indicate that laboratory wages were not keeping pace with inflation, that laboratories shifted their staffing mix toward workers with lower average wages, or both. We were unable to distinguish between these possibilities with the data available. Laboratories with a lower proportion of medical technologists have reported a lower average wage that is only partially offset by lower labor productivity.9
Overall, hospital-based laboratories experienced increased test activity during the study period, coupled with a pronounced shift away from inpatient work and toward outpatient activity. This trend has been widely reported.
The use of reference laboratories increased slightly during the study period. However, even during the last year of the study the median laboratory reported sending fewer than 2% of tests to off-site facilities for analysis. The reference laboratory expense per test (expressed in nominal dollars) did not change significantly during the study, a finding that parallels our observation of constant on-site expense per test.
Inpatient test utilization declined significantly on a discharge basis and on a per diem basis, reversing an upward trend that had been reported previously in studies from individual institutions.6 Because the acuity of inpatients during the study period is generally considered to have increased, the decrease in inpatient laboratory utilization suggests more parsimonious use of the laboratory by clinicians. This trend resulted in significantly decreased laboratory expense on a discharge basis (1.3% decrease per year), although there was no significant change in inpatient laboratory cost per inpatient day. After correction for inflation, the decrease in inpatient expense per discharge was even more pronounced. Most of the reduction in expense per discharge occurred during the first 3 years of the study period (1994–1996); there was no significant decrease observed in the last 3 years (1997–1999). We cannot explain this finding.
Five limitations of this study must be borne in mind. First, we do not know how well institutions included in the study represented the general population of hospital-based clinical laboratories. A variety of innovative laboratory configurations are emerging that may not be adequately represented in the study group of 73 facilities.7,10 Participation in the CAP LMIP is voluntary, and institutions that elected to participate in the program may not have been typical of other clinical laboratories. Furthermore, there is some evidence to suggest that institutions that participated continuously in LMIP for 6 years differed from the remainder of the LMIP laboratory population. Study institutions handled significantly larger test volumes, experienced more annual discharges, and were of higher complexity than LMIP participants who did not continuously submit quarterly data during the 6-year study period. Moreover, other participants in the LMIP program at the beginning of the study period tended to have somewhat lower labor productivity and a higher cost per test than the 73 study laboratories, whereas at the end of the study period other LMIP participants had a slightly higher labor productivity and a lower cost per test than the study institutions. These observations raise the possibility that study laboratories did not improve their overall costs and labor productivity at the same pace as other institutions. If this is the case, then the data in this report may understate the accomplishments of the industry as a whole.
Second, it is important to emphasize that this study focused on secular trends in performance and not on the absolute levels of performance that are being achieved by representative hospital-based clinical laboratories. Extensive work by the CAP5 and others9 has shown that laboratory efficiency is significantly impacted by unmanageable variables that are outside the control of laboratory managers, such as the presence or absence of teaching programs in the hospital or differences in inpatient case mix that are reflected in the need for specialized, expensive laboratory services. Interlaboratory comparisons should be restricted to peer groups composed of laboratories that share similar unmanageable characteristics. The CAP's LMIP uses a tree-based modeling approach to identify significant unmanageable characteristics and a proprietary grouping methodology to select suitable laboratories for comparison. The data presented here speak to industry trends and not to absolute levels of performance that individual institutions should strive to achieve.
Third, this investigation was restricted to an analysis of cost elements under the control of laboratory managers. These cost elements included laboratory staffing and wages, consumable expense, the cost of blood and blood products, equipment depreciation and maintenance, and reference laboratory expenses. Many indirect costs, including allocated facility expense, the cost of a laboratory information system, and human resources and billing expenses were not included in this study.
Fourth, it must be emphasized that results reported here are not comparable to results obtained from other sources that do not use LMIP data definitions for test counts, expense, and labor. Of particular importance is the LMIP practice of unbundling chemistry profiles to arrive at stable test counts (the LMIP Standardized Billable Test). Labor productivity ratios (tests/FTE) that rely on bundled test counts will be significantly lower, owing to lower test counts in the numerator. Similarly, expense ratios (cost/test) will be significantly higher when bundled test counts are used.
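The arithmetic behind this caution can be illustrated with invented figures (none drawn from LMIP data): unbundling a chemistry profile into its reportable elements raises the test count, which raises tests/FTE and lowers cost/test even though the underlying work and expense are unchanged.

```python
# Hypothetical laboratory: the same work and cost counted two ways.
panels = 10_000           # chemistry profiles performed
analytes_per_panel = 7    # reportable elements per profile (assumed)
other_tests = 50_000
ftes = 10.0
total_cost = 300_000.0    # dollars

bundled = panels + other_tests                          # each profile counts as 1 test
unbundled = panels * analytes_per_panel + other_tests   # LMIP-style unbundled count

tests_per_fte = {"bundled": bundled / ftes, "unbundled": unbundled / ftes}
cost_per_test = {"bundled": total_cost / bundled, "unbundled": total_cost / unbundled}
```

In this example, productivity doubles and cost per test halves purely as an artifact of the counting convention, which is why ratios built on bundled counts cannot be compared with LMIP ratios.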
Finally, the trends reported here should not be used by laboratory providers or payers to set fees or payment levels. Laboratory providers have remarked that laboratory fees have not kept pace with the consumer price index, while payers have argued that productivity increases in the industry should lead to lower fees. This study was restricted to an analysis of laboratory-manageable costs, which comprise a fraction (typically around 50%) of the fully absorbed expenses of a hospital-based clinical laboratory. Without data regarding trends in indirect costs outside the control of most laboratory managers, we cannot make any recommendations about future fees or payment levels.
Despite these study limitations, our data demonstrate that clinical laboratories in the United States have experienced significant changes in the cost of operations, utilization, and labor productivity. Laboratory managers wishing to compare local institutional performance with peer norms should eschew dated data sets derived from old surveys or consulting engagements. Where current data are not available, dated peer data should be forward-trended, although in some areas (such as laboratory cost per discharge) historic trends may no longer be operative. Quarter-to-quarter improvement in many measures of laboratory financial activity may not signal a superior operation, as performance of the whole industry appears to be improving.
References
Author notes
Reprints: Paul N. Valenstein, MD, Department of Pathology, St Joseph Mercy Hospital, Ann Arbor, MI 48106-0995 ([email protected]).