Consolidation of clinical microbiology laboratory services has resulted in extended transit times for blood cultures from service points distant from the laboratory. Because sepsis is a critical condition, delays in identifying the etiologic agent of disease could adversely impact patient care.
The primary objective was to examine the effect of total preanalytic time and blood culture volume on the instrument time-to-detection of bacterial pathogens in blood cultures. A secondary objective was to obtain relevant blood culture information by questionnaire.
Participants in this Q-Probes study recorded date, time, and volume information for the first 50 positive blood cultures collected during the 12-week study period. Additional information regarding blood culture collection practices was obtained through a questionnaire.
Prolonged overall time-to-detection was secondary to prolonged preanalytic time, particularly prolonged transit time, rather than to slower organism growth once bottles were placed on the instrument. Among 1578 blood cultures, the overall time from collection to positive result was significantly shorter for blood cultures collected on-site than for those collected off-site. Most institutions lack sufficient training programs and do not monitor preanalytic time metrics associated with blood cultures. Of the 1580 blood cultures with blood volume adequacy reported, 456 (28.9%) were inadequately filled.
Overall process time (specimen collection to positive blood culture detection) is expected to be longer for blood cultures collected off-site. Transit time is a variable that can be reduced to decrease the overall time to detection. Thus, improved training and closer attention to the preanalytic metrics associated with blood cultures could shorten hospital stays and reduce mortality rates.
The incidence of sepsis and septic shock has increased in the United States, where more than 1.5 million people develop sepsis each year and about 250 000 people die annually from the disease.1 Sepsis was listed among the causes of death for 2 470 666 individuals with multiple contributing conditions (6% of all deaths) during 1999–2014. During this period, the annual number of sepsis-related deaths increased 31%, from 139 086 deaths in 1999 to 182 242 deaths in 2014.2 The incidence of septic shock in 2009 was estimated at 78 cases per 100 000 US adults, an increase from 12.6 cases per 100 000 in 1998.3 Septicemia was the sixth most common principal diagnosis for US hospitalizations in 2009, accounting for 836 000 hospital stays; an additional 829 500 hospitalizations identified septicemia as a secondary diagnosis.4 Morbidity and mortality from sepsis and septic shock have improved but remain high despite adoption of the therapeutic guidelines provided by the Surviving Sepsis Campaign.5
In addition to the appropriate use of empiric antimicrobial therapy for patients at risk for sepsis, the timely and accurate detection of bacteremia remains critical to improving patient outcomes by identifying a specific bacterial pathogen and guiding antimicrobial therapy. Delays in delivering blood culture specimens to the laboratory have the potential to prolong the time to detection of bacterial pathogens by blood culture instruments and, subsequently, the reporting of the pathogen's identity to the caregiver.6–8 Additionally, inoculating an insufficient volume of blood into blood culture bottles diminishes the sensitivity of the blood culture and could potentially affect the time to pathogen detection by blood culture instruments. In turn, prolonged time to detection and subsequent identification of bacterial pathogens may delay appropriate antibiotic therapy when the empiric antimicrobial coverage selected is insufficient. Inappropriate antimicrobial selections adversely impact patient care, and the overuse of broad-spectrum antibiotics promotes antimicrobial resistance, undermining antimicrobial stewardship efforts. Therefore, the study of factors that may adversely affect the performance of blood cultures is warranted.
MATERIALS AND METHODS
Study Objectives
The primary purpose of this study was to examine the effect that total preanalytic time and blood culture volume have on the instrument time-to-detection of bacterial pathogens in blood cultures. As secondary objectives, institution demographics and laboratory practices were examined for potential associations with institution median instrument time-to-detection, aggregate instrument time-to-detection, and aggregate overall processing time-to-detection.
Data Collection
The first positive blood culture bottle from a set that yielded a significant bacterial pathogen from an individual pediatric or adult patient was included in this study. Subsequent blood culture bottles that signaled positive from a patient with prior positive blood culture bottles were excluded. Blood cultures that were considered contaminated and blood culture bottles that contained body fluids other than blood (eg, peritoneal fluid) were also excluded.
Participants in this Q-Probes study recorded the date, time, and volume information for the first 50 positive blood cultures collected during the 12-week study period. Data sources included collection time written on the bottles or uploaded on handheld devices with laboratory system connectivity, receipt in section records, instrument records, and documentation in the laboratory information systems. If an institution did not encounter 50 positive blood cultures during the study period, then retrospective data were allowed to be entered.
The blood culture time data collected included the following time points: (1) specimen collection time (ie, blood culture draw time); (2) the time of specimen arrival in the laboratory (ie, receipt time); (3) the time the specimen was loaded into the instrument; and (4) the time the blood culture bottle signaled positive (ie, detection by the instrument). Total preanalytic testing time is defined as the length of time between time points 1 and 3, instrument time-to-detection is defined as the length of time between time points 3 and 4, and overall processing time-to-detection is defined as the length of time between time points 1 and 4. Additional information collected for each blood culture specimen included patient type (adult or pediatric), collection site (on-site or off-site), blood culture bottle type (aerobic or anaerobic), blood culture volume adequacy (adequate, underfilled, or overfilled), and bacterial pathogen identification.
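To make the interval definitions concrete, the short Python sketch below computes the study intervals from the 4 recorded time points; the function, its argument names, and the timestamp format are illustrative assumptions for this sketch and are not part of the study protocol.

```python
from datetime import datetime

def culture_intervals(collected, received, loaded, flagged_positive):
    """Compute the study intervals (in hours) from the 4 recorded time points."""
    fmt = "%Y-%m-%d %H:%M"  # assumed timestamp format for this sketch
    t1, t2, t3, t4 = (datetime.strptime(t, fmt)
                      for t in (collected, received, loaded, flagged_positive))
    hours = lambda delta: delta.total_seconds() / 3600.0
    return {
        "collect_to_receipt_h": hours(t2 - t1),            # time point 1 to time point 2
        "lab_processing_h": hours(t3 - t2),                # time point 2 to time point 3
        "total_preanalytic_time_h": hours(t3 - t1),        # time point 1 to time point 3
        "instrument_time_to_detection_h": hours(t4 - t3),  # time point 3 to time point 4
        "overall_processing_ttd_h": hours(t4 - t1),        # time point 1 to time point 4
    }

# Hypothetical example: drawn 08:00, received 08:28, loaded 08:47, positive 22:29
print(culture_intervals("2016-03-01 08:00", "2016-03-01 08:28",
                        "2016-03-01 08:47", "2016-03-01 22:29"))
```

For this hypothetical culture, the sketch would report roughly 0.78 hours of total preanalytic time and 13.7 hours of instrument time-to-detection.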
Participants were also asked about blood culture collection in their institutions through a general practices questionnaire. Topics included the medical staff groups who collect and process blood cultures, the most common pathogens found in blood cultures, training methods for blood culture collection, specimen transit processes, and the locations from which blood cultures are collected.
Statistical Analysis
Statistical analyses were performed to determine which factors are significantly associated with 2 institution-level blood culture processing turnaround time performance indicators: median overall processing time-to-detection and median instrument time-to-detection.
Preliminarily, statistically significant univariate associations between the performance indicators and qualifying variables were identified. The qualifying variables that were significantly associated with instrument time-to-detection when analyzed individually were the percentage of samples collected off-site and the percentage of samples positive for 1 of the 2 most common pathogens identified in the study (Escherichia coli and Staphylococcus aureus). Associations between the performance indicators and the primary objective variables and practice characteristics were then analyzed with general linear models to allow adjustment for these significant qualifying variables. The primary objective variables were the median time from specimen collection to placement on the instrument, the median time from specimen collection to arrival in the laboratory, the median time from specimen arrival in the laboratory to placement on the instrument, and the percentage of blood cultures collected with adequate volume.
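As a rough illustration of this institution-level adjustment, the sketch below fits an analogous linear model in Python with statsmodels; the data frame, its column names, and the values in it are hypothetical placeholders, and the study's own analyses were performed in SAS, as noted at the end of this section.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical institution-level data: one row per institution with a performance
# indicator (median instrument time-to-detection, hours), a primary objective
# variable (median total preanalytic time, minutes), and the 2 qualifying variables.
inst = pd.DataFrame({
    "median_instrument_ttd_h": [13.2, 14.8, 12.9, 15.6, 14.1, 16.0, 13.8, 15.0],
    "median_preanalytic_min":  [35, 62, 41, 95, 50, 120, 44, 70],
    "pct_offsite":             [5, 20, 0, 35, 10, 40, 8, 25],
    "pct_common_pathogen":     [48, 40, 55, 38, 45, 42, 50, 46],
})

# General linear model: performance indicator regressed on the primary objective
# variable while adjusting for the significant qualifying variables.
fit = smf.ols(
    "median_instrument_ttd_h ~ median_preanalytic_min + pct_offsite + pct_common_pathogen",
    data=inst,
).fit()
print(fit.params)
print(fit.pvalues)
```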
Blood culture processing turnaround times were pooled across institutions to assess the objectives on an aggregate level. The outcome turnaround times (overall processing time-to-detection and instrument time-to-detection) were analyzed in aggregate to determine whether they were distributed differently across the blood culture collection characteristic variables. Because the aggregate outcome turnaround time measures were positively skewed, nonparametric Wilcoxon rank sum tests were used for independent variables with discrete groups.
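A minimal sketch of such a rank-based comparison is shown below using SciPy, whose Mann-Whitney U test is the two-sample Wilcoxon rank sum test; the simulated, positively skewed turnaround times are placeholders rather than study data.

```python
import numpy as np
from scipy.stats import mannwhitneyu  # Mann-Whitney U == two-sample Wilcoxon rank sum

rng = np.random.default_rng(0)

# Placeholder aggregate overall processing times (hours); log-normal draws are used
# only to mimic a positively skewed distribution.
onsite_ttd = rng.lognormal(mean=np.log(16), sigma=0.4, size=300)
offsite_ttd = rng.lognormal(mean=np.log(22), sigma=0.4, size=60)

# Nonparametric comparison across the discrete collection-site groups.
stat, p_value = mannwhitneyu(onsite_ttd, offsite_ttd, alternative="two-sided")
print(f"U = {stat:.1f}, P = {p_value:.4g}")
```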
The relationship between total preanalytic blood culture processing time and instrument time-to-detection was assessed on an aggregate level by using a gamma-distributed generalized linear model with a log link function. This methodology appropriately accounts for the skewed outcome variable (time from placement on the instrument to positive blood culture detection) while allowing adjustment for collection site and the most-common-pathogen indicator. A natural log transformation of total preanalytic blood culture processing time was applied to improve the model fit.
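The sketch below illustrates this model form in Python with statsmodels, using simulated culture-level data; the variable names, the simulation, and the coefficients are assumptions for illustration only and do not reproduce the study's SAS analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500

# Simulated culture-level data (purely illustrative, not study data).
df = pd.DataFrame({
    "preanalytic_min": rng.lognormal(np.log(60), 0.6, n),   # total preanalytic time
    "offsite": rng.integers(0, 2, n),                       # collection site indicator
    "common_pathogen": rng.integers(0, 2, n),                # E coli / S aureus indicator
})
mean_ttd = np.exp(2.3 + 0.05 * np.log(df["preanalytic_min"]) + 0.1 * df["offsite"])
df["instrument_ttd_h"] = rng.gamma(shape=4.0, scale=mean_ttd / 4.0)

# Gamma-distributed GLM with a log link; total preanalytic time enters on the
# natural-log scale, adjusting for collection site and the common-pathogen indicator.
fit = smf.glm(
    "instrument_ttd_h ~ np.log(preanalytic_min) + offsite + common_pathogen",
    data=df,
    family=sm.families.Gamma(link=sm.families.links.Log()),
).fit()
print(fit.summary())
```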
A significance level of .05 was used. All analyses were performed with SAS 9.3 (SAS Institute, Cary, North Carolina).
RESULTS
This Q-Probes study was designed to investigate the effects of preanalytic collection, transport, and processing times in relation to instrument time-to-detection for bacterial pathogens in blood cultures. The effect of blood volume adequacy on the instrument time-to-detection was also studied. Data for the study were collected by 36 institutions that reported on 1586 positive blood cultures. Most positive blood cultures, 1482 (93.9%), were from adults, whereas only 96 (6.1%) were from pediatric patients.
For the 36 institutions, the median time from blood culture collection to arrival in the laboratory (ie, collect-to-receipt time) was 28.3 minutes, whereas the time from arrival in the laboratory to placement on the instrument (laboratory processing time) was 15.5 minutes (Table 1). The median total preanalytic time (ie, specimen collection to placement on instrument) was 47.0 minutes. The median time from placement on the instrument to positive signal detection was 13.7 hours. The median of the overall process (ie, time from specimen collection to positive bacterial detection) was 16.4 hours (Table 1). The median of the performance measures was used to estimate performance at the institution level so as to ensure the institutions were compared equally, since the number of individual blood culture turnaround times reported per institution could vary.
Individual blood culture processing times were pooled across all institutions and studied in aggregate. The percentile distributions of these aggregate turnaround times from 1586 blood cultures are shown in Table 2. The median of the total preanalytic time was substantially longer in the aggregate data than the median institutional performance indicators (ie, 74 minutes versus 47 minutes).
The 3 median preanalytic time intervals among participating institutions were tested for association with the median time from placement on the instrument to blood culture positivity, while adjusting for qualifying variables. The results of these institution-level comparisons, which were not found to be statistically significant after adjusting for qualifying variables, are summarized in Table 3.
Table 3. Associations Between Time to Positive Blood Culture Detection (h) and Preanalytic Processing Intervals
Differences in overall processing time-to-detection between positive blood cultures collected on-site and off-site were also studied. Of 1578 positive blood cultures, 1358 (86.1%) were collected at on-site locations and 220 (13.9%) at off-site locations. Overall processing time-to-detection differed significantly by collection location, with a median of 22.7 hours for off-site locations versus 16.1 hours for on-site locations (P < .001), a difference in medians of 6.6 hours (Table 4). Similarly, statistically significant differences were found when the overall processing time-to-detection for on-site and off-site collected blood cultures was stratified by the 3 most common pathogens (Table 5). The distance of the off-site blood culture collection site was reported for 218 specimens; the median distance was 10.0 miles, with 2.8 miles and 32.4 miles representing the 10th and 90th percentiles, respectively.
Table 4. Comparison of Overall Time-to-Positivity for On-Site and Off-Site Blood Culture Collection Locations
Table 5. Aggregate Associations Between Overall Processing Time and Collection Site Within Individual Pathogen-Type Subsets
The blood volume adequacy was determined for 1580 blood cultures. Of these, 1124 blood culture bottles (71.1%) were adequately filled, whereas 456 (28.9%) were inadequately filled (ie, either underfilled or overfilled). The bottle type (ie, aerobic versus anaerobic) was reported for 1571 positive blood cultures, of which 1035 (65.9%) were aerobic and 536 (34.1%) were anaerobic bottles. Of the 1583 blood cultures that were reported to contain a pathogen, E coli (379, 23.9%) was the most frequently isolated bacterium, followed closely by S aureus (346, 21.9%).
The aggregate time from specimen placement on the instrument to positive blood culture detection was examined with respect to several blood culture collection characteristics, after adjusting for qualifying variables; the percentile distributions are shown in Table 6. The median instrument time-to-detection for E coli was 12.0 hours (n = 379), which was significantly less than that for S aureus (16.2 hours; n = 346) and for all other pathogens (15.8 hours; n = 858) (P < .001). Instrument time-to-detection also differed significantly by blood volume adequacy (P = .04; Table 6). The 451 inadequately filled blood culture bottles had a median time of 14.9 hours and a 10th to 90th percentile range of 8.9 to 36.4 hours, compared with a shorter median time of 14.4 hours and a narrower 10th to 90th percentile range of 9.0 to 29.5 hours for the 1119 adequately filled bottles (Table 6). Although far fewer positive anaerobic blood culture bottles were reported than aerobic bottles, the median instrument time-to-detection for the anaerobic bottles was 13.0 hours (n = 536), compared with 15.1 hours (n = 1035) for the aerobic bottles (P < .001).
Table 6. Aggregate Association Between Blood Culture Collection Characteristics and Time From Specimen Placement on Instrument to Positive Blood Culture Detection
There were also significant associations between institution median overall processing time-to-detection and institution demographics. The median overall processing time-to-detection, after adjusting for qualifying variables, was almost 2 hours less for nonteaching versus teaching hospitals (ie, 15.0 hours [n = 16] versus 16.9 hours [n = 16], respectively) (P = .01) and just over 3 hours less for rural versus urban hospitals (ie, 13.8 hours [n = 8] versus 16.9 hours [n = 25], respectively) (P = .03).
For the aggregate results, a significant association was demonstrated between total preanalytic blood culture processing time and the time from placement on the instrument to positive blood culture detection, after adjusting for collection site and the most-common-pathogen indicator (P = .003). The results are summarized in Table 7 by preanalytic blood culture processing turnaround time interval, and the time from placement on the instrument to positive blood culture detection is further stratified by collection site in Figure 1. Both Table 7 and Figure 1 use discrete 60-minute intervals, up to 300 minutes, plus an interval of greater than 300 minutes. Of the 202 blood cultures collected off-site for which incubation information was submitted, 190 (94.1%) were not incubated before delivery to the testing laboratory.
Table 7. Association Between Time-to-Positive Blood Culture Detection and Preanalytic Blood Culture Processing Time (Reported by 60-min Interval Groups)
Figure 1. Instrument time-to-detection (hours) boxplots by total preanalytic time interval (minutes), stratified by collection site. Boxplot lower and upper fences represent the minimum and 90th percentile values.
Institution Demographics and General Laboratory Practices Questionnaire Findings
Responses to the general laboratory practices and demographics questionnaires provide insight into the processes and characteristics of the participating institutions and are summarized here.
Most of the 31 participating institutions that reported demographic information had 300 or fewer occupied beds (20; 64.5%). Seventeen of 33 (51.5%) self-identified as teaching hospitals, and 12 of 34 (35.3%) had a pathology residency training program. Nineteen of 34 (55.9%) identified their location as urban, 7 (20.6%) as suburban, and 8 (23.5%) as rural. Most laboratories (29 of 33, 87.9%) identified as nongovernmental facilities. Most of the participating laboratories had been inspected by the College of American Pathologists (32 of 34, 94.1%); however, only 3 of 34 participants (8.8%) had been inspected by The Joint Commission within the past 2 years.
The number of blood cultures performed at participating institutions during the study period varied widely. The median number of blood cultures collected during the study period out of 35 participating laboratories was 1731 (83 = 10th percentile; 4507 = 90th percentile). The median annual volume of blood cultures reported by 35 facilities in 2015 was 13 079 (3951 = 10th percentile; 43 271 = 90th percentile). The median number of positive blood cultures detected in 2015 for 34 institutions was 1079 (290 = 10th percentile; 4015 = 90th percentile). The annual blood culture positivity rates and contamination rates for 2015 were reported by 26 and 32 participating facilities, respectively. The median annual positivity rate for blood cultures with significant bacterial pathogens was 7.6% (4.3% = 10th percentile; 10.8% = 90th percentile). The median annual blood culture contamination rate was 2.0% (0.9% = 10th percentile; 3.3% = 90th percentile).
Blood culture collection was performed primarily by laboratory resources (phlebotomy team, microbiologist, or technologist staff) in 23 of 35 institutions (65.7%) and by nursing resources (nursing staff or nursing assistants) in 10 institutions (28.6%). Participants were also asked to identify all sources in their facilities that serve as nonprimary blood culture collectors (N = 32, multiple responses allowed). Nursing staff were identified as the most frequent nonprimary collectors of blood cultures (24 of 32 institutions; 75.0%), followed by nursing assistants (5 of 32 institutions; 15.6%). These responses reflect the strong supporting role that nursing staff play as nonprimary blood culture collectors.
Regarding training, almost all facilities (34 of 35, 97.1%) provided written blood culture collection procedures to all personnel who collect blood cultures, and 21 facilities (60.0%) provided demonstrations of blood culture collection by laboratory staff. In 82.9% of the 35 facilities, laboratory staff (phlebotomists or microbiology staff) provided the clinical training. Clinical blood culture collection training for nurses or nursing aides was provided during orientation in 51.4% (18) of the facilities. Although 7 of 32 institutions (21.9%) reported that residents and physicians collect blood cultures, none reported that blood culture specimen collection training was provided during resident/physician clinical orientation.
The number of blood culture sets initially allowed to be ordered varied among institutions: 11 of 35 participants (31.4%) reported that 2 blood culture sets may be initially ordered, 13 (37.1%) reported that up to 3 sets may be ordered, and 11 (31.4%) reported that any number of sets may be ordered.
Participants were also asked to identify locations from which blood cultures were received (N = 27, multiple responses allowed). Most institutions received blood cultures from on-site hospital units (96.3%, 26) and on-site emergency departments (92.6%, 25). Participants reported that off-site blood cultures were received from hospital units (40.7%, 11), emergency departments (40.7%, 11), clinics/urgent care centers (33.3%, 9), and 8 (29.6%) reported receiving blood cultures from nursing homes, cancer centers, oncology clients, off-site laboratory draw sites, outreach sites, provider offices, or reference laboratories.
Although most participants (32 of 34, 94.1%) reported having a sepsis protocol at their hospital, the vast majority of laboratories did not monitor transit time for blood cultures (31 of 33, 93.9%). Similarly, most did not monitor the time from blood culture collection to incubation (28 of 34, 82.4%), the time from blood culture collection to the time the blood culture bottle signaled positive (29 of 35, 82.9%), or the time from blood culture collection to bacterial identification (31 of 35, 88.6%). The types of bacterial pathogens reported (N = 1583) and their frequencies among participants are shown in Figure 2.
DISCUSSION
Patient morbidity and mortality rise with each hour that elapses before the cause of bacteremia or fungemia is determined.9 This study examined important preanalytic variables associated with blood culture practices at the institutional level and in aggregate. These variables included the time from blood culture collection to specimen arrival in the laboratory (ie, collect-to-receipt or transport time), the time from specimen arrival in the laboratory to placement of the blood culture bottles onto the blood culture instrument (ie, accessioning and in-laboratory processing time), the time from specimen placement on the instrument to positive blood culture detection (ie, instrument time-to-detection), the overall blood culture processing time (ie, the time from blood culture collection to pathogen detection), and the adequacy of the blood volumes inoculated into blood culture bottles. Additionally, the accompanying questionnaire provided important insights into blood culture practices, as well as opportunities for improvement.
The Clinical and Laboratory Standards Institute (CLSI) recommends processing blood culture bottles within 2 hours of specimen collection.10 Several authors have reported that delayed entry of blood culture bottles into the specimen incubator lengthens the overall time to pathogen detection or may result in loss of the microorganism.7,8,11 This Q-Probes study aimed to assess whether this effect would be detected among participating laboratories, to determine which portion of the preanalytic phase of blood culture collection might be problematic and whether transport and processing may be responsible for delayed pathogen detection, to evaluate whether blood culture volume adequacy is associated with instrument time-to-detection, and to provide participants with benchmarking data against which to compare their performance.
This study found that most laboratories process blood cultures and place specimens on the instrument within 2 hours of collection (N = 36; median = 47 minutes; 75th percentile = 108 minutes). However, there was wide variation among laboratories for several of the preanalytic time intervals. When data were analyzed at the aggregate level, a noteworthy finding was that 26.6% (360 of 1354) of blood culture specimens collected on-site exceeded the 2-hour goal, and 7.4% (100) of the on-site collections exceeded 300 minutes (5 hours).
Only 2 of 33 participants (6%) responded that they routinely monitor transit times, suggesting that this is a potential quality assurance metric that is not commonly used. Patients may benefit if laboratories examine these important preanalytic variables to determine what barriers exist to the timely delivery and processing of these important specimens. As expected, 89.0% (195 of 219) of the off-site collections exceeded the 2-hour transit time recommendation. The impact of extended transit time, with respect to the loss of bacterial viability or nondetection because of growth that occurred before placement on the instrument, was beyond the scope of this Q-Probes study; however, these questions deserve further study.
When data from all blood cultures were combined for analysis at the aggregate level, a significant association was found between instrument time-to-detection and total preanalytic processing time (Table 7). For on-site collections, the median instrument time-to-detection increased relatively steadily until 240 minutes; thereafter, the time-to-detection decreased (Figure 1). For off-site collections, the median instrument time-to-detection increased after the first 60 minutes, remained relatively constant through 300 minutes, and then substantially decreased (Figure 1). These findings are in contrast to in vitro studies that have shown an inverse relationship between preanalytic time delays and instrument time-to-detection of bacterial pathogens.7 In this study, the shortest instrument time-to-detection was observed after the longest preanalytic times, which was both counterintuitive to our group and in contrast to the previously cited studies. We hypothesize that the bacteria present in the blood culture bottles with extended transit times may have already entered or completed the lag phase of growth and therefore entered the exponential phase of growth more quickly once placed on the instrument, which is consistent with what is known about bacterial growth kinetics.12 Although on cursory review extended processing time may appear beneficial, it is actually a concern for several reasons. Foremost, extended processing time leads to an increased overall time to detection. Additionally, there is a concern that if significant bacterial growth occurs before incubation on the instrument, the metabolic change that the instrument detects to produce a positive signal may have occurred while in transit, and a positive blood culture bottle could be missed by the instrument. Although an association between preanalytic time intervals and instrument time-to-detection was found, the current study was not designed to evaluate whether clinical outcomes were specifically impacted by delays in instrument detection of bacterial pathogens. When data were analyzed at the institutional level, no association was found between the median instrument time-to-detection and any of the following: the median time from specimen collection to placement on the instrument, the median time from specimen collection to arrival in the laboratory, or the median time from specimen arrival in the laboratory to placement on the instrument (Table 3).
Although a prolonged instrument time-to-detection was not associated with prolonged preanalytic transit/processing time, the overall time from collection to detection in such instances increased in proportion to the increased transit time. This is best demonstrated in Tables 4 and 5, which compare on-site and off-site collection locations. On-site collections were associated with a shorter overall blood culture time-to-detection (ie, specimen collection to pathogen detection), with a median of 16.1 hours (n = 1358), compared with a median off-site overall processing time of 22.7 hours (n = 220) (P < .001). This represents 6.6 hours of lost time in reporting a positive blood culture, an important consideration in the era of rapid molecular diagnostic tests that allow antibiotic therapy to be tailored. This is critical, as every hour of treatment delay has been estimated to correlate with a 7.6% decrease in the chance of survival.9 This finding demonstrates that the most important variable in delayed overall time to detection was transit time, supporting the CLSI recommendation to limit the preanalytic processing time of blood cultures to 2 hours.10 Importantly, this variable can be addressed through process improvement interventions to speed the recovery of pathogenic bacteria in patients with bacteremia.
It is well recognized that inadequately filled blood culture bottles affect the sensitivity of recovering bacterial pathogens.10,13 This is because most bacteremia is intermittent and blood cultures represent a sampling of the entire blood volume; therefore, the greater the volume sampled, the more likely the bacteremia will be detected. This Q-Probes study also compared the adequacy of blood volumes with the instrument time-to-detection of bacterial pathogens. Of the 1580 blood cultures with blood volume adequacy reported, 1124 blood culture bottles (71.1%) were adequately filled, 419 (26.5%) were underfilled, and 37 (2.3%) were overfilled. When the instrument time-to-detection for adequate blood volumes was compared with that for inadequate blood volumes (underfilled and overfilled combined), the 1119 adequate blood volumes had a significantly shorter instrument time-to-detection, with a median of 14.4 hours (90th percentile = 29.5 hours), compared with a median of 14.9 hours (90th percentile = 36.4 hours) for the 451 inadequate blood volumes (P = .04; Table 6). Most (30, 88.2%) of the 34 participating laboratories recognized the importance of adequate blood volumes by indicating that they currently monitor this preanalytic variable. The results of the current study further support the importance of adequate blood volumes for the timely detection of bacterial pathogens.
This study collected data on the specific bacterial pathogens recovered from positive blood cultures. Twenty-seven different bacterial pathogens were reported from the 1583 blood cultures studied (Figure 2). Escherichia coli was the most common pathogen, detected in 379 (23.9%) of the positive cultures, followed by Staphylococcus aureus in 346 (21.9%); Klebsiella pneumoniae was the third most common isolate, present in 122 (7.7%) of the positive cultures. Aggregate blood culture instrument time-to-detection was compared for the 2 most common pathogens detected: E coli was associated with the shortest median instrument time-to-detection, at 12.0 hours, whereas S aureus had a median instrument time-to-detection of 16.2 hours. The difference in instrument time-to-detection for these 2 organisms may reflect differences in their growth kinetics, but importantly, the median time-to-detection for these most frequent pathogens was less than 24 hours with standard blood cultures.12
This Q-Probes study supports the importance of the timely processing of blood cultures and inoculating adequate blood volumes into blood culture bottles. Patients may benefit from closer analysis and monitoring of these important preanalytic variables combined with interventions designed to reduce preanalytic transit/processing times and promote the adequate filling of blood culture bottles.
References
Author notes
The authors have no relevant financial interest in the products or companies described in this article.