Application of organic amendments to agricultural land improves soil quality and provides nutrients essential for plant growth; however, these amendments can also serve as a reservoir for zoonotic pathogens, whose presence poses a significant risk to public health. The persistence of bacteria in manure-amended soil, and differences in manure handling practices, are important issues from a food safety perspective. The primary objective of this study was to quantitatively summarize the variation in the rate of decline of Escherichia coli and Salmonella spp. in manure-amended soil under laboratory and field conditions and to assess the impact of environmental factors. Available literature data on the persistence of E. coli and Salmonella spp. in manure-amended soil from 42 primary research studies were extracted and statistically analyzed using a mixed-effect regression model. The results indicated that temperature (soil and air combined) was the most prominent factor affecting persistence of both E. coli and Salmonella spp. under laboratory conditions (P < 0.001) and of E. coli under field conditions (P < 0.05). The time required for a 1-log reduction of E. coli under field conditions was significantly higher at low temperature (0 to 10°C) than at high temperature (greater than 20°C) (P < 0.05). In addition, application method was identified as a significant factor: incorporation of manure into soil prolonged survival approximately 1.2-fold compared with surface application. The significant variation observed among primary research studies of bacterial persistence highlights that mitigation strategies associated with the use of manures in fresh produce production need to be improved by addressing factors such as climate, soil management, application method, and initial microbial levels.
These findings may be used to support guidelines establishing exclusion periods between manure fertilization and the grazing or harvesting of crops, and may be useful for the generation of quantitative microbial risk models for fresh produce.
A data set of 418 inactivation curves from 42 studies was compiled and statistically analyzed.
Temperature was the most significant factor affecting decline rates.
Laboratory trials exhibited longer survival times of E. coli compared with field trials.
No significant difference was observed between median decline rates of E. coli and Salmonella spp.
Organic amendments including raw manure and compost are commonly applied to agricultural land to prevent soil erosion, replenish nutrients within the soil, and maintain soil quality for repeated land usage (14, 34, 39, 40, 53). Application of manure or compost to soil is a common agricultural practice; however, there is no universal agreement regarding manure handling practice in fresh produce production to minimize the risk of foodborne illnesses (Table 1). Because contamination is difficult to eliminate once introduced, avoiding the use of raw manure should be the first course of action. However, in cases where this is not possible, employment of exclusion time periods between raw manure application and produce harvest has been adopted as a prevention management strategy to reduce the risk of microbial contamination of fresh produce (17, 22, 35); but significant variation across these guidelines is observed (Table 1). Although Freshcare guidelines propose that a 90-day exclusion period for high-risk produce groups (i.e., edible parts in direct contact with soil or consumed uncooked) would be sufficient to mitigate biological hazards associated with any risk coming from manure-amended soil, the California Leafy Greens Marketing Association (8) and the Fresh Salad Producers Group–Produce Market Association Australia–New Zealand (20) prohibit usage of untreated manure and require a 1-year exclusion period prior to production of leafy greens if these materials have been applied to a field. The proposed exclusion period may vary not only among regions but also among produce groups termed low or high risk (45 to 120 days) (Table 1). Considering that management of microbial risk hazards in fresh produce begins with compliance with guidelines, these variations create significant challenges, especially for growers.
Therefore, there is an urgent need for feasible, science-based, data-driven exclusion periods to be set by food safety certification bodies in fresh produce production.
Accurate predictions that account for survival of foodborne illness bacteria in manure-amended soils are crucial in the development of risk-based guidelines (3, 26, 43). Organic amendments are a potential reservoir for zoonotic pathogens (29); and, if they are not handled appropriately, they pose a significant risk to public health. Whereas there are many potential routes for pathogen contamination of fresh produce that may lead to foodborne illnesses, contamination through the application of untreated manure or compost to soil is one of the most important during primary production (54). Enteric pathogens, such as Salmonella spp. and Escherichia coli O157:H7, are leading causes of foodborne illness outbreaks from consumption of contaminated fresh produce (11, 35), and they have been shown to be present in manure (3), soil (21), and manure-amended soil (10). For example, aged cattle manure and raw manure were identified as the possible sources of contamination in foodborne outbreaks of E. coli O157:H7 associated with consumption of leaf lettuce (1) and mesclun lettuce (25), respectively.
Variability in persistence of bacteria in manure-amended soil and differences in manure handling practices are important issues from a public health point of view. Whereas predictions of pathogen persistence in manure-amended soil under specific conditions have been independently investigated in primary research, many of these results are complicated by the interactions of multiple factors in natural environments (18). An alternative approach to assess pathogen persistence in situ may be achieved by combining studies utilizing the full set of biotic factors (genus, serovar, and soil microbial communities) and abiotic factors (temperature, moisture, soil types, manure types, manure state, and soil management) through a meta-analysis.
Although the effects of different variables on survival of E. coli and Salmonella spp. in manure-amended soils have been the subject of a number of primary studies (10, 45, 46, 53), there is limited information about interactions among the various environmental factors affecting the persistence of E. coli (17). In the current literature, although some studies have statistically compared the impacts of multiple environmental factors on pathogen persistence in manure-amended soil, to the authors' knowledge no such meta-analysis has been conducted for both E. coli and Salmonella spp. Thus, the objectives of this meta-analysis were to (i) quantitatively summarize the persistence of E. coli and Salmonella spp. in manure-amended soils, (ii) identify the environmental drivers that influenced persistence in manure-amended soils, and (iii) compare the decline rates between E. coli and Salmonella spp. in manure-amended soil under both laboratory and field conditions. Data generated from this study will contribute to enhanced understanding of experimental and ecological factors in pathogen persistence. This information may aid in the development of practical management strategies to reduce the likelihood of microbial contamination on crops from manure-amended soil. In addition, the results of the meta-analysis could be used in the development of probabilistic quantitative microbial risk assessments for preharvest contamination in manure-amended soils.
MATERIALS AND METHODS
To establish a list of research studies reporting decline rates for E. coli and Salmonella in manure-amended soil, a literature search was performed using the scientific database Web of Science (Fig. 1).
The search was restricted to studies published in English only. The first abstract-based screening identified primary research that reported on the persistence of E. coli and Salmonella spp. in manure-amended soil. Reviews and meta-analyses were excluded to avoid data replication. The matrices of focus were soils amended with raw manures or composts only. Data from studies in manure, cattle, water, and produce were excluded. The effects of ambient temperature, application method, manure type, manure state, soil type, moisture, pathogen serovar, and sampling depth within soil layers on pathogen decline rates were included. Studies focusing on the effects of alum, stress response systems, antibiotic resistance genes, simulated treatments, or contaminated irrigation water were also excluded. In the second screening, which reviewed whole documents, studies that reported qualitative data such as presence or absence of pathogens were excluded. Studies that met the selection criteria were subjected to data extraction and statistical analysis.
To compare data among multiple experiments, D-values (the numbers of days required for 1-log reduction in bacterial numbers per g of amended soil) (29) were taken directly from author calculations provided in primary papers or were calculated from the first-order decline of the inactivation curves based on a linear approximation.
Comparison among studies was complicated by significant variation among bacterial inactivation curves. A simplified linear approximation based on a log-linear decline, excluding tailing effects and consecutive increasing data points, was selected as the most appropriate method to derive decline rates for subsequent analysis. Previous comparisons among inactivation models for quantitative risk assessment recommended that the order of magnitude of inactivation can be derived efficiently by using a simple model that focuses first on the main determinants of risk (50). Disregard of shoulders and tails has been shown to maintain the same qualitative conclusion on the relevance of inactivation for risk, whereas the tailing-off phenomenon can dramatically increase the decimal reduction time (D-value) because of prolonged survival times (9, 50).
D-values were calculated as D = t / (log10 Nmax − log10 Nt), where D is the decimal reduction time (days), t is the exposure time (days), Nmax is the highest initial microbial concentration (CFU g−1), and Nt is the microbial concentration corresponding to a 2-log or 3-log reduction (CFU g−1). In most studies, Nmax was taken as the first measurement. However, in some decline curves the bacterial count initially increased, so the highest reported count was taken as Nmax. Consecutive increasing data points were excluded from analysis.
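As a minimal illustration (not the authors' code, which is not published), the log-linear D-value estimation described above can be sketched in Python by fitting an ordinary least-squares slope to log10 counts over time; the function name and interface here are hypothetical:

```python
def d_value(times, log_counts):
    """Estimate the decimal reduction time (D-value, days) by fitting a
    first-order (log-linear) decline: log10 N(t) = log10 Nmax - t / D.
    `times` are sampling days; `log_counts` are log10 CFU per g of soil."""
    n = len(times)
    mean_t = sum(times) / n
    mean_y = sum(log_counts) / n
    # Ordinary least-squares slope of log10 counts versus time
    slope = (sum((t - mean_t) * (y - mean_y) for t, y in zip(times, log_counts))
             / sum((t - mean_t) ** 2 for t in times))
    if slope >= 0:
        raise ValueError("counts do not decline; D-value undefined")
    return -1.0 / slope  # days required for a 1-log reduction

# Example: a 3-log reduction over 30 days gives D = 10 days
print(d_value([0, 10, 20, 30], [6.0, 5.0, 4.0, 3.0]))  # -> 10.0
```

For curves with only two usable points, this reduces to the direct formula D = t / (log10 Nmax − log10 Nt) given above.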
Temperature values extraction
Continuous temperature values over time were categorized into three levels as follows: “low” for less than 10°C, “medium” for 10 to 20°C, and “high” for higher than 20°C. Daily average temperatures (°C) were calculated from the minimum nighttime temperature and maximum daytime temperature within the periods of the field trials when the 2- to 3-log reduction occurred.
Moisture values extraction
Numerical values of moisture conditions were also categorized into two qualitative categories. Moisture conditions with soil moisture content less than 20%, or rainfall less than 10 mm, or relative humidity (RH) less than 30% were categorized as “low moisture”; moisture conditions with soil moisture content higher than 20%, or rainfall higher than 10 mm, or RH higher than 30% were categorized as “high moisture” (38, 39). Average rainfall (mm) was calculated by dividing the total rainfall by the number of days within the period when the reduction happened. Average RH (%) values were calculated based on provided minimum and maximum values. Level of soil moisture content (%), rainfall (mm), and relative humidity (%) were embedded into a single moisture condition factor.
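The categorization rules above can be summarized in a short sketch (Python used purely for illustration; the function names and keyword arguments are my own, and the thresholds are those stated in the text):

```python
def temperature_level(avg_temp_c):
    """Categorize a daily average temperature (deg C):
    low < 10, medium 10 to 20, high > 20."""
    if avg_temp_c < 10:
        return "low"
    return "medium" if avg_temp_c <= 20 else "high"

def moisture_level(soil_moisture_pct=None, rainfall_mm=None, rh_pct=None):
    """Collapse whichever moisture measure a study reported into the single
    binary moisture factor (thresholds: 20% soil moisture content,
    10 mm rainfall, 30% relative humidity)."""
    if soil_moisture_pct is not None:
        return "high moisture" if soil_moisture_pct > 20 else "low moisture"
    if rainfall_mm is not None:
        return "high moisture" if rainfall_mm > 10 else "low moisture"
    if rh_pct is not None:
        return "high moisture" if rh_pct > 30 else "low moisture"
    raise ValueError("no moisture measure reported")

print(temperature_level(15))          # -> medium
print(moisture_level(rainfall_mm=4))  # -> low moisture
```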
A linear mixed-effect regression model in the statistical package lme4 (4) in R was used to determine the most significant factors affecting the D-values of enteric bacteria in manure-amended soil under laboratory and field conditions. Because of the unbalanced structure of the data set (many more data on laboratory than on field conditions, and many more data on medium temperature conditions than on low and high temperature conditions), the analysis was performed with a mixed-effect model, a hierarchical model incorporating both random-effect and fixed-effect variables (33). Random-effect variables were included in the model to account for the heterogeneity of studies through a statistical parameter representing the interstudy variation (33). The D-values were log-transformed to satisfy the normality assumptions of the statistical analysis and were treated as the dependent variable in the mixed-effect linear regression model. All other fixed- and random-effect independent variables included in the analysis are described in Table 2.
The best regression models, which optimally balanced model fit and model complexity, were selected using stepwise model selection by Akaike's information criterion. Normality and constant variance of the errors were validated by visual inspection of plots of residuals (no obvious trend) and residuals versus fitted values (residuals scattered randomly around zero). To evaluate the statistically significant differences of fixed effects in the models, the P values for each effect were generated by the analysis of variance (ANOVA) function in the car (55) package using a Wald chi-square test. Tukey's post hoc test was used to confirm the differences among categorical groups of each fixed-effect factor.
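The authors fitted mixed-effect models in R's lme4 and selected fixed effects by stepwise AIC. As a simplified, fixed-effects-only illustration of the AIC criterion (synthetic data, not the study's data set; a full random-intercept model per study is omitted for brevity), the following stdlib Python sketch compares an intercept-only model against one that includes a temperature term:

```python
import math

def ols_rss(x, y, with_slope=True):
    """Residual sum of squares for y ~ 1 (+ x), fitted by least squares."""
    n = len(y)
    mean_y = sum(y) / n
    if not with_slope:
        return sum((yi - mean_y) ** 2 for yi in y)
    mean_x = sum(x) / n
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = mean_y - b * mean_x
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

def aic(rss, n, k):
    """Gaussian AIC up to a constant: n*ln(RSS/n) + 2k, k fitted parameters."""
    return n * math.log(rss / n) + 2 * k

# Hypothetical log10 D-values declining as temperature rises
temp = [5, 5, 15, 15, 25, 25]
logD = [1.30, 1.25, 1.00, 0.95, 0.70, 0.72]
n = len(logD)
aic_null = aic(ols_rss(temp, logD, with_slope=False), n, k=2)  # intercept + sigma
aic_temp = aic(ols_rss(temp, logD, with_slope=True), n, k=3)   # + slope
print(aic_temp < aic_null)  # -> True: temperature term is retained
```

Stepwise selection simply repeats this comparison over candidate fixed effects, keeping the model with the lowest AIC at each step.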
The Wilcoxon rank sum test was used to compare the decline rates of E. coli and Salmonella spp. between laboratory- and field-based studies. Estimation of the relative population decline (log CFU g−1) after exclusion periods (days) was based on the median of the D-values obtained directly from primary research papers.
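The rank sum comparison can be sketched in stdlib Python using the normal approximation (a simplification of published implementations, which also apply tie-variance and continuity corrections; the function name is my own):

```python
import math

def rank_sum_z(x, y):
    """Normal approximation to the Wilcoxon rank sum (Mann-Whitney) test;
    returns the z statistic for sample x versus sample y."""
    pooled = sorted((v, i) for i, v in enumerate(x + y))
    ranks = [0.0] * len(pooled)
    i = 0
    while i < len(pooled):
        j = i
        while j + 1 < len(pooled) and pooled[j + 1][0] == pooled[i][0]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank for tied values (1-based)
        for k in range(i, j + 1):
            ranks[pooled[k][1]] = avg
        i = j + 1
    n1, n2 = len(x), len(y)
    w = sum(ranks[:n1])  # rank sum of the first sample
    mean_w = n1 * (n1 + n2 + 1) / 2
    var_w = n1 * n2 * (n1 + n2 + 1) / 12
    return (w - mean_w) / math.sqrt(var_w)

# Interleaved samples -> small |z|, i.e. no evidence of different medians
print(abs(rank_sum_z([1, 3, 5], [2, 4, 6])) < 1.96)  # -> True
```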
The primary search identified 87 papers. After the first abstract-based screening, 38 papers were excluded from further analysis because (i) these papers were reviews, meta-analyses, and replicated papers (n = 13), (ii) the focus was on specific effects of alum, stress response systems, antibiotic resistance genes, and contaminated irrigation water (n = 13), (iii) the focus was on pathogen persistence in manure, animals, drainage water or leachate, and produce (n = 6), or (iv) artificial treatments were used to enhance reduction (n = 2). Whole-manuscript screening excluded papers that (v) did not focus on Salmonella and E. coli (n = 7) and (vi) only provided presence or absence or molecular data (n = 5). The final data set of appropriate papers contained 42 studies (Supplemental Table S1), including 8 laboratory and 9 field-based studies of Salmonella spp., and 17 laboratory and 22 field-based studies of E. coli (Tables S2 through S5).
Analysis of deviance outputs
Based on stepwise model selection by Akaike's information criterion, the best regression models, which achieved an optimal balance between model fit and model complexity, retained only temperature conditions and application method as fixed-effect variables with significant impacts on the decimal reduction time of pathogens. Table 3 presents the P values of each factor resulting from the ANOVA tests.
Influence of low, medium, and high temperatures on pathogen behavior in soils
The influence of temperature under laboratory and field conditions was determined by analyzing the effect of temperature conditions (low, medium, high) on derived D-values.
Temperature was the most crucial factor in the fate of E. coli and Salmonella spp. under controlled temperature conditions (i.e., laboratory or greenhouse studies) (P < 0.001) (Table 3).
Figures 2 and 3 show the time (log days) required for a 1-log reduction of bacteria. Both E. coli and Salmonella spp. declined significantly (P < 0.001) more slowly at low temperatures, requiring approximately 1.4 times longer for a 1-log reduction than at high temperatures. Decline was also somewhat slower at low than at medium temperatures, and at medium than at high temperatures, although these pairwise differences were not significant (P > 0.05) (Figs. 2 and 3).
Among all the investigated factors, temperature had the most significant effect on the decimal reduction time of E. coli under field conditions (P < 0.001) (Table 3). The ordering agreed with that observed in laboratory studies: the time required for a 1-log inactivation decreased from low to medium to high temperature (P < 0.05). The time required for a 1-log reduction of E. coli under field conditions was significantly higher at low temperature than at high temperature (P < 0.05) (Fig. 4).
Influence of application methods on D-values
The influence of application methods on D-values under laboratory and field conditions was assessed by modeling the effect of surface-spread and incorporation methods.
The regression analysis indicated that application method was an important factor influencing E. coli survival in laboratory-based experiments (P < 0.05) (Table 3). The time (days) required for a 1-log reduction of the microorganism was 1.2 times higher when the matrix was incorporated into the soil (P < 0.05) (Fig. 3).
Application method was also a prominent factor (P < 0.05) influencing the survival of Salmonella spp. in field trials. The direction of the effect matched that for E. coli in laboratory studies: incorporation significantly prolonged the time required for a 1-log inactivation by 1.2 times compared with surface application in the field (P < 0.05) (Fig. 5).
Influence of other factors on D-values
Manure types, manure states, soil types, and moisture were not shown to have significant effects on D-values (P > 0.05).
Comparison effect of laboratory trials versus field trials
Table 4 presents the median and confidence interval of the decimal reduction time (days) of the pathogens at medium temperature (10 to 20°C), the most commonly observed condition in field studies. Persistence of both E. coli and Salmonella spp. was relatively prolonged in laboratory-based studies. Results from a Wilcoxon rank sum test comparing the persistence of the pathogens between the two experimental conditions showed that the difference in decimal reduction time between laboratory and field trials was significant for E. coli (P < 0.05) but not for Salmonella spp. (P > 0.05) (Table 4).
Decline rates of E. coli versus Salmonella spp
A Wilcoxon rank sum test comparing survival days between 33 observations of Salmonella and 52 observations of E. coli at medium temperature under field conditions suggested similar linear decline rates for the two pathogens. The same result was obtained from a comparison of 23 observations of Salmonella and 62 observations of E. coli at medium temperature under laboratory conditions. Although the median persistence of E. coli was consistently longer than that of Salmonella spp. under field conditions, the difference was not significant (P > 0.05) (Table 4).
Table 5 presents the relative population decline (log CFU per gram) after the exclusion periods (days) suggested in typical guidelines on the application of organic amendments to fruit and vegetable crop fields (Table 1). Based on the median inactivation rates at medium temperature under field conditions (5.67 days for Salmonella spp. and 7.62 days for E. coli; Table 4), after the 90- and 120-day exclusion periods suggested by typical guidelines for high-risk produce, the average populations of Salmonella spp. and E. coli might decline by 15.87 to 21.16 and 11.81 to 15.75 log CFU/g, respectively (Table 5). The 95% upper prediction limits of the D-values at medium temperature under field conditions can be used as a conservative estimate of pathogen decline. Because the times required for a 1-log reduction at these limits were 14.67 and 25.33 days for E. coli and Salmonella spp., respectively (Table 4), after 120 days of exclusion the populations of E. coli and Salmonella spp. might decline by 8.18 and 4.74 log CFU/g, respectively (Table 5).
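Under the log-linear assumption, the expected reduction over an exclusion period is simply the period divided by the D-value. A short sketch reproducing the figures quoted from Tables 4 and 5:

```python
def log_decline(exclusion_days, d_value_days):
    """Expected population decline (log10 CFU/g) over an exclusion period,
    assuming log-linear die-off with the given D-value."""
    return exclusion_days / d_value_days

# Median field D-values at medium temperature (Table 4)
print(round(log_decline(90, 5.67), 2))    # Salmonella spp., 90 days -> 15.87
print(round(log_decline(120, 7.62), 2))   # E. coli, 120 days        -> 15.75
# Conservative 95% upper-prediction D-values (Table 4)
print(round(log_decline(120, 14.67), 2))  # E. coli                  -> 8.18
print(round(log_decline(120, 25.33), 2))  # Salmonella spp.          -> 4.74
```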
Results from this review revealed that temperature was the primary predictor of the inactivation rate of pathogens in manure-amended soil. Both E. coli and Salmonella spp. were shown to persist significantly longer at lower temperatures. This quantitatively confirms the findings from many individual studies examining the fate of E. coli in manure-amended soil (10, 13, 26, 42, 53). For example, longer survival times of both E. coli and Salmonella have been shown in spring-applied biosolids compared with those applied in summer (15). Whereas a previous study reported a weak correlation between decline rates of E. coli and temperature (17), our results showed a significant effect of temperature (P < 0.001) and suggested temperature as the primary predictor of pathogen decline in manure-amended soil. This is consistent with the work of McQuestin et al. (32), who identified the dominance of nonlethal temperature as an explanatory variable for the inactivation of E. coli in non-growth-permissive environments. A variety of mechanisms have been identified that promote the survival of bacteria in non-growth-permissive environments at lower temperatures (24, 51, 52). A challenge, however, remains in assessing other environmental factors that may interact with temperature effects, as highlighted by observations of enhanced persistence of E. coli at higher temperatures (30) or limited differences between seasons (28). In addition to temperature, the method of manure application was a significant factor in the survival of foodborne pathogens. A positive correlation was found between incorporation (as compared with surface application) and persistence of E. coli under laboratory conditions, and of Salmonella spp. under field conditions. Pathogens may persist longer in manure incorporated or buried deeper in the soil than in manure spread on the soil surface (6, 28, 44, 46). Hruby et al. (2018) observed that tillage favored persistence of manure-derived Salmonella (27).
Greater exposure of bacteria to sunlight (UV), temperature or moisture fluctuations, and atmospheric desiccation following surface application of animal waste may give rise to faster declines of bacterial populations (22, 29, 53). A significant limitation of this study is the assumption that the response of enteric pathogens in soils is a log-linear decline. This assumption is very rarely met in field studies. Thus, factors that may drive the prolonged persistence exhibited by shouldering and/or tailing effects could not be detected, which may explain why factors commonly associated with prolonged persistence, such as moisture and soil type, were not identified as significant.
Similar decline rates for Salmonella spp. and E. coli were found in this study. Results from the Wilcoxon test comparing the median die-off rates adopted from 23 to 62 inactivation curves between Salmonella and E. coli, respectively, were not significantly different. Whereas some studies have found that Salmonella may exhibit extended environmental survival compared with E. coli (43), our findings are consistent with contemporary studies by Chinivasagam (2008) and Ellis et al. (2018), which demonstrated that E. coli and Salmonella declined at similar rates in chicken broiler litter and biosolids-amended soil, respectively (2, 16). Assuming that pathogen decline is linear, the similarity in linear decline rates of these pathogens under selected field conditions suggests the suitability of using E. coli decline rate as a surrogate for the decline rate of Salmonella spp. and potentially other pathogens in manure-amended soil.
The initial levels of microorganisms present in manure can vary greatly. Previous studies have suggested that feedlot cattle manure can carry up to 6.4 log CFU/g of pathogenic E. coli (31) and that broiler litter can contain up to 5.0 log CFU/g of Salmonella spp. (12). Based on the median inactivation rates at medium temperature under field conditions, the predicted declines of 15.87 to 21.16 log CFU/g for Salmonella spp. and 11.81 to 15.75 log CFU/g for E. coli over the 90- and 120-day exclusion periods suggest that the average (median) response of enteric pathogens would correspond to die-off within the exclusion periods suggested by typical guidelines for high-risk produce. The initial concentration of bacteria at the time of application is therefore an essential consideration alongside the recommended exclusion periods for managing microbial contamination risks associated with the application of raw manure. According to Wang et al. (2018), the regrowth and recovery of pathogens to a culturable state long after introduction into manure-amended soil suggest that dormant cells within the initial population could be an important factor in addition to environmental factors (53). There is a lack of research considering whether observed increases in E. coli and Salmonella spp. populations were due to growth or to increased culturability of dormant cells (7). Thus, the impact of the processes prior to soil addition, such as the composting method and the level and physiological state of the pathogen, needs to be further investigated. Future research is required to establish a better understanding of the microbial physiology and genetic mechanisms by which pathogens survive stresses, which will enhance our control efforts to minimize the risk of pathogen contamination in fresh produce.
Longer survival of E. coli and Salmonella spp. in laboratory-based trials compared with field-based trials was also observed in several primary studies (10, 29). Laboratory trials usually suggested lower die-off rates of enteric pathogens than did field trials and, therefore, may overestimate environmental persistence (36). A laboratory-based study evaluating the influence of temperature fluctuations on the persistence of E. coli O157:H7 and Salmonella Typhimurium in cow manure observed inconsistent results in the relative populations of these pathogens between static and oscillating temperatures (43). Results from this review, together with other studies (10, 29), illustrate the complicated persistence pattern of E. coli and Salmonella spp. under field conditions. This highlights that studies of pathogen persistence under temperature-controlled conditions fail to mimic persistence in the real world, where pathogens are subjected to multiple environmental factors, such as temperature fluctuations, UV radiation from sunlight, competition with other organisms, and the drying effects of moving air. Future research is, therefore, recommended to improve laboratory-based experimental design, such as by incorporating fluctuating moisture and temperature conditions and using nonsterile soil to better simulate natural environments.
The main challenge in the present analysis is the weighting of the studies included in the review. The majority of publications on the persistence of E. coli and Salmonella spp. in manure-amended soil are concentrated at medium temperature (10 to 20°C), which is representative of the agricultural environment in temperate-zone countries. The number of laboratory-based studies was higher than that of field-based studies for both pathogens. Another challenge for this meta-analysis is the inconsistency in the reported temperature and moisture data, which reflects the variability of the environments considered and the experimental methodologies used. Specifically, in field-based studies, moisture conditions were reported as soil moisture content (%) and/or rainfall (mm) and/or RH (%). This inconsistency required the categorization of continuous variables before statistical analysis. Although a binary split into “high moisture” and “low moisture” greatly simplifies the statistical analysis and facilitates interpretation and presentation of results, it reduces the statistical power to detect a relationship between moisture conditions and D-values. An additional selection bias was that only published papers written in English were considered. Unpublished studies, such as reports and theses, and those not written in English were automatically excluded.
The initial decline of bacteria was integrated into this review, whereas both tailing effects and consecutive increasing data points within the survival curves were excluded. The tailing effect reflects a slower decline rate of a subpopulation of more resistant bacteria, which is not representative of the behavior of the majority of the population (5). Consecutive fluctuations within the decline curves under field conditions could not be linked to a specific factor that influenced regrowth or prolonged persistence of bacteria (53). The main practical problem with inactivation models describing shoulders and tailing effects is the lack of parameter values for cases other than those reported, so they cannot be used for general predictive purposes (50). Therefore, both tailing and regrowth effects were neglected, and log-linear inactivation population models were used to balance the requirements of predictive accuracy and manageable model complexity. Mechanistic models, including individual-based models, have been widely used in ecological applications; however, these models are not commonly applied in predictive microbiology. The development of simplified mechanistic models based on the response of individual cells may present an opportunity to satisfy the trade-off between describing population dynamics and increasing the applicability of the model in predictive microbiology (5).
This review summarizes and quantifies the decline rates of E. coli and Salmonella spp. under common laboratory and field conditions. The statistical analyses indicate that variability in pathogen decline in manure-amended soil could be significantly driven by variations in temperature conditions. In addition to using the summarized decline rates to calculate the exposure value in quantitative microbial risk assessments, temperature conditions could form a realistic risk mitigation strategy within the model. This also indicates that seasonality and spatial temperature variation may be incorporated into predictions regarding relevant exclusion periods. Future laboratory or field studies need to control or measure fluctuations in temperature and moisture conditions to provide more accurate models for predicting pathogen decline rates. Given the significant variation observed among individual field studies, it is unlikely that the risks associated with the use of manure amendments containing high levels of enteric bacterial pathogens (such as raw manure) can be managed solely by a uniform exclusion period. Management of the risks associated with the use of soils amended with raw manures is best achieved through risk-based approaches incorporating differences in climate, soil management, and initial bacterial levels at application. Finally, the significant implications of dormant cells within the initial population for the predictability of pathogen decline rates remain open to speculation.
We acknowledge Richard Bennett from Fresh Produce Safety Centre Australia and New Zealand for his support and guidance. This research was conducted within the Australian Research Council Training Centre for Food Safety in the Fresh Produce Industry (grant no. IC160100025) funded by the Australian Research Council, industry partners from Australia and New Zealand, and the University of Sydney.
Supplemental material associated with this article can be found online at: https://doi.org/10.4315/0362-028X.JFP-19-460.s1