L. D. Walters
University of Wolverhampton
Publications
Featured research published by L. D. Walters.
Letters in Applied Microbiology | 2004
M Hutchison; L. D. Walters; Sheryl M Avery; B.A. Synge; A. Moore
Aims: To determine the prevalence and levels of zoonotic agents in livestock wastes.
Applied and Environmental Microbiology | 2005
M. L. Hutchison; L. D. Walters; Sheryl M Avery; F. Munro; A. Moore
ABSTRACT Survey results describing the levels and prevalences of zoonotic agents in 1,549 livestock waste samples were analyzed for significant associations with livestock husbandry and farm waste management practices. Statistical analyses of the survey data showed that livestock groups containing calves <3 months of age, piglets, or lambs had higher prevalences and levels of Campylobacter spp. and Escherichia coli O157 in their wastes. Younger calves that were still receiving milk, however, had significantly lower levels and prevalences of E. coli O157. Furthermore, wastes that contained any form of bedding had lower prevalences and levels of both pathogenic Listeria spp. and Campylobacter spp. Livestock wastes generated by stock consuming a diet composed principally of grass were less likely to harbor E. coli O157 or Salmonella spp. Stocking density did not appear to influence either the levels or the prevalences of bacterial pathogens. Significant seasonal differences in prevalences were detected in cattle wastes: Listeria spp. were more likely to be isolated from March to June, and E. coli O157 was more likely to be found in May and June. Factors such as livestock diet and age also had a significant influence on the levels and prevalences of some zoonotic agents in livestock wastes. A number of the correlations identified could be used as the basis of a best-practice disposal document for farmers, thereby lowering the microbiological risks associated with applying manures from contaminated livestock to land.
Applied and Environmental Microbiology | 2004
M. L. Hutchison; L. D. Walters; A. Moore; K. M. Crookes; Sheryl M Avery
ABSTRACT In response to reports that contamination of food can occur during the on-farm primary phase of food production, we report data describing a possible cost-effective intervention measure. The effect of the time elapsed before soil incorporation of livestock wastes spread to land on the rate of decline of zoonotic agents present in the waste was investigated. Fresh livestock wastes were inoculated with laboratory-cultured Salmonella, Listeria, and Campylobacter spp. and Escherichia coli O157 before they were spread onto soil. Incorporation of the spread wastes was either immediate, delayed for 1 week, or omitted altogether. Bacterial decline was monitored over time and was found to be significantly more rapid for all waste types when the wastes were left on the soil surface. There were no significant differences in initial bacterial decline rates when wastes were spread in summer or winter. Our results indicate that not incorporating contaminated livestock wastes into soil is a potential intervention measure that may help limit the spread of zoonotic agents further up the food chain. The implications of these findings are discussed in relation to current advice for livestock waste disposal.
Applied and Environmental Microbiology | 2005
Mike L. Hutchison; L. D. Walters; Tony Moore; D. John I. Thomas; Sheryl M Avery
ABSTRACT Fecal wastes from a variety of farmed livestock were inoculated with livestock isolates of Escherichia coli O157, Listeria monocytogenes, Salmonella, Campylobacter jejuni, and Cryptosporidium parvum oocysts at levels representative of the levels found in naturally contaminated wastes. The wastes were subsequently spread onto a grass pasture, and the decline of each of the zoonotic agents was monitored over time. There were no significant differences among the decimal reduction times for the bacterial pathogens. The mean bacterial decimal reduction time was 1.94 days. A range of times between 8 and 31 days for a 1-log reduction in C. parvum levels was obtained, demonstrating that the protozoans were significantly more hardy than the bacteria. Oocyst recovery was more efficient from wastes with lower dry matter contents. The levels of most of the zoonotic agents had declined to below detectable levels by 64 days. However, for some waste types, 128 days was required for the complete decline of L. monocytogenes levels. We were unable to find significant differences between the rates of pathogen decline in liquid (slurry) and solid (farmyard manure) wastes, although concerns have been raised that increased slurry generation as a consequence of more intensive farming practices could lead to increased survival of zoonotic agents in the environment.
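The decimal reduction times reported above translate directly into decline timescales: with a D value of 1.94 days, each 1-log (90%) drop takes 1.94 days, so a 5-log decline takes roughly 9.7 days. A minimal sketch of that arithmetic (function names and example levels are ours, assuming simple log-linear first-order decline, not the study's fitted models):

```python
def time_for_log_reduction(d_value_days, log_reduction):
    """Days needed for a given log10 reduction, assuming log-linear decline.

    d_value_days: decimal reduction time (days per 1-log drop).
    """
    return d_value_days * log_reduction

def surviving_level(initial_log10, d_value_days, t_days):
    """log10 level remaining after t_days of log-linear decline."""
    return initial_log10 - t_days / d_value_days

# With the mean bacterial D value of 1.94 days, a 5-log decline takes ~9.7 days;
# at the slow end for C. parvum (D = 31 days), the same decline takes 155 days.
bacterial_5log = time_for_log_reduction(1.94, 5)
oocyst_5log = time_for_log_reduction(31, 5)
```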
Journal of Applied Microbiology | 2005
M Hutchison; L. D. Walters; A. Moore; Sheryl M Avery
Aim: To measure the decline rates of zoonotic agents introduced into liquid livestock wastes in on‐farm storage tanks.
Journal of Applied Microbiology | 2005
M Hutchison; L. D. Walters; Sheryl M Avery; A Moore
Aims: To measure the rates of decline of zoonotic agents introduced into heaps of spent bedding and faecal wastes generated by commercially farmed livestock and managed in a way similar to that on a working farm.
Journal of Food Protection | 2005
Michael L. Hutchison; L. D. Walters; Sheryl M. Avery; Carol-Ann Reid; Douglas Wilson; Mary Howell; Alexander M. Johnston; Sava Buncic
A comparison of wet-dry swabbing and surface tissue excision of carcasses by coring was undertaken. Samples from 1,352 bovine, 188 ovine, and 176 porcine carcasses were collected during 70 separate visits to commercial slaughterhouses operating under normal conditions. The mean total aerobic viable bacterial count (TVC) for all species sampled by excision was 5.36 log units, which was significantly greater than the 4.35 log units measured for swabbing. Poorly correlated linear relationships between swab- and excision-derived bacterial numbers from near-adjacent carcasses were observed for all three animal species. R² values for least squares regressions for bovine, ovine, and porcine carcasses were 0.09, 0.27, and 0.21, respectively. The reasons why it was not possible to calculate a factor that allowed the interconversion of bacterial numbers between samples collected by each sampling method were investigated. Uncertainty associated with laboratory analyses was a contributing factor because the geometric relative standard deviations measured for TVCs were 0.174 and 0.414 for excision and swabbing, respectively. Uneven distribution of bacteria at identical sampling sites on near-adjacent carcasses on processing lines was also a contributory factor. The implications of these findings for process control verification were investigated by intensive sampling for 13 weeks in three commercial slaughterhouses. As many as 4 log units of difference in TVCs were observed in duplicate samples collected within a narrow timeframe from near-adjacent carcasses on the processing line. We conclude that it might not be appropriate to institute corrective actions in slaughterhouses on the basis of a single week's test results.
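The R² values quoted above come from ordinary least-squares fits of swab-derived against excision-derived log counts. A self-contained sketch of that computation (the paired counts below are made-up illustrative values, not data from the study):

```python
# Hypothetical paired log10 TVCs from near-adjacent carcasses (illustrative only)
excision = [5.1, 5.4, 4.9, 5.8, 5.2, 5.6]
swab = [4.2, 4.8, 4.0, 4.6, 4.5, 4.1]

n = len(excision)
mean_x = sum(excision) / n
mean_y = sum(swab) / n

# Ordinary least-squares fit: swab ~ slope * excision + intercept
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(excision, swab))
sxx = sum((x - mean_x) ** 2 for x in excision)
slope = sxy / sxx
intercept = mean_y - slope * mean_x

# R^2: fraction of swab-count variance explained by the excision counts;
# values near 0 (as in the study) mean the two methods barely track each other
ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(excision, swab))
ss_tot = sum((y - mean_y) ** 2 for y in swab)
r_squared = 1.0 - ss_res / ss_tot
```

A low R² here is precisely why no simple conversion factor between the two sampling methods could be derived.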
Journal of Food Protection | 2005
Richard Pepperell; Carol-Ann Reid; Silvia Nicolau Solano; Michael L. Hutchison; L. D. Walters; Alexander M. Johnston; Sava Buncic
Bovine sides, ovine carcasses, and porcine carcasses were individually inoculated by dipping in various suspensions of a marker organism (Escherichia coli K-12 or Pseudomonas fluorescens), alone or in combination with two meat-derived bacterial strains, and were sampled by two standard methods: cotton wet-dry swabbing and excision. The samples were examined for bacterial counts on plate count agar (PCA plate counts) and on violet red brilliant green agar (VRBGA plate counts) by standard International Organization for Standardization methods. Average bacterial recoveries by swabbing, expressed as a percentage of the appropriate recoveries achieved by excision, varied widely (2 to 100%). Several factors that potentially contributed to relatively low and highly variable bacterial recoveries obtained by swabbing were investigated in separate experiments. Neither the difference in size of the swabbed area (10, 50, or 100 cm2 on beef carcasses) nor the difference in time of swabbing (20 or 60 min after inoculation of pig carcasses) had a significant effect on the swabbing recoveries of the marker organism used. In an experiment with swabs preinoculated with the marker organism and then used for carcass swabbing, on average, 12% of total bacterial load was transferred inversely (i.e., from the swab to the carcass during the standard swabbing procedure). In another experiment, on average, 14% of total bacterial load was not released from the swab into the diluent during standard swab homogenization. Use of custom-made swabs with abrasive butts, around which metal pieces of pan scourers were wound, markedly increased PCA plate count recoveries from noninoculated lamb carcasses at commercial abattoirs compared with cotton swabs. In spite of the observed inferiority of the cotton wet-dry swabbing method compared with the excision method for bacterial recovery, the former is clearly preferred by the meat industry because it does not damage the carcass. 
Therefore, further large-scale evaluation of the two carcass sampling methods has been undertaken under commercial conditions and reported separately.
Journal of Food Protection | 2006
Michael Hutchison; L. D. Walters; G. C. Mead; Mary Howell; V. M. Allen
Studies were undertaken in commercial slaughterhouses to determine whether populations of indicator bacteria on poultry carcasses are appropriate for process verification. Samples were collected from neck skin by excision or from whole carcass rinses and were examined for a range of presumptive process hygiene indicator bacteria. Coefficients of variation were calculated for each bacterial indicator and were significantly lower in excised samples, indicating more reproducible bacterial recovery by this sampling method. Total viable counts of aerobic bacteria, Enterobacteriaceae, and Pseudomonas in samples collected by excision had the lowest coefficients of variation when compared with other indicators and were therefore used for further study. The uncertainties associated with the quantification of each bacterial indicator were calculated and were lowest overall for total viable counts of aerobic bacteria. In general, uncertainty was higher for lower bacterial numbers. Results of microbiological testing on pooled excised neck skin samples were not significantly different from the mean of individually analyzed samples. Bacterial numbers increased by 1 log unit when cultures were stored under chilled conditions typical of those used for transporting samples to external laboratories, but the increases were not significant for Pseudomonas and aerobic bacteria when storage time was less than 17 h. Weak relationships were identified between bacterial indicator numbers and duration of processing, although cleanliness of the processing environment diminished visibly during this time. In the plants visited for this study, there was a poor relationship between presumptive bacterial indicator numbers and process hygiene. Consequently, bacterial analyses for process verification purposes may be of limited value.
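The coefficient of variation used above to compare sampling methods is simply the standard deviation of the recovered counts divided by their mean. A minimal sketch (function name and example counts are ours, not values from the paper):

```python
import statistics

def coefficient_of_variation(counts):
    """CV = sample standard deviation / mean; a lower CV indicates
    more reproducible bacterial recovery across repeat samples."""
    return statistics.stdev(counts) / statistics.fmean(counts)

# Hypothetical log10 indicator counts recovered by the two sampling methods
excision_counts = [4.0, 4.2, 3.8, 4.1, 3.9]   # tight spread -> low CV
rinse_counts = [3.5, 4.6, 2.9, 4.9, 3.1]      # wide spread -> high CV
```

Comparing the two CVs is the basis on which excision was judged the more reproducible method.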
Journal of Food Protection | 2006
Michael Hutchison; L. D. Walters; V. M. Allen; G. C. Mead; Mary Howell
An assessment of the proposed new International Organization for Standardization quantitative method for Campylobacter was undertaken on poultry carcass samples collected after the chilling phase of processing. Using a critical differences method, we determined the uncertainty associated with log-transformed Campylobacter numbers by dual analyses of 346 samples collected from 22 processing plants located throughout the United Kingdom. Overall, using log-transformed Campylobacter numbers that ranged between -1 and 5 log, we calculated the expanded measurement of uncertainty (EMU) to be 3.889 for the new method. The EMU changed when ranges of bacterial numbers were grouped for analyses. For low numbers of Campylobacter (<1 log), the EMU was calculated to be 5.622. There was less measurement error with higher bacterial numbers because the EMU was found to be 0.612 for samples containing Campylobacter numbers of 3 log or above. The draft method was used to measure numbers of Campylobacter on poultry carcasses collected from 18 United Kingdom processing plants in summer and winter. Numbers were significantly lower in winter. We conclude that, although the new method is adequate for quantifying high numbers of Campylobacter on poultry carcasses, further development is required to improve the measurement of small numbers of this causative agent of foodborne illness.
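One common way to turn duplicate analyses into an expanded uncertainty, and the general idea behind critical-differences calculations, is to estimate a repeatability standard deviation from paired results and multiply by a coverage factor. The sketch below illustrates that approach under our own assumptions (function names, the k = 2 coverage convention, and the example pairs are ours; the paper's exact ISO formula may differ):

```python
import math

def repeatability_sd(duplicate_pairs):
    """Repeatability sd from duplicate log10 counts: s_r = sqrt(sum(d^2) / (2n)),
    where d is the difference within each duplicate pair."""
    n = len(duplicate_pairs)
    return math.sqrt(sum((a - b) ** 2 for a, b in duplicate_pairs) / (2 * n))

def expanded_uncertainty(duplicate_pairs, coverage_factor=2.0):
    """Expanded uncertainty = k * s_r; k = 2 approximates a 95% interval
    (a common convention, assumed here rather than taken from the paper)."""
    return coverage_factor * repeatability_sd(duplicate_pairs)

# Hypothetical duplicate log10 Campylobacter counts from the same samples
pairs = [(3.0, 3.2), (4.0, 3.8)]
emu = expanded_uncertainty(pairs)
```

On this scheme, the larger EMU seen at low Campylobacter numbers reflects the wider scatter between duplicates near the detection limit.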