Mark F. Miller
Texas Tech University
Publications
Featured research published by Mark F. Miller.
Applied and Environmental Microbiology | 2008
David J. Kunze; Guy H. Loneragan; Tammy M. Platt; Mark F. Miller; Thomas E. Besser; Mohammad Koohmaraie; Tyler Stephens; Mindy M. Brashears
ABSTRACT Our objectives were to quantify the Salmonella enterica burdens in harvest-ready cattle and to identify specific at-risk populations of cattle most likely to harbor multiply resistant S. enterica. Hide swabs were collected in abattoirs from three cohorts of cattle (feedlot origin cattle that had achieved desirable harvest characteristics and dairy- and beef-type cows harvested because of poor productivity). Feces were collected from two cohorts housed in feedlots (cattle that had achieved desirable harvest characteristics and animals identified for salvage recovery because of poor productivity). Facilities were visited on four occasions over a 12-month period. Salmonella enterica isolates were recovered, and organisms were quantified using standard microbiological methodologies. Susceptibility to antimicrobial drugs and serotype were determined for one S. enterica isolate per sample. Salmonella enterica was recovered from 55.6% of 1,681 samples. The prevalences on hides and in feces were 69.6% and 30.3%, respectively. The concentrations of S. enterica organisms, as determined by the most probable number (MPN) technique, averaged 1.82 log10/100 cm2 on hides and 0.75 log10/g in feces. None of the isolates recovered from cattle that had achieved desirable harvest characteristics were resistant to four or more drugs. For isolates recovered from animals with poor productivity characteristics, 6.5% were resistant to four or more drugs. Twenty-two serovars were identified, with the most common being Salmonella enterica serovar Anatum (25.5%), Salmonella enterica serovar Montevideo (22.2%), and Salmonella enterica serovar Cerro (12.5%). High-level resistance, i.e., resistance to four or more drugs, was clustered within a few relatively uncommon serovars. These results demonstrate that even though S. enterica isolates are readily recoverable from harvest-ready cattle, multiply resistant variants are rare and are associated with specific serovars in cattle harvested because of poor productivity characteristics.
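Because the abstract summarizes concentrations as log10 averages, the back-transformed value is a geometric mean rather than an arithmetic mean. A minimal sketch of that arithmetic, using made-up MPN readings (the study's raw data are not given here):

```python
import math

# Illustrative MPN readings (hypothetical, not the study's raw data).
hide_mpn_per_100cm2 = [12, 45, 180, 700, 30]

# Averaging in log10 space, as in the abstract's summary statistics.
log_values = [math.log10(x) for x in hide_mpn_per_100cm2]
mean_log = sum(log_values) / len(log_values)

# Back-transforming a mean of logs gives the geometric mean.
geometric_mean = 10 ** mean_log
print(f"mean log10 MPN/100 cm^2: {mean_log:.2f}")
print(f"geometric mean MPN/100 cm^2: {geometric_mean:.1f}")

# The reported hide average of 1.82 log10/100 cm^2 corresponds to
# roughly 10**1.82, i.e. about 66 MPN per 100 cm^2.
print(f"10**1.82 = {10 ** 1.82:.0f} MPN/100 cm^2")
```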
Nutrition and Cancer | 1998
Barbara C. Pence; Melanie Landers; Dale M. Dunn; Chwan-Li Shen; Mark F. Miller
Epidemiologic studies have linked the consumption of red meat and the consumption of highly browned meats containing high levels of heterocyclic aromatic amines (HCAs) to increased risk of colorectal cancer or polyps. The present study determined the effects of long-term feeding of beef-containing diets with low and high levels of HCAs (in the context of a low or high beef tallow diet) on a standard 1,2-dimethylhydrazine (DMH)-induced colon tumorigenesis protocol. Very lean beef was cooked by a variety of methods at different temperatures, and the levels of the major HCAs (2-amino-3,8-dimethylimidazo[4,5-f]quinoxaline, 2-amino-3,4,8-trimethylimidazo[4,5-f]quinoxaline, and 2-amino-1-methyl-6-phenylimidazo[4,5-f]pyridine) were measured by high-performance liquid chromatography. Diets incorporating beef containing low or high levels of HCAs were fed for 12 weeks, during which DMH was administered to induce colon tumors, followed by various dietary regimens as promotional stimuli. Feeding of a beef diet high in HCAs resulted in more DMH-induced colon adenocarcinomas, but only in the context of a low-fat diet. The high-HCA diets increased stomach tumors in all DMH-treated rats. An apparent interaction of high HCA with a high fat level reduced the colon tumor incidence and tumor numbers in those diets containing both factors. These results support the epidemiologic data linking well-cooked meat to increased risk for colon and stomach cancer, but the role of dietary fat level remains puzzling.
Journal of Food Protection | 1995
Danny W. Bawcom; Leslie Thompson; Mark F. Miller; C. Boyd Ramsey
The effects of continuous electrical current, pulsed electrical current, and voltage level on aerobic bacteria, total coliforms, and Salmonella typhimurium on top round beef steaks were examined. Electrical stimulation (620 V) for 20 and 60 s decreased (P<.05) coliform bacteria counts by an average of 81% (0.7 log CFU/cm2) compared to untreated steaks. Compared to non-sprayed steaks, coliform counts were lower (P<.05) for steaks to which 3 ml of sterile deionized water was applied before electrical stimulation. Subjecting steaks to 3, 6, 12, and 24 pulses (400 V/2.5 cm) reduced (P<.05) S. typhimurium counts compared to those on untreated steaks. A voltage level of 1,200 V/2.5 cm reduced (P<.05) the numbers of S. typhimurium by 82% compared to steaks that received no electrical stimulation. Electrical stimulation reduces the numbers of bacteria present on beef surfaces.
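The pairing of an 81% reduction with 0.7 log CFU/cm2 is the standard log-to-percent conversion; a minimal sketch of that arithmetic:

```python
def log_reduction_to_percent(log_red: float) -> float:
    """Fraction of organisms removed by a given log10 reduction, in percent."""
    return (1 - 10 ** (-log_red)) * 100

# A 0.7-log reduction leaves 10**-0.7 ~ 20% of cells, i.e. ~80% removed,
# consistent with the ~81% average reported for the 620 V treatments.
print(f"0.7 log -> {log_reduction_to_percent(0.7):.0f}% reduction")

# For reference, a full 1-log reduction corresponds to 90% removal.
print(f"1.0 log -> {log_reduction_to_percent(1.0):.0f}% reduction")
```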
Cancer Letters | 1997
Chris Lai; Dale M. Dunn; Mark F. Miller; Barbara C. Pence
Significant alarm has existed among the general public in the past few years that eating red meat may cause human colon cancer. Iron in beef has been hypothesized as one of the factors in the etiology of this cancer. The present study was designed to test the hypothesis that dietary iron solely from beef would enhance colon tumorigenesis induced in rats. Tumors were induced in Sprague-Dawley rats with 1,2-dimethylhydrazine (20 mg/kg body weight for 10 weeks). Seventy male weanling rats were randomized to two dietary treatment groups with iron source (very lean beef vs. iron citrate in a casein-based diet) as the factor. The rats were allowed free access to the respective diet and deionized water for 27 weeks. At termination of the study, the rats were examined for the location, size, and type of colon or extracolonic lesions. No significant differences were found in total incidence and number of colon tumors between the beef (51.7%, 0.8 tumors/rat) and casein (62.1%, 0.9 tumors/rat) diets, although the serum iron levels of rats fed the beef diet were higher than for those fed the casein diet. The results demonstrate that, when lean beef is used as an iron source, the risk for colon carcinogenesis is not increased.
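The reported incidences are consistent with roughly 15 of 29 (51.7%) and 18 of 29 (62.1%) tumor-bearing rats per group; those denominators are an inference from the percentages, not stated in the abstract. A hedged sketch of the kind of two-group incidence comparison behind "no significant differences":

```python
from scipy.stats import fisher_exact

# Counts reconstructed from the reported percentages (15/29 = 51.7%,
# 18/29 = 62.1%); the abstract does not give raw counts, so these are
# assumptions for illustration.
beef = (15, 14)    # tumor-bearing, tumor-free on the beef-iron diet
casein = (18, 11)  # tumor-bearing, tumor-free on the casein/iron-citrate diet

odds_ratio, p_value = fisher_exact([beef, casein])
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2f}")  # p well above 0.05
```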
Journal of Food Protection | 2012
Jessie L. Vipham; Mindy M. Brashears; Guy H. Loneragan; Alejandro Echeverry; J. Chance Brooks; W. Evan Chaney; Mark F. Miller
Salmonella enterica and Campylobacter spp. cause a considerable number of human illnesses each year, and the vast majority of cases are foodborne. The purpose of this study was to establish the baseline of Salmonella and Campylobacter in beef products purchased from U.S. retail markets. Sampling was carried out in 38 American cities. Retail raw ground and whole-muscle beef (n = 2,885) samples were purchased and examined for the presence of Salmonella. Samples testing positive for Salmonella were identified with the commercial BAX System, which is a real-time PCR-based system. Of the original samples purchased, 1,185 were selected and tested for the presence of Campylobacter. Positive samples were isolated via direct plating and confirmed via agglutination and biochemical testing. Salmonella was detected in 0.66% of the total samples purchased. The prevalence of Salmonella in ground beef packages was 0.42% for modified atmosphere packaging, 0.63% for chub packaging, and 0.59% for overwrapped packages. Salmonella was detected in 1.02% of whole-muscle cuts. There was no relationship (P = 0.18) between product type (ground or whole muscle) and the percentage of positive samples. Campylobacter was recovered from 9.3% of samples. A greater percentage (17.24%, P < 0.01) of whole-muscle cuts tested positive for Campylobacter compared with ground beef samples (7.35%). Estimating pathogen baselines in U.S. retail beef is essential for allotting resources and directing interventions for pathogen control. These data can be utilized for a more complete understanding of these pathogens and their impact on public health from the consumption of beef products.
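The "P < 0.01" and "P = 0.18" statements are standard two-proportion comparisons; a minimal sketch using statsmodels, with illustrative counts (the abstract reports the percentages, 7.35% for ground beef versus 17.24% for whole-muscle cuts, but not the per-category denominators, so the numbers below are assumptions):

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts chosen to match the reported percentages;
# the true per-product-type denominators are not given in the abstract.
positives = [63, 50]   # Campylobacter-positive samples (ground, whole muscle)
samples = [857, 290]   # samples tested per product type

stat, p_value = proportions_ztest(positives, samples)
print(f"z = {stat:.2f}, p = {p_value:.4f}")  # a very small p echoes "P < 0.01"
```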
Journal of Food Protection | 1998
Mandy A. Carr; Leslie Thompson; Mark F. Miller; C. Boyd Ramsey; Collette S. Kaster
The effects of chilling (normal chill or freeze chill) and trimming (hot fat trim or no fat trim) on the microbial populations of pork carcasses were evaluated. In a two-part study, composited ham, loin, belly, and shoulder samples from 30 pork carcasses had similar aerobic plate counts, averaging 5.5 log10 CFU/cm2. The no fat trim, normal chill procedure typically used in the industry, however, produced higher coliform and Staphylococcus spp. counts (P < 0.05). The hot fat trim, freeze chill treatment had the lowest lactic acid bacteria counts. Only 1 sample in 60 tested positive for Salmonella spp. Vacuum-packaged hams and loins stored at 4 degrees C for 14 days had similar (P > 0.05) aerobic plate, lactic acid bacteria, and Staphylococcus spp. counts regardless of trim, chill, or the location of treatment, averaging 5.7, 6.3, and 1.4 log10 CFU/cm2, respectively. Hams had higher counts than loins on all three sampling days; however, only the difference on day 2 was significant. The desire to reduce microbial populations on pork carcasses as a food-safety issue and the coming implementation of hazard analysis critical control point (HACCP) programs warrant the use of trimming and chilling methods as critical control points or good manufacturing practices and standard operating procedures in the pork slaughter, processing, and packaging industry.
Journal of Food Protection | 1997
Kara S. Tinney; Mark F. Miller; C. Boyd Ramsey; Leslie Thompson; Mandy A. Carr
This study determined the effect of a 2% acetic acid spray, pulsed-power electricity, pulsed-power electricity with a spray of sterile deionized water, and a combination of acetic acid spray and pulsed-power electricity in reducing the pathogens Escherichia coli O157 and Salmonella typhimurium and aerobic plate counts on beefsteaks compared to an inoculated control. Ten steaks per treatment were inoculated with 1 ml of E. coli O157 (10^5 CFU/ml) or S. typhimurium (10^5 CFU/ml) for 2 min and then subjected to one of the five treatments. Acetic acid spray and the combination of acetic acid spray and pulsed-power electricity significantly (P < .05) reduced the incidence of Escherichia coli O157 compared to inoculated controls and produced a 1-log CFU/cm2 reduction in the incidence of S. typhimurium. Ten steaks per treatment were also inoculated with 1 ml of S. typhimurium (10^5 CFU/ml) for 2 min; treated with acetic acid spray, pulsed-power electricity and a sterile deionized water spray, or acetic acid spray and pulsed-power electricity; and stored in an incubator at -2°C for 48 h to simulate chill-cooler conditions in the beef industry. Acetic acid spray with and without pulsed-power electricity caused a 1-log CFU/cm2 reduction in S. typhimurium. These data indicate a need for the use of both 2% acetic acid and pulsed-power electricity in packing-house facilities to help achieve the goal of improved microbiological safety of beef.
Meat Science | 2011
Corri L. Rekow; Mindy M. Brashears; J. Chance Brooks; Guy H. Loneragan; Sara E. Gragg; Mark F. Miller
The objective of this study was to define the locations on the carcass with the highest contamination of E. coli O157 throughout the harvest process and to implement targeted interventions to reduce or eliminate contamination. To establish a pathogen baseline, samples were collected at the foreshank, hindshank, inside round, neck, and midline area and evaluated for the presence of E. coli O157:H7. Environmental samples were also collected in the harvest area and the fabrication area of the facility. E. coli O157:H7 prevalence was highest on the foreshanks, hindshanks, and inside rounds in the baseline study, and steam vacuums/cones were implemented as an intervention in these specific areas on the harvest floor. At pre-evisceration, foreshank prevalence of E. coli O157:H7 was significantly (P<0.05) reduced from 21.7% to 3.1% after the application of steam interventions. At the final rail, foreshank prevalence in the baseline study was 4.2%, while no E. coli O157:H7 was detected after intervention implementation. E. coli O157:H7 on hindshanks and inside rounds was significantly reduced after intervention implementation, from 24.2% to 11.5% and from 37.5% to 16.7%, respectively, at the final rail. Pathogen contamination of environmental samples collected in fabrication declined from 6.7% to 0.7% after the slaughter interventions were implemented. These data indicate that identifying areas of contamination on the carcass and implementing targeted interventions can significantly reduce E. coli O157 on carcasses and in the fabrication environment.
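The pre- versus post-intervention percentages can also be read as relative reductions in prevalence; a short sketch restating the figures reported above in that form:

```python
def relative_reduction(before_pct: float, after_pct: float) -> float:
    """Relative drop in prevalence, as a percentage of the baseline."""
    return (before_pct - after_pct) / before_pct * 100

# Pre- vs post-intervention E. coli O157:H7 prevalence from the abstract.
sites = {
    "foreshank (pre-evisceration)": (21.7, 3.1),
    "hindshank (final rail)": (24.2, 11.5),
    "inside round (final rail)": (37.5, 16.7),
}
for site, (before, after) in sites.items():
    print(f"{site}: {relative_reduction(before, after):.0f}% relative reduction")
```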
Translational Animal Science | 2018
Travis G. O’Quinn; J.F. Legako; J.C. Brooks; Mark F. Miller
Abstract The objectives of this study were to evaluate the contribution of tenderness, juiciness, and flavor to the overall consumer beef eating experience and to evaluate the risk of overall palatability failure due to the unacceptable level of one or more of these traits. Data from 11 previously conducted studies representing a wide range of treatments and levels of eating quality that included more than 1,500 beef samples and 1,800 consumers were compiled and analyzed for this study. Results of a multivariate regression indicated that tenderness, flavor, and juiciness accounted for 43.4%, 49.4%, and 7.4%, respectively, of overall palatability (P < 0.05; R2 > 0.99). Additionally, the odds of a steak being rated unacceptable overall when tenderness, juiciness, or flavor were rated unacceptable were 2.2 to 1 (69%), 1.9 to 1 (66%), and 3.3 to 1 (77%), respectively. This indicated overall palatability was 7.2, 6.5, and 12.3 times more likely to be rated unacceptable if tenderness, juiciness, or flavor, respectively, was also rated unacceptable. Additionally, the percentage of samples rated acceptable for each palatability trait increased (P < 0.05) as quality grade increased. More than 88% of USDA Prime samples were rated acceptable for each palatability trait, whereas only 74.8–77.3% of USDA Select samples were rated acceptable for each palatability trait. Marbling score accounted for 14–16% of the variation (P < 0.01) in consumer palatability scores for each trait and intramuscular fat percentage accounted for 17–21% of the variation in each trait (P < 0.01). Logistic equation models for the predicted probability of an acceptable rating for each palatability trait based on intramuscular fat percentage accounted for only a minimal amount of variation (P < 0.01; R2 ≤ 0.09). Results of this study indicate the relative contribution of tenderness, juiciness, and flavor to overall beef palatability. They provide evidence that the failure of even a single palatability trait dramatically increases the likelihood of overall palatability failure, indicating that no single palatability trait is most important, as beef palatability is dependent upon the acceptance of all three traits: tenderness, juiciness, and flavor.
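The bracketed percentages above are simply the quoted odds restated as probabilities; a quick check of that conversion:

```python
def odds_to_probability(odds: float) -> float:
    """Convert odds of failure (x to 1) into a probability."""
    return odds / (odds + 1)

# Odds of an overall-unacceptable rating when a single trait fails,
# taken directly from the abstract.
for trait, odds in [("tenderness", 2.2), ("juiciness", 1.9), ("flavor", 3.3)]:
    print(f"{trait}: odds {odds}:1 -> {odds_to_probability(odds):.0%}")
# Reproduces the 69%, 66%, and 77% figures quoted above.
```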
Journal of Food Protection | 2015
Martha Maradiaga; Mark F. Miller; Leslie Thompson; Ansen Pond; Sara E. Gragg; Alejandro Echeverry; Lyda G. Garcia; Guy H. Loneragan; Mindy M. Brashears
Salmonella continues to cause a considerable number of foodborne illnesses worldwide. The sources of outbreaks include contaminated meat and produce. The purpose of this study was to establish an initial investigation of the burden of Salmonella in produce and beef from Honduras by sampling retail markets and abattoirs. Retail produce samples (cantaloupes, cilantro, cucumbers, leafy greens, peppers, and tomatoes; n = 573) were purchased in three major cities of Honduras, and retail whole-muscle beef samples (n = 555) were purchased in four major cities. Additionally, both hide and beef carcass samples (n = 141) were collected from two Honduran abattoirs. Whole-muscle beef samples were obtained using a sponge hydrated with buffered peptone water, and 10 ml of the buffered peptone water rinsate of each produce sample was collected with a dry sponge and placed in a bag to be transported back to the United States. Salmonella was detected using a commercially available, closed-platform PCR system, and positive samples were subjected to culture on selective media to obtain isolates. Overall, the prevalence of Salmonella-positive samples, based on PCR detection, in Honduran retail beef (n = 555) was 10.1% (95% confidence interval = 7.8, 12.9), whereas 7.8% of the beef carcass and hide samples (n = 141) from the two beef plants were positive. The overall Salmonella prevalence for all produce samples collected (n = 573) was 2.1% (95% confidence interval = 1.2, 3.6). The most common serotypes identified in Honduras were Salmonella Typhimurium, followed by Derby. These results provide an indication of Salmonella contamination of beef and produce in Honduras. Developing a Salmonella baseline for Latin America through an initial investigation like the one presented here contributes to a broader global understanding of potential exposure through food, thus providing insight into the need for control strategies.
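The quoted confidence intervals are consistent with a Wilson score interval on the inferred positive counts (56/555 is about 10.1% and 12/573 is about 2.1%; the abstract reports percentages, not raw counts, so these counts are an inference). A minimal sketch, assuming statsmodels is available:

```python
from statsmodels.stats.proportion import proportion_confint

# Positive counts inferred from the reported percentages; not stated
# directly in the abstract.
for label, positives, n in [("retail beef", 56, 555), ("produce", 12, 573)]:
    low, high = proportion_confint(positives, n, alpha=0.05, method="wilson")
    print(f"{label}: {positives / n:.1%} (95% CI {low:.1%}, {high:.1%})")

# The Wilson intervals closely match the (7.8, 12.9) and (1.2, 3.6)
# ranges quoted in the abstract.
```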