E. F. Schwandt
Kansas State University
Publications
Featured research published by E. F. Schwandt.
Journal of Animal Science | 2016
E. F. Schwandt; J. J. Wagner; T. E. Engle; S. J. Bartle; Daniel U. Thomson; Christopher D. Reinhardt
Crossbred yearling steers (n = 360; 395 ± 33.1 kg initial BW) were used to evaluate the effects of dry-rolled corn (DRC) particle size in diets containing 20% wet distillers grains plus solubles on feedlot performance, carcass characteristics, and starch digestibility. Steers were used in a randomized complete block design and allocated to 36 pens (9 pens/treatment, with 10 animals/pen). Treatments were coarse DRC (4,882 μm), medium DRC (3,760 μm), fine DRC (2,359 μm), and steam-flaked corn (0.35 kg/L; SFC). Final BW and ADG were not affected by treatment (P > 0.05). Dry matter intake was greater and G:F was lower (P < 0.05) for steers fed DRC vs. steers fed SFC. There was a linear decrease (P < 0.05) in DMI in the final 5 wk on feed with decreasing DRC particle size. Fecal starch decreased (linear, P < 0.01) as DRC particle size decreased. In situ starch disappearance was lower for DRC vs. SFC (P < 0.05) and linearly increased (P < 0.05) with decreasing particle size at 8 and 24 h. Reducing DRC particle size did not influence growth performance but increased starch digestion and influenced DMI of cattle on finishing diets. No differences (P > 0.10) were observed among treatments for any of the carcass traits measured. Results indicate improved ruminal starch digestibility, reduced fecal starch concentration, and reduced DMI with decreasing DRC particle size in feedlot diets containing 20% wet distillers grains on a DM basis.
Translational Animal Science | 2018
T. Lee; Christopher D. Reinhardt; S. J. Bartle; E. F. Schwandt; Michelle S Calvo-Lorenzo; Christopher Vahl; J. A. Hagenmaier; Matthew J Ritter; Gary J Vogel; Daniel U. Thomson
Cattle mobility is routinely measured at commercial slaughter facilities. However, the clinical signs and underlying causes of impaired mobility of cattle presented to slaughter facilities are poorly defined. As such, the objectives of this study were 1) to determine the prevalence of impaired mobility in finished cattle using a 4-point mobility scoring system and 2) to observe clinical signs in order to provide clinical diagnoses for this subset of affected cattle. Finished beef cattle (n = 65,600) were observed by a veterinarian during the morning shift at six commercial abattoirs dispersed across the United States; the veterinarian assigned mobility scores (MS) to all animals using a 1–4 scale from the North American Meat Institute’s Mobility Scoring System, with 1 = normal mobility and 4 = extremely limited mobility. Prevalence of MS 1, 2, 3, and 4 was 97.02%, 2.69%, 0.27%, and 0.01%, respectively. Animals with an abnormal MS (MS > 1) were then assigned to one of five clinical observation categories: 1) lameness, 2) poor conformation, 3) laminitis, 4) Fatigued Cattle Syndrome (FCS), and 5) general stiffness. Of all cattle observed, 0.23% were categorized as lame, 0.20% as having poor conformation, 0.72% as displaying signs of laminitis, 0.14% as FCS, and 1.68% as showing general stiffness. The prevalence of lameness and general stiffness was greater in steers than heifers, whereas the prevalence of laminitis was the opposite (P < 0.05). FCS prevalence was higher in dairy cattle than in beef cattle (0.31% vs. 0.22%, respectively; P ≤ 0.05). These data indicate the prevalence of cattle displaying abnormal mobility at slaughter is low and causes of abnormal mobility are multifactorial.
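The score prevalences reported above are simple percentages of animals at each mobility score within the observed population. A minimal sketch of that tally (the example lot below is hypothetical, not the study data):

```python
from collections import Counter

def mobility_prevalence(scores):
    """Percent of animals at each mobility score, 1 (normal) to 4 (extremely limited)."""
    counts = Counter(scores)
    total = len(scores)
    return {ms: 100.0 * counts.get(ms, 0) / total for ms in (1, 2, 3, 4)}

# Hypothetical lot of 1,000 head:
print(mobility_prevalence([1] * 970 + [2] * 25 + [3] * 4 + [4] * 1))
# → {1: 97.0, 2: 2.5, 3: 0.4, 4: 0.1}
```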
Translational Animal Science | 2017
E. J. McCoy; T. G. O’Quinn; E. F. Schwandt; Christopher D. Reinhardt; Daniel U. Thomson
Strip loin steaks (n = 119) were used to evaluate the association between liver abscess severity and USDA quality grade and meat tenderness and sensory attributes of steaks from finished feedlot cattle. Steaks were used in a 3 × 2 factorial treatment structure using a completely randomized design and were collected at a commercial abattoir located in northwest Texas. All cattle were sourced from a single feedlot and fed a common diet that did not include tylosin phosphate. Treatments were USDA quality grades of Select (SEL) and Low Choice (LC) and liver abscess scores of normal (NORM; healthy liver, no abscesses), mild (M; 1 abscess less than 2 cm in diameter to 4 abscesses less than 4 cm in diameter), and severe (SV; 1 abscess greater than 4 cm in diameter or greater than 4 small abscesses). All steak samples were collected on the same day, approximately 36 h post-mortem, and were cut from the left side of the carcass at the 13th rib by a trained abattoir employee. Steaks were vacuum-packaged and aged at 3 ± 1°C for 14 d post-mortem. Warner-Bratzler Shear Force (WBSF) and Slice Shear Force (SSF) analyses were conducted and cook-loss percentage was measured. A trained sensory panel analyzed samples for juiciness, tenderness, and flavor attributes. There were no differences among liver abscess scores for WBSF or SSF (P > 0.52). Warner-Bratzler Shear Force was lower for LC-SV than SEL-SV (P = 0.04). Sensory attributes of initial and sustained juiciness and overall tenderness were all greater for LC than for SEL steaks (P < 0.04), and connective tissue amount was less for LC steaks when compared to SEL (P = 0.03). Liver abscess score had no effect on any sensory attributes (P > 0.70); however, there was an interaction between quality grade and liver score for myofibrillar tenderness (P = 0.03). Within LC steaks, liver abscess score had no effect on myofibrillar tenderness (P > 0.05); however, in SEL steaks, M steaks were more tender than SV steaks (P < 0.03).
These results indicate that, within quality grade, meat tenderness and sensory attributes were largely not influenced by liver abscess score, but that liver abscess severity may affect the myofibrillar tenderness of SEL steaks.
Kansas Agricultural Experiment Station Research Reports | 2017
T. Miller; M. E. Hubbert; E. F. Schwandt; Daniel U. Thomson; Christopher D. Reinhardt
The cost of Bovine Respiratory Disease to the beef industry due to death, poorer feed conversion, and therapy is estimated at more than $3 billion per year. Identifying and mitigating Bovine Respiratory Disease in cattle can be difficult due to the increased susceptibility of high-risk cattle. One management option to minimize an outbreak of respiratory disease is metaphylaxis, the mass treatment of a group of calves to reduce the incidence and adverse effects of respiratory disease in high-risk animals. Criteria used to determine the necessity of metaphylactic treatment against Bovine Respiratory Disease in feedlots vary by feedlot preference; the primary criteria often considered are a known history of no previous vaccinations, overall appearance of cattle, source of cattle, Bovine Respiratory Disease in calves previously received from the same source, long shipping distance, season of the year, and light arrival weight. The objective of this study was to compare the efficacy of gamithromycin, tulathromycin, and tilmicosin as metaphylactic treatments for newly received, high-risk feedlot calves on health and performance characteristics.
Kansas Agricultural Experiment Station Research Reports | 2017
E. F. Schwandt; M. E. Hubbert; Daniel U. Thomson; Christopher Vahl; S. J. Bartle; Christopher D. Reinhardt
Kansas Agricultural Experiment Station Research Reports | 2017
E. J. McCoy; T. G. O'Quinn; E. F. Schwandt; Christopher D. Reinhardt; Daniel U. Thomson
Steam-flaked corn is commonly fed in feedlot finishing diets because steam-flaking improves starch availability and nutrient utilization, thus improving the overall feeding value of corn. In most operations that utilize steam-flaked corn, grain is processed to a pre-determined flake density by setting the rolls to a specific separation distance and using tension to hold rolls together. Flaked grain is most often produced to a bulk density between 24 and 32 lb/bu, with a common recommendation of 27 lb/bu for corn; however, flake density among steam-flakers within a single mill and among feedlots can vary greatly. Flaking to a similar density using 2 flakers does not ensure similar starch availability. The degree of starch gelatinization or starch availability of steam-flaked corn can be estimated using analytical procedures such as enzymatic hydrolysis, gas production, and steam-flaked corn gelatinization methods. Routinely evaluating starch availability is used as a quality control method to standardize the steam-flaking process to ensure within-day and day-to-day manufacturing consistency. The concentration of readily available starch in steam-flaked corn is indicative of the rate of starch fermentation in the rumen. When starch is too readily available and is fermented at an excessively rapid rate, acid can accumulate in the rumen, reducing ruminal pH and ultimately resulting in increased prevalence of digestive disturbances. Factors that contribute to variation between feedlot operations with respect to steam-flaked corn quality include type and dimensions of flaking equipment, grain type, grain variety and moisture content, roll wear, and steam-flaking procedures. Sampling and handling procedures contribute to precision of results; therefore, sampling procedures need special attention, and consistency must be evaluated when attempting to determine starch availability of steam-flaked corn.
The objective of this study was to evaluate the starch availability and flake density of steam-flaked corn across flaking systems and feedyards, comparing roll dimensions among systems, to document the equipment and flaking procedures used, and to characterize current steam-flaking practices in commercial feedlot operations.
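Flake density is reported above in lb/bu, while the feeding study earlier on this page describes steam-flaked corn at 0.35 kg/L; converting between the two is simple unit arithmetic. A minimal sketch (the constants are standard US pound and Winchester-bushel conversion values, not taken from the report):

```python
LB_TO_KG = 0.453592   # kilograms per pound
BU_TO_L = 35.2391     # liters per US (Winchester) bushel

def flake_density_kg_per_l(lb_per_bu):
    """Convert steam-flaked corn bulk density from lb/bu to kg/L."""
    return lb_per_bu * LB_TO_KG / BU_TO_L

# The commonly recommended 27 lb/bu flake works out to roughly 0.35 kg/L:
print(round(flake_density_kg_per_l(27), 3))  # ≈ 0.348
```

By the same arithmetic, the 24 to 32 lb/bu range quoted above corresponds to roughly 0.31 to 0.41 kg/L.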
Kansas Agricultural Experiment Station Research Reports | 2017
E. F. Schwandt; J. Wagner; T. Engle; S. J. Bartle; Daniel U. Thomson; Christopher D. Reinhardt
Liver abscesses are a significant problem in the United States’ cattle feeding industry, costing the industry an estimated $15.9 million annually in liver condemnation, trim losses, and reduced carcass weights and quality grades. Recent reported incidence rates of liver abscesses at slaughter range from 10 to 20%. Liver abscess incidence may be influenced by a number of factors, including breed, gender, diet, days on feed, cattle type, season, and geographical location. Liver abscesses typically occur secondary to rumen insults caused by acidosis or rumenitis. It has been proposed that pathogens associated with liver abscess formation enter the bloodstream through damaged rumen epithelium and are transported through the portal vein to the liver, where they cause infection manifested as liver abscesses. Severe liver abscesses have been linked to reductions in hot carcass weight, dressing percentage, yield grade, longissimus muscle area, and marbling score when compared to carcasses with normal livers. However, the effect of liver abscesses on meat tenderness and sensory attributes has not been previously investigated.
Kansas Agricultural Experiment Station Research Reports | 2017
M. E. Youngers; E. F. Schwandt; Daniel U. Thomson; J. C. Simroth; S. J. Bartle; M. Siemens; Christopher D. Reinhardt
Kansas Agricultural Experiment Station Research Reports | 2017
J. C. Simroth; Daniel U. Thomson; E. F. Schwandt; S. J. Bartle; C. K. Larson; Christopher D. Reinhardt
Dry-rolling corn is a common practice in feedlots located in the Midwestern and Northern Plains regions of the United States. Optimizing total digestive tract starch utilization in diets containing dry-rolled corn is essential for maximizing efficiency. However, recommendations often suggest that grain be coarsely cracked to avoid producing an excessive amount of fine material that could potentially increase the rate of fermentation, reduce rumen pH, and cause digestive disturbances. Wet distillers byproducts may be effectively used as a protein and energy source for feedlot finishing cattle and can replace a portion of the dry-rolled corn in the diet. The average geometric mean particle size of dry-rolled corn across all feedyards (n = 31) was 0.179 ± 0.035 in. with a range of 0.085 to 0.269 in. The objective of this study was to evaluate the effects of dry-rolled corn particle size on animal performance, carcass traits, and starch digestibility in feedlot finishing diets containing 20% wet distillers grains on a dry matter basis.
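The survey figures above summarize dry-rolled corn particle size as a geometric mean. One common way to compute it (e.g., the mass-weighted log-diameter approach of ASABE S319 sieving) can be sketched as follows; the sieve diameters and masses below are illustrative, not survey data:

```python
import math

def geometric_mean_diameter(diameters_um, masses_g):
    """Mass-weighted geometric mean particle diameter, in microns.

    diameters_um: representative diameter of each sieve fraction
    masses_g:     sample mass retained on each sieve
    """
    total = sum(masses_g)
    log_mean = sum(m * math.log10(d)
                   for d, m in zip(diameters_um, masses_g)) / total
    return 10.0 ** log_mean

# Hypothetical four-sieve stack (openings in microns, masses in grams):
print(geometric_mean_diameter([4760, 3360, 2380, 1680], [10, 30, 40, 20]))
```

Because the averaging is done on log diameters, the result always falls between the coarsest and finest fractions and is pulled toward the fractions holding the most mass.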
Kansas Agricultural Experiment Station Research Reports | 2017
Daniel U. Thomson; M. E. Youngers; E. F. Schwandt; S. J. Bartle; M. Siemens; J. C. Simroth; Christopher D. Reinhardt
Disbudding and dehorning are two common practices used to remove horns from cattle to prevent injury to handlers and other cattle and to reduce carcass bruising. Bruised carcasses result in a substantial reduction in profit due to trim loss, increased sanitation risk, and lost time on the rail during processing. Previous research has indicated that horned cattle increase hide damage among their cohorts and cause injury to handlers. Horned cattle inflict circular bruises that lead to trim loss, and cattle with tipped horns do not have a lower bruising rate than cattle with intact horns. The objective of this study was to evaluate the relationship between horn prevalence within groups of slaughter cattle and the incidence of bruising on the carcasses of those same cattle.