Bradley J. White
Kansas State University
Publication
Featured research published by Bradley J. White.
American Journal of Veterinary Research | 2008
Johann F. Coetzee; Brian V. Lubbers; Scott Toerber; Ronette Gehring; Daniel U. Thomson; Bradley J. White; Michael D. Apley
OBJECTIVE To evaluate plasma concentrations of substance P (SP) and cortisol in calves after castration or simulated castration. ANIMALS 10 Angus-crossbred calves. PROCEDURES Calves were acclimated for 5 days, assigned to a block on the basis of scrotal circumference, and randomly assigned to a castrated or simulated-castrated (control) group. Blood samples were collected twice before, at the time of (0 hours), and at several time points after castration or simulated castration. Vocalization and attitude scores were determined at the time of castration or simulated castration. Plasma concentrations of SP and cortisol were determined by use of competitive and chemiluminescent enzyme immunoassays, respectively. Data were analyzed by use of repeated-measures analysis with a mixed model. RESULTS Mean ± SEM cortisol concentration in castrated calves (78.88 ± 10.07 nmol/L) was similar to that in uncastrated control calves (73.01 ± 10.07 nmol/L). However, mean SP concentration in castrated calves (506.43 ± 38.11 pg/mL) was significantly higher than the concentration in control calves (386.42 ± 40.09 pg/mL). Mean cortisol concentration in calves with vocalization scores of 0 was not significantly different from the concentration in calves with vocalization scores of 3. However, calves with vocalization scores of 3 had significantly higher SP concentrations than calves with vocalization scores of 0. CONCLUSIONS AND CLINICAL RELEVANCE Similar cortisol concentrations were measured in castrated and control calves. A significant increase in plasma concentrations of SP after castration suggested a likely association with nociception. These results may affect assessment of animal well-being in livestock production systems.
BMC Veterinary Research | 2012
Johann F. Coetzee; Ruby A. Mosher; Butch KuKanich; Ronette Gehring; Brad Robert; J. Brandon Reinbold; Bradley J. White
Background: Dehorning is a common practice involving calves on dairy operations in the United States. However, fewer than 20% of producers report using analgesics or anesthetics during dehorning. Administration of a systemic analgesic drug at the time of dehorning may be attractive to dairy producers since cornual nerve blocks require 10 to 15 min to take effect and only provide pain relief for a few hours. The primary objectives of this trial were to (1) describe the compartmental pharmacokinetics of meloxicam in calves after IV administration at 0.5 mg/kg and (2) determine the effect of meloxicam (n = 6) or placebo (n = 6) treatment on serum cortisol response, plasma substance P (SP) concentrations, heart rate (HR), activity, and weight gain in calves after scoop dehorning and thermocautery without local anesthesia.
Results: Plasma meloxicam concentrations were detectable for 50 h post-administration and fit a 2-compartment model with a rapid distribution phase (mean T½α = 0.22 ± 0.087 h) and a slower elimination phase (mean T½β = 21.86 ± 3.03 h). Dehorning caused a significant increase in serum cortisol concentrations and HR (P < 0.05). HR was significantly lower in meloxicam-treated calves than in placebo-treated calves at 8 h (P = 0.039) and 10 h (P = 0.044) after dehorning. Mean plasma SP concentrations were lower in meloxicam-treated calves (71.36 ± 20.84 pg/mL) than in control calves (114.70 ± 20.84 pg/mL) (P = 0.038). Furthermore, the change in plasma SP from baseline was inversely proportional to the corresponding plasma meloxicam concentration (P = 0.008). The effect of dehorning on lying behavior was less pronounced in meloxicam-treated calves (P = 0.40) than in placebo-treated calves (P < 0.01). Calves receiving meloxicam prior to dehorning gained on average 1.05 ± 0.13 kg bodyweight/day over 10 days post-dehorning compared with 0.40 ± 0.25 kg bodyweight/day in placebo-treated calves (P = 0.042).
Conclusions: To our knowledge, this is the first published report examining the effects of meloxicam without local anesthesia on SP, activity, and performance of calves post-dehorning. These findings suggest that administration of meloxicam alone immediately prior to dehorning does not mitigate signs of acute distress but may have long-term physiological, behavioral, and performance effects.
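The reported half-lives determine the two rate constants of the biexponential disposition curve. A minimal sketch, assuming the standard two-compartment IV bolus form C(t) = C_α·e^(−αt) + C_β·e^(−βt) with the abstract's half-lives; the intercepts C_ALPHA and C_BETA are hypothetical values chosen only for illustration:

```python
import math

def biexponential(t, c_alpha, c_beta, t_half_alpha, t_half_beta):
    """Two-compartment IV bolus disposition:
    C(t) = C_alpha * exp(-alpha * t) + C_beta * exp(-beta * t),
    with rate constants derived from the reported half-lives."""
    alpha = math.log(2) / t_half_alpha  # distribution rate constant (1/h)
    beta = math.log(2) / t_half_beta    # elimination rate constant (1/h)
    return c_alpha * math.exp(-alpha * t) + c_beta * math.exp(-beta * t)

T_HALF_ALPHA = 0.22    # h, distribution half-life from the abstract
T_HALF_BETA = 21.86    # h, elimination half-life from the abstract
C_ALPHA, C_BETA = 2.0, 1.0  # hypothetical intercepts, for illustration only

# Well past the distribution phase, the curve should halve once per
# elimination half-life.
c24 = biexponential(24.0, C_ALPHA, C_BETA, T_HALF_ALPHA, T_HALF_BETA)
c46 = biexponential(24.0 + T_HALF_BETA, C_ALPHA, C_BETA,
                    T_HALF_ALPHA, T_HALF_BETA)
```

Because the fast distribution phase (T½α = 0.22 h) has fully decayed by 24 h, the concentration at 24 h + T½β is essentially half of that at 24 h, which is why meloxicam remained detectable for roughly 50 h.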
Scientific Reports | 2015
Shi Chen; Bradley J. White; Michael W. Sanderson; David E. Amrine; Amiyaal Ilany; Cristina Lanzas
Contact patterns among hosts are considered one of the most critical factors contributing to unequal pathogen transmission. Consequently, networks have been widely applied in infectious disease modeling. However, most studies assume a static network structure due to a lack of accurate observations and appropriate analytic tools. In this study we used high temporal and spatial resolution animal position data to construct a high-resolution contact network relevant to infectious disease transmission. The animal contact network aggregated at the hourly level was highly variable and dynamic within and between days, both in network structure (network degree distribution) and in individual rank within the degree distribution (degree order). We integrated network degree distribution and degree order heterogeneities with a commonly used contact-based, directly transmitted disease model to quantify the effect of these two sources of heterogeneity on infectious disease dynamics. Four conditions were simulated based on the combinations of these two heterogeneities. Simulation results indicated that disease dynamics and individual contributions to new infections varied substantially among these four conditions under both parameter settings tested. Changes in the contact network had a greater effect on disease dynamics for pathogens with a smaller basic reproduction number (i.e., R0 < 2).
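The hourly-varying contact structure described above can be illustrated with a toy simulation; this is not the authors' model — the network process, parameters, and transmission rule here are illustrative assumptions only:

```python
import random

def simulate_sir_on_dynamic_network(n, hours, contacts_per_hour,
                                    p_transmit, p_recover, seed=1):
    """Toy SIR model on a contact network that is redrawn every hour,
    mimicking an hourly-aggregated, time-varying animal contact network."""
    rng = random.Random(seed)
    state = ["S"] * n
    state[0] = "I"  # single index case
    for _ in range(hours):
        newly_infected = []
        for i in range(n):
            if state[i] != "I":
                continue
            # This hour's contacts for animal i, drawn at random.
            for j in rng.sample(range(n), contacts_per_hour):
                if j != i and state[j] == "S" and rng.random() < p_transmit:
                    newly_infected.append(j)
        for j in newly_infected:
            state[j] = "I"
        for i in range(n):
            if state[i] == "I" and rng.random() < p_recover:
                state[i] = "R"
    return state

final = simulate_sir_on_dynamic_network(n=50, hours=100, contacts_per_hour=3,
                                        p_transmit=0.1, p_recover=0.05)
```

Holding the hourly network fixed instead of redrawing it each iteration would recover the static-network assumption the abstract critiques, making the effect of network dynamics directly comparable.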
Veterinary Clinics of North America: Food Animal Practice | 2013
Miles E. Theurer; David E. Amrine; Bradley J. White
Cattle behavior is frequently monitored to determine the health of the animal. This article describes potential benefits and challenges of remotely monitoring cattle behavior with available methodologies. The behavior of interest, labor required, and monitoring expenses must be considered before deciding which remote behavioral monitoring device is appropriate. Monitoring the feeding behavior of an animal over time allows establishment of a baseline against which deviations can be evaluated. Interpreting multiple behavioral responses as an aggregate indicator of wellness status, rather than as individual outcomes, may more accurately measure an animal's true state of pain or wellness.
Preventive Veterinary Medicine | 2013
A. H. Babcock; Natalia Cernicchiaro; Bradley J. White; Suzanne R. Dubnicka; Daniel U. Thomson; Samuel E. Ives; H.M. Scott; George A. Milliken; David G. Renter
Economic losses due to cattle mortality and culling have a substantial impact on the feedlot industry. Since criteria for culling may vary and may affect measures of cumulative mortality within cattle cohorts, it is important to assess both mortality and culling when evaluating cattle losses over time and among feedlots. To date, there are no published multivariable assessments of factors associated with combined mortality and culling risk. Our objective was to evaluate combined mortality and culling losses in feedlot cattle cohorts and quantify effects of commonly measured cohort-level risk factors (weight at feedlot arrival, gender, and month of feedlot arrival) using data routinely collected by commercial feedlots. We used retrospective data representing 8,904,965 animals in 54,416 cohorts from 16 U.S. feedlots from 2000 to 2007. The sum of mortality and culling counts for each cohort (given the number of cattle at risk) was used to generate the outcome of interest, the cumulative incidence of combined mortality and culling. Associations between this outcome variable and cohort-level risk factors were evaluated using a mixed effects multivariable negative binomial regression model with random effects for feedlot, year, month and week of arrival. Mean arrival weight of the cohort, gender, and arrival month and a three-way interaction (and corresponding two-way interactions) among arrival weight, gender and month were significantly (P<0.05) associated with the outcome. Results showed that as the mean arrival weight of the cohort increased, mortality and culling risk decreased, but effects of arrival weight were modified both by the gender of the cohort and the month of feedlot arrival. 
There was a seasonal pattern in combined mortality and culling risk for light- and middle-weight male and female cohorts, with a significantly (P<0.05) higher risk for cattle arriving at the feedlot in spring and summer (March-September) than for cattle arriving during fall and winter months (November-February). Our results quantified effects of covariate patterns that have heretofore been difficult to fully evaluate in smaller scale studies; in addition, they illustrate the importance of utilizing multivariable approaches when quantifying risk factors in heterogeneous feedlot populations. Estimated effects from our model could be useful for managing financial risks associated with adverse health events based on data that are routinely available.
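The outcome variable described above is, at the cohort level, a simple proportion. A minimal sketch of its construction; the cohort counts are hypothetical:

```python
def combined_loss_incidence(deaths, culls, n_at_risk):
    """Cumulative incidence of combined mortality and culling for a cohort:
    (deaths + culls) / number of cattle at risk."""
    if n_at_risk <= 0:
        raise ValueError("cohort must have at least one animal at risk")
    return (deaths + culls) / n_at_risk

# Hypothetical cohort: 150 head at risk, 3 deaths, 2 culls.
risk = combined_loss_incidence(deaths=3, culls=2, n_at_risk=150)
```

In the study itself, these per-cohort counts (with the number at risk as an offset) were modeled with a mixed-effects negative binomial regression rather than computed as raw proportions, which accommodates overdispersion across 54,416 cohorts.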
Journal of Animal Science | 2009
Bradley J. White; D. A. Blasi; L. C. Vogel; M. Epp
Cattle transportation by commercial truck carrier is common in the United States, and often cattle are placed within 1 of 8 potential compartments within the truck for the journey. The objective of this research was to determine potential associations between animal wellness (as measured by ADG and health outcomes) during a relatively short backgrounding phase (46.6 +/- 8.5 d) and location within the truck during transit. Data from 21 loads (average calves per load = 101.5; average BW = 210.1 +/- 19.4 kg) were included in the analysis. For each shipment, calves were divided among 8 compartments within the trailer: nose on top deck (NOT), nose on bottom deck (NOB), bottom deck middle forward (BDF), bottom deck middle rear (BDR), rear on the bottom (ROB), top deck middle forward (TDF), top deck middle rear (TDR), and rear on the top deck (ROT). General logistic (health outcomes) and mixed (ADG) models were employed to analyze the data accounting for effects due to truck section as well as the hierarchical data structure of multiple arrival times, loads, and pens. Cattle in the ROT section had lower short-term BW gain compared with NOT and tended (P < 0.10) to have lower gain than NOB. Cattle in the forward sections (NOT, NOB) were less (P = 0.02) likely [odds ratio (OR): 0.67, 95% confidence limits (CL): 0.50, 0.94] to be treated at least once compared with cattle in the middle and rear sections (TDF, TDR, ROT, BDF, BDR, ROB). Calves in compartments with 15 head or fewer tended (P < 0.10) to have reduced odds of being treated compared with cattle in compartments with 16 to 30 head (OR: 0.79, 95% CL: 0.60, 1.0) or more than 31 head (OR: 0.73, 95% CL: 0.53, 1.0). These results indicate that location within the truck may affect calf health and performance.
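Odds ratios like the 0.67 reported here are easy to misread as risk ratios. Assuming a hypothetical baseline treatment risk (not a figure from the study), the OR can be converted to an implied absolute risk:

```python
def risk_from_odds_ratio(baseline_risk, odds_ratio):
    """Apply an odds ratio to a baseline risk and return the implied
    absolute risk in the comparison group."""
    baseline_odds = baseline_risk / (1 - baseline_risk)
    comparison_odds = baseline_odds * odds_ratio
    return comparison_odds / (1 + comparison_odds)

# Hypothetical baseline: 20% of middle/rear-section cattle treated at
# least once; OR of 0.67 for the forward sections, as reported.
forward_risk = risk_from_odds_ratio(baseline_risk=0.20, odds_ratio=0.67)
```

Under that assumed 20% baseline, the OR of 0.67 implies roughly a 14% treatment risk in the forward sections, a smaller reduction than naively multiplying the risk by 0.67 would suggest.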
Preventive Veterinary Medicine | 2014
Rebecca L. Smith; Michael W. Sanderson; Rodney D. Jones; Yapo N’Guessan; David G. Renter; Robert L. Larson; Bradley J. White
A stochastic model was designed to calculate the cost-effectiveness of biosecurity strategies for bovine viral diarrhea virus (BVDV) in cow-calf herds. Possible sources of BVDV introduction considered were imported animals, including the calves of pregnant imports, and fenceline contact with infected herds, including stocker cattle raised in adjacent pastures. Spread of BVDV through the herd was modeled with a stochastic SIR model. Financial consequences of BVDV, including lost income, treatment costs, and the cost of biosecurity strategies, were calculated for 10 years, based on the risks of a herd with a user-defined import profile. Results indicate that importing pregnant animals and stockers increased the financial risk of BVDV. Strategic testing in combination with vaccination most decreased the risk of high-cost outbreaks in most herds. The choice of a biosecurity strategy was specific to the risks of a particular herd.
Preventive Veterinary Medicine | 2010
Rebecca L. Smith; Michael W. Sanderson; David G. Renter; Robert L. Larson; Bradley J. White
A stochastic SIR model was developed to simulate the spread of bovine viral diarrhea virus (BVDV) through a cow-calf herd and estimate the effect of the virus on the herd, including abortions, calf morbidity, and calf mortality. The model was applied with three herd sizes (400, 100, and 50 head) and four control strategies (no intervention, vaccination of breeding stock, testing all calves pre-breeding and culling of persistently infected calves, and both vaccination of adults and testing and culling of calves). When no control strategy was implemented, the BVDV reproductive rate (R(E-PI)) of persistently infected calves (PIs), the vertical transmission rate from cows to calves, and the mortality rate of PIs were influential in the number of PIs produced in the herd. When a vaccination program alone was implemented, vaccine efficacy was influential in the number of PIs produced in the herd. All control strategies decreased the effects of BVDV on the herd at both 1 and 10 years compared to no control. In most cases the combination of adult vaccination and calf testing and culling resulted in the largest decrease in both the median and the 95% prediction interval for the range of effects from BVDV. The effect of control strategies was most apparent in the 400-head herds. All control strategies increased the probability of early clearance of PIs from the herd for all herd sizes. Fifty- and 100-head herds cleared infection by 4 and 9 years, respectively, even without a control program, but 400-head herds did not always clear infection after 10 years unless a testing program was implemented. The model presented is valuable in assessing the effect of control strategies and the effects of disease parameters on BVDV spread in beef herds.
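A chain-binomial sketch of a stochastic SIR process like the one described above; the structure and parameters here are illustrative assumptions, not the paper's model (which additionally tracks PIs, vertical transmission, and abortions):

```python
import math
import random

def stochastic_sir(n, i0, beta, gamma, steps, seed=7):
    """Minimal chain-binomial SIR: each susceptible is infected with
    probability 1 - exp(-beta * I / N) per step; each infective recovers
    with probability gamma per step."""
    rng = random.Random(seed)
    s, i, r = n - i0, i0, 0
    history = [(s, i, r)]
    for _ in range(steps):
        p_inf = 1 - math.exp(-beta * i / n)
        new_inf = sum(rng.random() < p_inf for _ in range(s))
        new_rec = sum(rng.random() < gamma for _ in range(i))
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        history.append((s, i, r))
    return history

# Illustrative parameters for a 100-head herd with one initial infective.
traj = stochastic_sir(n=100, i0=1, beta=0.3, gamma=0.1, steps=200)
```

Because each run is random, repeating the simulation many times yields the distribution of outcomes (median and prediction intervals) that this kind of study reports for each herd size and control strategy.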
Preventive Veterinary Medicine | 2009
Rebecca L. Smith; Michael W. Sanderson; David G. Renter; Robert L. Larson; Bradley J. White
A spreadsheet model using Monte Carlo simulation was designed to evaluate the introduction of bovine viral diarrhea virus (BVDV) to cow-calf farms and the effect of different testing strategies. Risks were modeled to include imports to the cow-calf herd and stocker calves imported to adjacent pastures. The number of persistently infected (PI) animals imported and the probability of BVDV introduction were monitored for three herd sizes, four import profiles, and six testing strategies. Importing stockers and importing pregnant heifers were the biggest risks for introduction of BVDV. Testing for PI animals in stockers decreased the risk they posed, but testing pregnant heifers was not sufficient to decrease risk unless their calves were also tested. Test sensitivity was more influential than PI prevalence on the likelihood of BVDV introduction, when all imports were tested. This model predicts the risk of BVDV introduction for individual herds based on management decisions, and should prove to be a useful tool to help cow-calf producers in controlling the risk of importing BVDV to a naïve herd.
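The interplay among import volume, PI prevalence, and test sensitivity noted above can be sketched under a simple independence assumption; all numbers are hypothetical, not the paper's inputs:

```python
def p_introduction(n_imports, pi_prevalence, test_sensitivity=0.0):
    """Probability that at least one persistently infected (PI) animal
    enters the herd undetected, assuming imports are independent.
    test_sensitivity = 0 corresponds to an untested import program."""
    p_undetected_pi = pi_prevalence * (1 - test_sensitivity)
    return 1 - (1 - p_undetected_pi) ** n_imports

# Hypothetical import profile: 40 stockers at 0.5% PI prevalence.
untested = p_introduction(40, 0.005)
tested = p_introduction(40, 0.005, test_sensitivity=0.99)
```

Even a modest per-animal prevalence compounds across many imports when untested, while a high-sensitivity test collapses the introduction risk, consistent with the finding that test sensitivity outweighs PI prevalence when all imports are tested.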
Journal of Animal Science | 2015
Bradley J. White; David E. Amrine; Dan R. Goehl
Mitigation of the deleterious effects of bovine respiratory disease (BRD) is an important issue in the cattle industry. Conventional management of calves at high risk for BRD often includes mass treatment with antimicrobials at arrival followed by visual observation for individual clinical cases. These methods have proven effective; however, control program efficacy is influenced by the accuracy of visual observation. A remote early disease identification (REDI) system has been described that monitors cattle behavior to identify potential BRD cases. The objective of this research was to compare health and performance outcomes using either traditional BRD control (visual observation and metaphylaxis) or REDI during a 60-d postarrival phase in high-risk beef calves. The randomized controlled clinical trial was performed in 8 replicates at 3 different facilities over a 19-mo period. In each replicate, a single load of calves was randomly allocated to receive either conventional management (CONV; total = 8) or REDI (total = 8) as the method for BRD control. Cattle were monitored with each diagnostic method for the first 30 d on feed and performance variables were collected until approximately 60 d after arrival. Statistical differences (P < 0.10) were not identified in common performance (ADG) or health (morbidity, first treatment success, and mortality risk) outcomes among the treatment groups. Calves in the REDI pens had a lower (P < 0.01) average number of days on feed at first treatment (9.1 ± 1.2 d) compared with CONV pens (15.8 ± 1.2 d). There were no statistical differences (P > 0.10) in risk of BRD treatment, and REDI calves were not administered antimicrobials at arrival; therefore, REDI calves had a lower (P < 0.01) average number of doses of antimicrobials/calf (0.75 ± 0.1 doses) compared with CONV calves (1.67 ± 0.1 doses).
In this trial, the REDI system was comparable to conventional management with the potential advantages of earlier BRD diagnosis and decreased use of antimicrobials. Further research should be performed to evaluate the longer-term impacts of the 2 systems.