Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Rachel M. Izard is active.

Publication


Featured research published by Rachel M. Izard.


Journal of Bone and Mineral Research | 2011

Biological constraints that limit compensation of a common skeletal trait variant lead to inequivalence of tibial function among healthy young adults.

Karl J. Jepsen; Amanda Centi; G. Felipe Duarte; Kathleen Galloway; Haviva M. Goldman; Naomi Hampson; Joan M. Lappe; Diane M. Cullen; Julie Greeves; Rachel M. Izard; Bradley C. Nindl; William J. Kraemer; Charles Negus; Rachel K. Evans

Having a better understanding of how complex systems like bone compensate for the natural variation in bone width to establish mechanical function will benefit efforts to identify traits contributing to fracture risk. Using a collection of pQCT images of the tibial diaphysis from 696 young adult women and men, we tested the hypothesis that bone cells cannot surmount the nonlinear relationship between bone width and whole bone stiffness to establish functional equivalence across a healthy population. Intrinsic cellular constraints limited the degree of compensation, leading to functional inequivalence relative to robustness, with slender tibias being as much as two to three times less stiff relative to body size compared with robust tibias. Using Path Analysis, we identified a network of compensatory trait interactions that explained 79% of the variation in whole‐bone bending stiffness. Although slender tibias had significantly less cortical area relative to body size compared with robust tibias, it was the limited range in tissue modulus that was largely responsible for the functional inequivalence. Bone cells coordinately modulated mineralization as well as the cortical porosity associated with internal bone multicellular units (BMU)‐based remodeling to adjust tissue modulus to compensate for robustness. Although anecdotal evidence suggests that functional inequivalence is tolerated under normal loading conditions, our concern is that the functional deficit of slender tibias may contribute to fracture susceptibility under extreme loading conditions, such as intense exercise during military training or falls in the elderly. Thus, we show the natural variation in bone robustness was associated with predictable functional deficits that were attributable to cellular constraints limiting the amount of compensation permissible in human long bone. Whether these cellular constraints can be circumvented prophylactically to better equilibrate function among individuals remains to be determined.
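
Path analysis with observed variables, as used in this study, amounts to a set of chained regressions in which indirect effects are products of path coefficients. The sketch below only illustrates that idea on simulated data with hypothetical trait names (robustness, cortical_area, tissue_modulus, stiffness); it is not the authors' model.

```python
# Illustrative path analysis as chained OLS regressions on simulated data.
# Hypothetical variables: robustness -> {cortical_area, tissue_modulus} -> stiffness.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 696
robustness = rng.normal(size=n)
cortical_area = 0.6 * robustness + rng.normal(scale=0.5, size=n)
tissue_modulus = -0.4 * robustness + rng.normal(scale=0.5, size=n)
stiffness = 0.5 * cortical_area + 0.7 * tissue_modulus + rng.normal(scale=0.3, size=n)
df = pd.DataFrame(dict(robustness=robustness, cortical_area=cortical_area,
                       tissue_modulus=tissue_modulus, stiffness=stiffness))

# Each path is one regression; indirect effects are products of path coefficients.
m_area = smf.ols("cortical_area ~ robustness", df).fit()
m_mod = smf.ols("tissue_modulus ~ robustness", df).fit()
m_stiff = smf.ols("stiffness ~ cortical_area + tissue_modulus", df).fit()

print(f"R^2 for stiffness: {m_stiff.rsquared:.2f}")
print(f"Indirect effect via cortical area: "
      f"{m_area.params['robustness'] * m_stiff.params['cortical_area']:.2f}")
print(f"Indirect effect via tissue modulus: "
      f"{m_mod.params['robustness'] * m_stiff.params['tissue_modulus']:.2f}")
```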


Journal of Sports Sciences | 2008

An investigation of a novel three-dimensional activity monitor to predict free-living energy expenditure

James M. Carter; David M. Wilkinson; Sam D. Blacker; Mark P. Rayson; James Bilzon; Rachel M. Izard; Andy Coward; Antony Wright; Alan M. Nevill; Kirsten L. Rennie; Tracey McCaffrey; Barbara Livingstone

The aim of this study was to assess the capability of the 3dNX™ accelerometer to predict energy expenditure in two separate, free-living cohorts. Twenty-three adolescents and 14 young adults took a single dose of doubly labelled water and wore a 3dNX™ activity monitor during waking hours for a 10-day period while carrying out their normal routines. Multiple linear regression with backward elimination was used to establish the strength of the associations between various indices of energy expenditure, physical activity counts, and anthropometric variables. 3dNX™ output accounted for 27% and 35% of the variance in the total energy expenditure of the adolescent and young adult cohorts, respectively. The explained variance increased to 78%, with a standard error of estimate of 7%, when 3dNX™ output was combined with body composition variables. The 3dNX™ accelerometer can be used to predict free-living daily energy expenditure with a standard error of estimate of 1.65 MJ in adolescents and 1.52 MJ in young adults. The inclusion of anthropometric variables reduces the error to approximately 1 MJ. Although these models remain to be cross-validated in other populations, early indications suggest that the 3dNX™ provides a useful method of predicting energy expenditure in free-living individuals.
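
As a rough sketch of the modelling approach described above (multiple linear regression with backward elimination), the example below repeatedly drops the least significant predictor of total energy expenditure until all remaining terms meet a p-value threshold. The data and variable names are simulated assumptions, not the study's dataset.

```python
# Sketch of multiple linear regression with backward elimination (simulated data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 37  # the study had 23 adolescents + 14 young adults; these data are simulated
df = pd.DataFrame({
    "counts": rng.normal(3000, 800, n),        # hypothetical 3dNX activity counts
    "body_mass": rng.normal(65, 10, n),
    "fat_free_mass": rng.normal(50, 8, n),
    "height": rng.normal(1.72, 0.08, n),
})
df["tee"] = 2 + 0.001 * df["counts"] + 0.15 * df["fat_free_mass"] + rng.normal(0, 1, n)

def backward_eliminate(y, X, alpha=0.05):
    """Drop the least significant predictor until all remaining p-values are < alpha."""
    X = sm.add_constant(X)
    while True:
        model = sm.OLS(y, X).fit()
        pvals = model.pvalues.drop("const")
        worst = pvals.idxmax()
        if pvals[worst] < alpha or len(pvals) == 1:
            return model
        X = X.drop(columns=[worst])

final = backward_eliminate(df["tee"], df[["counts", "body_mass", "fat_free_mass", "height"]])
print(final.summary())
```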


Applied Physiology, Nutrition, and Metabolism | 2011

Effects of a daily mixed nutritional supplement on physical performance, body composition, and circulating anabolic hormones during 8 weeks of arduous military training.

Matthew B. Fortes; Bethany C. Diment; Julie P. Greeves; Anna Casey; Rachel M. Izard; Neil P. Walsh

The aim of this work was to investigate the effect of a daily mixed nutritional supplement upon body composition, physical performance, and circulating anabolic hormones in soldiers undergoing arduous training. Thirty males received either a habitual diet alone (CON, n = 15) or with the addition of a daily mixed supplement (SUP, n = 15) of ∼5.1 MJ·d⁻¹ during 8 weeks of training. Body composition (DEXA), maximal dynamic lift strength (MDLS), and vertical jump (VJ) were assessed, and resting blood samples were collected before and after training. Blood analysis included insulin-like growth factors (IGF-1, IGF BP-1, and IGF BP-3), testosterone, and cortisol. There were no group differences at baseline. Body mass loss (mean ± SD) (CON 5.0 ± 2.3, SUP 1.6 ± 1.5 kg), lean mass loss (CON 2.0 ± 1.5, SUP 0.7 ± 1.5 kg), and fat mass loss (CON 3.0 ± 1.6, SUP 0.9 ± 1.8 kg) were significantly blunted by SUP. CON experienced significant decrements in MDLS (14%), VJ (10%), and explosive leg power (11%) that were prevented by SUP. Military training significantly reduced circulating IGF-1 (28%), testosterone (19%), and the testosterone:cortisol ratio (24%) with no effect of SUP. Circulating IGF BP-1 concentration and cortisol remained unchanged throughout, although SUP abolished the significant decrease in circulating IGF BP-3 (20%) on CON. In conclusion, a daily mixed nutritional supplement attenuated decreases in body mass and lean mass and prevented the decrease in physical performance during an arduous military training program.
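
One way to illustrate the between-group comparison of training-induced changes is an independent-samples (Welch's) t-test on change scores, as sketched below with values simulated from the reported body-mass means and SDs; the test choice is an assumption for illustration, not the authors' stated analysis.

```python
# Illustrative comparison of body-mass loss between CON and SUP (simulated values).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
con_loss = rng.normal(5.0, 2.3, 15)  # CON: reported mean 5.0 kg, SD 2.3, n = 15
sup_loss = rng.normal(1.6, 1.5, 15)  # SUP: reported mean 1.6 kg, SD 1.5, n = 15

t, p = stats.ttest_ind(con_loss, sup_loss, equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.4f}")
```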


British Journal of Nutrition | 2014

Supplement use by UK-based British Army soldiers in training.

Anna Casey; Jason Hughes; Rachel M. Izard; Julie P. Greeves

The use of supplements is widespread at all levels of civilian sport and a prevalence of 60–90% is reported among high-performance UK athletes, including juniors. The prevalence of supplement use among UK-based British Army personnel is not known. The aim of the present study was to establish the point prevalence of supplement use in UK-based British Army soldiers under training (SuTs) and associated staff. A cross-sectional anonymous survey was carried out in 3168 British Army SuTs and soldiers, equating to 3.1% of regular Army strength, based at eleven Phase 1, 2 and 3 UK Army training sites. Overall, 38% of the respondents reported current use of supplements, but prevalence varied according to the course attended by the respondents. The mean number of different supplements used was 4.7 (SD 2.9). Supplements most commonly used were protein bars, powders and drinks (66%), isotonic carbohydrate–electrolyte sports drinks (49%), creatine (38%), recovery sports drinks (35%), multivitamins (31%) and vitamin C (25%). A small proportion of respondents reported the use of amphetamines and similar compounds (1.6%), cocaine (0.8%), anabolic androgenic steroids (1.1%), growth hormone (2.0%), and other anabolic agents, e.g. testosterone (4.2%). Logistic regression modelling indicated that, for current users, younger age, being female, smoking and undergoing Officer Cadet training were associated with greater supplement use. This is the first study to investigate the prevalence of dietary and training supplement use in UK-based British military personnel. British military personnel in training report self-administering a wide range of supplements, at a prevalence at least as great as that reported on deployment, which has implications for Defence policy and educational needs.
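
A minimal sketch of the kind of logistic regression modelling described, with current supplement use as the binary outcome and hypothetical predictors (age, sex, smoking, Officer Cadet training); the data are simulated and the specification is illustrative rather than the authors' model.

```python
# Sketch of logistic regression for current supplement use (simulated data, hypothetical columns).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 3168
df = pd.DataFrame({
    "age": rng.integers(17, 35, n),
    "female": rng.integers(0, 2, n),
    "smoker": rng.integers(0, 2, n),
    "officer_cadet": rng.integers(0, 2, n),
})
logit = (-0.5 - 0.05 * (df["age"] - 22) + 0.4 * df["female"]
         + 0.3 * df["smoker"] + 0.5 * df["officer_cadet"])
df["uses_supplements"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.logit("uses_supplements ~ age + female + smoker + officer_cadet", df).fit()
print(np.exp(model.params))  # coefficients exponentiated to odds ratios
```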


Bone | 2016

Increased density and periosteal expansion of the tibia in young adult men following short-term arduous training

Rachel M. Izard; William D. Fraser; Charles Negus; Craig Sale; Julie P. Greeves

PURPOSE Few human studies have reported early structural adaptations of bone to weight-bearing exercise, which provide a greater contribution to improved bone strength than increased density. This prospective study examined site- and regional-specific adaptations of the tibia during arduous training in a cohort of male military (infantry) recruits to better understand how bone responds in vivo to mechanical loading. METHODS Tibial bone density and geometry were measured in 90 British Army male recruits (age 21 ± 3 years, height 1.78 ± 0.06 m, body mass 73.9 ± 9.8 kg) in weeks 1 (baseline) and 10 of initial military training. Scans were performed at the 4%, 14%, 38% and 66% sites, measured from the distal end plate, using pQCT (XCT2000L, Stratec, Pforzheim, Germany). Customised software (BAMPack, L-3 ATI) was used to examine the whole bone cross-section and regional sectors. T-tests determined significant differences between time points (P<0.05). RESULTS Bone density of the trabecular and cortical compartments increased significantly at all measured sites. Bone geometry (cortical area and thickness) and bone strength (i, MMi and BSI) at the diaphyseal sites (38% and 66%) were also significantly higher in week 10. Regional changes in density and geometry were largely observed in the anterior, medial-anterior and anterior-posterior sectors. Calf muscle density and area (66% site) increased significantly at week 10 (P<0.01). CONCLUSIONS In vivo mechanical loading improves bone strength of the human tibia by increased density and periosteal expansion, which varies by site and region of the bone. These changes may occur in response to the nature and distribution of forces originating from bending, torsional and shear stresses of military training. These improvements are observed early in training when the osteogenic stimulus is sufficient, which may be close to the fracture threshold in some individuals.
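
The week 1 versus week 10 comparison described above (paired t-tests on pQCT outcomes) could look roughly like the sketch below; the cortical-area values are invented for illustration.

```python
# Sketch of a paired comparison of cortical area between week 1 and week 10 (invented values).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 90
week1 = rng.normal(300, 30, n)         # cortical area at the 38% site, mm^2 (simulated)
week10 = week1 + rng.normal(5, 8, n)   # small mean increase after training (simulated)

t, p = stats.ttest_rel(week10, week1)
print(f"mean change = {np.mean(week10 - week1):.1f} mm^2, t = {t:.2f}, p = {p:.4f}")
```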


BMJ open sport and exercise medicine | 2016

Low fitness, low body mass and prior injury predict injury risk during military recruit training: a prospective cohort study in the British Army

Mark Robinson; Andrew Siddall; James Bilzon; Dylan Thompson; Julie P. Greeves; Rachel M. Izard; Keith Stokes

Background Injuries sustained by military recruits during initial training impede training progression and military readiness while increasing financial costs. This study investigated training-related injuries and injury risk factors among British Army infantry recruits. Methods Recruits starting infantry training at the British Army Infantry Training Centre between September 2008 and March 2010 were eligible to take part. Information regarding lifestyle behaviours and injury history was collected using the Military Pre-training Questionnaire. Sociodemographic, anthropometric, physical fitness and injury (lower limb and lower back) data were obtained from Army databases. Univariable and multivariable Cox regression models were used to explore the association between time to first training injury and potential risk factors. Results 58% (95% CI 55% to 60%) of 1810 recruits sustained at least 1 injury during training. Overuse injuries were more common than traumatic injuries (65% and 35%, respectively). The lower leg accounted for 81% of all injuries, and non-specific soft tissue damage was the leading diagnosis (55% of all injuries). Injuries resulted in 122 (118 to 126) training days lost per 1000 person-days. Slower 2.4 km run time, low body mass, past injury and shin pain were independently associated with higher risk of any injury. Conclusions There was a high incidence of overuse injuries in British Army recruits undertaking infantry training. Recruits with lower pretraining fitness levels, low body mass and past injuries were at higher risk. Faster 2.4 km run time performance and minimal body mass standards should be considered for physical entry criteria.
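
As a sketch of the time-to-first-injury modelling described, the example below fits a Cox proportional hazards model with the lifelines package to simulated data; the covariate names, censoring point and effect sizes are assumptions, not the study's dataset or model.

```python
# Sketch of a Cox proportional hazards model for time to first injury (simulated data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 1810
df = pd.DataFrame({
    "run_time_s": rng.normal(600, 40, n),     # 2.4 km run time
    "body_mass_kg": rng.normal(72, 9, n),
    "prior_injury": rng.integers(0, 2, n),
})
# Simulated hazard: slower runners, lighter recruits and prior injury raise injury risk.
hazard = 0.03 * np.exp(0.005 * (df["run_time_s"] - 600)
                       - 0.02 * (df["body_mass_kg"] - 72)
                       + 0.5 * df["prior_injury"])
df["weeks_to_injury"] = np.minimum(rng.exponential(1 / hazard), 26)  # censor at 26 weeks
df["injured"] = (df["weeks_to_injury"] < 26).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks_to_injury", event_col="injured")
cph.print_summary()
```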


BMJ open sport and exercise medicine | 2015

Force and acceleration characteristics of military foot drill: implications for injury risk in recruits

Patrick Carden; Rachel M. Izard; Julie P. Greeves; Jason P. Lake; Stephen D. Myers

Background Foot drill involving marching and drill manoeuvres is conducted regularly during basic military recruit training. Characterising the biomechanical loading of foot drill will improve our understanding of the contributory factors to lower limb overuse injuries in recruits. Aim To quantify and compare the forces, loading rates and accelerations of British Army foot drill, within and between trained and untrained personnel. Methods 24 trained soldiers (12 men and 12 women; TRAINED) and 12 civilian men (UNTRAINED) performed marching and five drill manoeuvres on force platforms; motion capture recorded tibial position. Peak vertical impact force (PF) and peak vertical loading rate (PLR), expressed as multiples of body weight (BW), and peak tibial impact acceleration (PTA) were recorded. Results Drill manoeuvre PF, PLR and PTA were similar, but higher in TRAINED men (PF, PLR: p<0.01; PTA: p<0.05). Peak values in TRAINED men were shown for the halt (mean (SD); PF: 6.5 (1.5) BW; PLR: 983 (333) BW/s; PTA: 207 (57) m/s²) and left turn (PF: 6.6 (1.7) BW; PLR: 928 (300) BW/s; PTA: 184 (62) m/s²). Marching PF, PLR and PTA were similar between groups and lower than for all drill manoeuvres (PF: 1.1–1.3 BW; PLR: 42–70 BW/s; p<0.01; PTA: 23–38 m/s²; p<0.05). Conclusions Army foot drill generates higher forces, loading rates and accelerations than activities such as running and load carriage, while marching is comparable to moderate running (10.8 km/h). The large biomechanical loading of foot drill may contribute to the high rate of overuse injuries during initial military training, and strategies to regulate or reduce this loading should be explored.
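
The loading metrics reported above (peak vertical force and peak loading rate, expressed in body weights) can be derived from a vertical ground reaction force trace roughly as follows; the synthetic pulse, sampling rate and body mass below are illustrative assumptions, not force-platform data from the study.

```python
# Sketch: peak vertical force (BW) and peak loading rate (BW/s) from a synthetic GRF pulse.
import numpy as np

fs = 1000                      # sampling rate in Hz (assumed)
body_mass = 75.0               # kg (assumed)
bw = body_mass * 9.81          # body weight in newtons

t = np.arange(0, 0.3, 1 / fs)
grf = bw * 6.0 * np.exp(-((t - 0.05) / 0.02) ** 2)   # synthetic impact-like pulse

peak_force_bw = grf.max() / bw
peak_loading_rate_bw_s = np.gradient(grf, 1 / fs).max() / bw   # peak of dF/dt in BW/s
print(f"PF = {peak_force_bw:.1f} BW, PLR = {peak_loading_rate_bw_s:.0f} BW/s")
```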


Military Medicine | 2018

Increased Risk of Upper Respiratory Infection in Military Recruits Who Report Sleeping Less Than 6 h per night

Laurel M. Wentz; Mark D. Ward; Claire Potter; Samuel J. Oliver; Sarah Jackson; Rachel M. Izard; Julie P. Greeves; Neil P. Walsh

Introduction Professional sleep associations recommend 7-9 h of sleep per night for young adults. Habitually sleeping less than 6 h per night has been shown to increase susceptibility to common cold in otherwise healthy, adult civilians. However, no investigations have examined the importance of sleep duration on upper respiratory tract infection (URTI) and loss of training days in military recruits. The purpose of this study was to describe self-reported sleep duration in a large cohort of military recruits and to assess the relationship between reported sleep duration and incidence of URTIs. We hypothesized that recruits who reported sleeping less than the recommended 7-9 h per night during training suffered a greater incidence of URTI and, as a consequence, lost more training days compared with recruits who met sleep recommendations. Materials and Methods Participants included 651 British Army recruits aged 22 ± 3 yr who completed 13 wk of basic military training (67% males, 33% females). Participants were members of 21 platoons (11 male, 10 female) who commenced training across four seasons (19% winter, 20% spring, 29% summer, and 32% autumn). At the start and completion of training, participants completed a questionnaire asking the typical time they went to sleep and awoke. Incidence of physician-diagnosed URTI and lost training days due to URTI were retrieved from medical records. Results Self-reported sleep duration decreased from before to during training (8.5 ± 1.6 vs. 7.0 ± 0.8 h; p < 0.01). Prior to training, 13% of participants reported sleeping less than the recommended 7 h sleep per night; however, this increased to 38% during training (χ² = 3.8; p = 0.05). Overall, 49 participants (8%) were diagnosed by a physician with at least one URTI and 3 participants (<1%) were diagnosed with two URTIs. After controlling for sex, body mass index, season of recruitment, smoking, and alcohol, participants who reported sleeping less than 6 h per night during training were four times more likely to be diagnosed with URTI compared with participants who slept 7-9 h per night in a logistic regression model (OR 4.4; 95% CI, 1.5-12.9, p < 0.01). On average, each URTI resulted in 2.9 ± 1.5 lost training days. Participants who were diagnosed with URTI had more overall lost training days for any illness compared with participants who did not report a URTI during basic military training (3.3 ± 1.9 vs. 0.4 ± 1.3; p < 0.01). Conclusion In a large population of British Army recruits, these findings show that more than one third of participants failed to meet sleep duration recommendations during training. Furthermore, those who reported sleeping less than 6 h per night were four times more likely to be diagnosed with an URTI and lost more training days due to URTI. Since sleep restriction is considered a necessary element of military training, future studies should examine interventions to reduce any negative effects on immunity and host defense.
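
A minimal illustration of the adjusted logistic regression described (URTI diagnosis regressed on short sleep plus covariates), using simulated data and hypothetical column names; odds ratios and confidence intervals are obtained by exponentiating the coefficients.

```python
# Sketch of adjusted logistic regression for URTI risk by sleep category (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 651
df = pd.DataFrame({
    "short_sleep": rng.integers(0, 2, n),   # <6 h per night during training
    "female": rng.integers(0, 2, n),
    "bmi": rng.normal(24, 3, n),
    "smoker": rng.integers(0, 2, n),
})
logit = -3.0 + 1.5 * df["short_sleep"] + 0.1 * df["smoker"]
df["urti"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.logit("urti ~ short_sleep + female + bmi + smoker", df).fit()
odds_ratios = np.exp(model.params)
ci = np.exp(model.conf_int())          # 95% confidence intervals on the odds-ratio scale
print(pd.concat([odds_ratios, ci], axis=1))
```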


Military Medicine | 2018

Estimates of tibial shock magnitude in men and women at the start and end of a military drill training programme

Hannah Rice; S Saunders; S McGuire; O Thomas; Rachel M. Izard

Introduction Foot drill is a key component of military training and is characterized by frequent heel stamping, likely resulting in high tibial shock magnitudes. Higher tibial shock during running has previously been associated with risk of lower limb stress fractures, which are prevalent among military populations. Quantification of tibial shock during drill training is, therefore, warranted. This study aimed to provide estimates of tibial shock during military drill in British Army Basic training. The study also aimed to compare values between men and women, and to identify any differences between the first and final sessions of training. Materials and Methods Tibial accelerometers were secured on the right medial, distal shank of 10 British Army recruits (n = 5 men; n = 5 women) throughout a scheduled drill training session in week 1 and week 12 of basic military training. Peak positive accelerations, the average magnitude above given thresholds, and the rate at which each threshold was exceeded were quantified. Results Mean (SD) peak positive acceleration was 20.8 (2.2) g across all sessions, which is considerably higher than values typically observed during high impact physical activity. Magnitudes of tibial shock were higher in men than women, and higher in week 12 compared with week 1 of training. Conclusions This study provides the first estimates of tibial shock magnitude during military drill training in the field. The high values suggest that military drill is a demanding activity and this should be considered when developing and evaluating military training programs. Further exploration is required to understand the response of the lower limb to military drill training and the etiology of these responses in the development of lower limb stress fractures.
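
The accelerometer metrics described (peak positive acceleration, mean magnitude above a threshold, and the rate at which a threshold is exceeded) could be computed from a shank acceleration trace roughly as below; the synthetic signal, sampling rate and 10 g threshold are illustrative assumptions only.

```python
# Sketch: tibial shock metrics from a synthetic shank acceleration trace (values illustrative).
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(7)
fs = 1000                                  # sampling rate in Hz (assumed)
duration_s = 60.0
t = np.arange(0, duration_s, 1 / fs)

# Baseline noise around 1 g plus impact-like spikes roughly once per second.
accel_g = rng.normal(1.0, 0.2, t.size)
impact_idx = np.arange(int(0.5 * fs), t.size, fs)
accel_g[impact_idx] += rng.normal(20, 3, impact_idx.size)

peaks, _ = find_peaks(accel_g, height=5.0, distance=int(0.3 * fs))
peak_accels = accel_g[peaks]

threshold = 10.0                           # g, illustrative threshold
above = peak_accels[peak_accels > threshold]
print(f"mean peak positive acceleration: {peak_accels.mean():.1f} g")
print(f"mean magnitude above {threshold:.0f} g: {above.mean():.1f} g")
print(f"rate above {threshold:.0f} g: {above.size / (duration_s / 60):.1f} per minute")
```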


Occupational Medicine | 2017

Smoking status and physical fitness during initial military training

Andrew Siddall; James Bilzon; Dylan Thompson; Julie P. Greeves; Rachel M. Izard; Keith Stokes

Background: Habitual smoking is prevalent in military populations, but whether smoking status influences physical fitness development during training is not clear. Aims: We investigated the effect of smoking status on physical fitness parameters during initial British Army Infantry training. Methods: Routine measures of physical fitness (2.4 km run time and maximum number of press ups and sit ups in two minutes) were obtained in 1,182 male recruits (mean ± SD: age 20 ± 3 y, body mass 70.6 ± 9.8 kg, height 1.77 ± 0.07 m; 58% smokers) at weeks 1, 14 and 24 of initial military training. A linear mixed model was used to identify differences in performance between smokers and non-smokers over time. Results: Non-smokers performed significantly better than smokers in all performance tests (P<0.01), but rates of improvement during training were similar (P>0.05). Run performance improved by 7% in non-smokers (estimated marginal means with 95% confidence limits; 612 (608-616) s to 567 (562-572) s) and 8% in smokers (622 (619-625) s to 571 (568-575) s). Press up performance improved by 18% in non-smokers (48.3 (47.1-49.4) to 57.0 (55.6-58.3)) and 23% in smokers (44.1 (43.2-45.1) to 54.5 (53.3-55.6)), and sit up performance by 15% in non-smokers (57.3 (56.3-58.2) to 66.0 (64.9-67.2)) and 18% in smokers (53.8 (53.0-54.6) to 63.3 (62.3-64.3)). Conclusions: Smokers exhibited lower muscular and cardiorespiratory endurance performance than non-smokers. Unexpectedly, however, no significant differences in improvement in performance indices were demonstrated between smokers and non-smokers during military training.
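
The linear mixed model described (performance at weeks 1, 14 and 24 by smoking status, with recruit as a random effect) can be sketched with statsmodels as below; the column names, simulated data and simple random-intercept structure are assumptions rather than the authors' exact specification.

```python
# Sketch of a linear mixed model for run time by smoking status over training (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
n = 1182
recruits = pd.DataFrame({
    "recruit_id": np.arange(n),
    "smoker": rng.integers(0, 2, n),
    "baseline": rng.normal(615, 30, n),
})
rows = []
for week, improvement in [(1, 0.0), (14, 30.0), (24, 48.0)]:
    rows.append(pd.DataFrame({
        "recruit_id": recruits["recruit_id"],
        "smoker": recruits["smoker"],
        "week": week,
        # Smokers ~10 s slower throughout; both groups improve at the same rate.
        "run_time_s": recruits["baseline"] + 10 * recruits["smoker"]
                      - improvement + rng.normal(0, 10, n),
    }))
data = pd.concat(rows, ignore_index=True)

# Random intercept per recruit; the smoker:week interaction tests differing rates of improvement.
model = smf.mixedlm("run_time_s ~ smoker * week", data, groups=data["recruit_id"]).fit()
print(model.summary())
```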

Collaboration


Dive into Rachel M. Izard's collaborations.

Top Co-Authors

Sam D. Blacker (University of Chichester)
Thomas J. O’Leary (United Kingdom Ministry of Defence)
Mark P. Rayson (University of Birmingham)
Samantha Saunders (United Kingdom Ministry of Defence)
Stephen McGuire (United Kingdom Ministry of Defence)
Antony Wright (MRC Human Nutrition Research)