Network


Latest external collaboration at the country level.

Hotspot


Research topics where Jeffrey T. Howard is active.

Publication


Featured research published by Jeffrey T. Howard.


JAMA Surgery | 2016

The Effect of a Golden Hour Policy on the Morbidity and Mortality of Combat Casualties

Russ S. Kotwal; Jeffrey T. Howard; Jean A. Orman; Bruce W. Tarpey; Jeffrey A. Bailey; Howard R. Champion; Robert L. Mabry; John B. Holcomb; Kirby R. Gross

IMPORTANCE The term golden hour was coined to encourage urgency of trauma care. In 2009, Secretary of Defense Robert M. Gates mandated prehospital helicopter transport of critically injured combat casualties in 60 minutes or less. OBJECTIVES To compare morbidity and mortality outcomes for casualties before vs after the mandate and for those who underwent prehospital helicopter transport in 60 minutes or less vs more than 60 minutes. DESIGN, SETTING, AND PARTICIPANTS A retrospective descriptive analysis of battlefield data examined 21,089 US military casualties that occurred during the Afghanistan conflict from September 11, 2001, to March 31, 2014. Analysis was conducted from September 1, 2014, to January 21, 2015. MAIN OUTCOMES AND MEASURES Data for all casualties were analyzed according to whether they occurred before or after the mandate. Detailed data for those who underwent prehospital helicopter transport were analyzed according to whether they occurred before or after the mandate and whether they occurred in 60 minutes or less vs more than 60 minutes. Casualties with minor wounds were excluded. Mortality and morbidity outcomes and treatment capability-related variables were compared. RESULTS For the total casualty population, the percentage killed in action (16.0% [386 of 2411] vs 9.9% [964 of 9755]; P < .001) and the case fatality rate ([CFR] 13.7 [469 of 3429] vs 7.6 [1344 of 17,660]; P < .001) were higher before vs after the mandate, while the percentage died of wounds (4.1% [83 of 2025] vs 4.3% [380 of 8791]; P = .71) remained unchanged. Decline in CFR after the mandate was associated with an increasing percentage of casualties transported in 60 minutes or less (regression coefficient, -0.141; P < .001), with projected vs actual CFR equating to 359 lives saved. Among 4542 casualties (mean injury severity score, 17.3; mortality, 10.1% [457 of 4542]) with detailed data, there was a decrease in median transport time after the mandate (90 min vs 43 min; P < .001) and an increase in missions achieving prehospital helicopter transport in 60 minutes or less (24.8% [181 of 731] vs 75.2% [2867 of 3811]; P < .001). When adjusted for injury severity score and time period, the percentage killed in action was lower for those critically injured who received a blood transfusion (6.8% [40 of 589] vs 51.0% [249 of 488]; P < .001) and were transported in 60 minutes or less (25.7% [205 of 799] vs 30.2% [84 of 278]; P < .01), while the percentage died of wounds was lower among those critically injured initially treated by combat support hospitals (9.1% [48 of 530] vs 15.7% [86 of 547]; P < .01). Acute morbidity was higher among those critically injured who were transported in 60 minutes or less (36.9% [295 of 799] vs 27.3% [76 of 278]; P < .01), those severely and critically injured initially treated at combat support hospitals (severely injured, 51.1% [161 of 315] vs 33.1% [104 of 314]; P < .001; and critically injured, 39.8% [211 of 530] vs 29.3% [160 of 547]; P < .001), and casualties who received a blood transfusion (50.2% [618 of 1231] vs 3.7% [121 of 3311]; P < .001), emphasizing the need for timely advanced treatment. CONCLUSIONS AND RELEVANCE A mandate made in 2009 by Secretary of Defense Gates reduced the time between combat injury and receiving definitive care. Prehospital transport time and treatment capability are important factors for casualty survival on the battlefield.
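
The headline rates in this abstract are plain proportions of the reported counts. As a minimal sketch (not the authors' analysis code), the before-versus-after percentages killed in action and the case fatality rates can be recomputed from those counts:

```python
# Minimal sketch recomputing the headline before-vs-after-mandate rates from
# the raw counts quoted in the abstract; not the study's code.

def rate(numerator, denominator):
    """Return a rate per 100."""
    return 100.0 * numerator / denominator

# Killed in action (KIA): prehospital combat deaths.
kia_before = rate(386, 2411)    # ~16.0%
kia_after  = rate(964, 9755)    # ~9.9%

# Case fatality rate (CFR): all deaths per 100 casualties.
cfr_before = rate(469, 3429)    # ~13.7
cfr_after  = rate(1344, 17660)  # ~7.6

print(f"KIA: {kia_before:.1f}% before vs {kia_after:.1f}% after the mandate")
print(f"CFR: {cfr_before:.1f} before vs {cfr_after:.1f} after the mandate")
```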


Shock | 2015

Individual-Specific, Beat-to-beat Trending of Significant Human Blood Loss: The Compensatory Reserve.

Victor A. Convertino; Jeffrey T. Howard; Carmen Hinojosa-Laborde; Sylvain Cardin; Paul B. Batchelder; Jane Mulligan; Gregory Z. Grudic; Steven L. Moulton; David B. MacLeod

ABSTRACT Current monitoring technologies are unable to detect early, compensatory changes that are associated with significant blood loss. We previously introduced a novel algorithm to calculate the Compensatory Reserve Index (CRI) based on the analysis of arterial waveform features obtained from photoplethysmogram recordings. In the present study, we hypothesized that the CRI would provide greater sensitivity and specificity to detect blood loss compared with traditional vital signs and other hemodynamic measures. Continuous noninvasive vital sign waveform data, including CRI, photoplethysmogram, heart rate, blood pressures, SpO2, cardiac output, and stroke volume, were analyzed from 20 subjects before, during, and after an average controlled voluntary hemorrhage of ∼1.2 L of blood. Compensatory Reserve Index decreased by 33% in a linear fashion across progressive blood volume loss, with no clinically significant alterations in vital signs. The receiver operating characteristic area under the curve for the CRI was 0.90, with a sensitivity of 0.80 and specificity of 0.76. In comparison, blood pressures, heart rate, SpO2, cardiac output, and stroke volume had significantly lower receiver operating characteristic area under the curve values and specificities for detecting the same volume of blood loss. Consistent with our hypothesis, CRI detected blood loss and restoration with significantly greater specificity than did other traditional physiologic measures. Single measurement of CRI may enable more accurate triage, whereas CRI monitoring may allow for earlier detection of casualty deterioration.
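
For readers unfamiliar with the metrics quoted here, the sketch below shows how a receiver operating characteristic (ROC) area under the curve, with sensitivity and specificity at a chosen cut point, is typically computed for a continuous index such as the CRI. The data are synthetic and the threshold rule (Youden's J) is an assumption for illustration, not the study's stated method.

```python
# Illustrative sketch with synthetic data, not the study dataset.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)

# 1 = significant blood loss present, 0 = baseline (synthetic labels)
y_true = np.concatenate([np.ones(200), np.zeros(200)])

# A lower CRI should indicate blood loss, so score the decrease in CRI.
cri = np.concatenate([rng.normal(0.45, 0.15, 200),   # bled subjects: lower CRI
                      rng.normal(0.85, 0.10, 200)])  # controls: near 1.0
score = 1.0 - cri  # higher score = more likely bleeding

auc = roc_auc_score(y_true, score)

# Pick the threshold that maximizes Youden's J = sensitivity + specificity - 1.
fpr, tpr, thresholds = roc_curve(y_true, score)
best = np.argmax(tpr - fpr)
print(f"AUC={auc:.2f}, sensitivity={tpr[best]:.2f}, specificity={1 - fpr[best]:.2f}")
```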


JAMA | 2017

Association of Prehospital Blood Product Transfusion During Medical Evacuation of Combat Casualties in Afghanistan With Acute and 30-Day Survival

Stacy Shackelford; Deborah J. del Junco; Nicole Powell-Dunford; Edward L. Mazuchowski; Jeffrey T. Howard; Russ S. Kotwal; Jennifer Gurney; Frank K. Butler; Kirby R. Gross; Zsolt T. Stockinger

Importance Prehospital blood product transfusion in trauma care remains controversial due to poor-quality evidence and cost. Sequential expansion of blood transfusion capability after 2012 to deployed military medical evacuation (MEDEVAC) units enabled a concurrent cohort study to focus on the timing as well as the location of the initial transfusion. Objective To examine the association of prehospital transfusion and time to initial transfusion with injury survival. Design, Setting, and Participants Retrospective cohort study of US military combat casualties in Afghanistan between April 1, 2012, and August 7, 2015. Eligible patients were rescued alive by MEDEVAC from point of injury with either (1) a traumatic limb amputation at or above the knee or elbow or (2) shock defined as a systolic blood pressure of less than 90 mm Hg or a heart rate greater than 120 beats per minute. Exposures Initiation of prehospital transfusion and time from MEDEVAC rescue to first transfusion, regardless of location (ie, prior to or during hospitalization). Transfusion recipients were compared with nonrecipients (unexposed) for whom transfusion was delayed or not given. Main Outcomes and Measures Mortality at 24 hours and 30 days after MEDEVAC rescue were coprimary outcomes. To balance injury severity, nonrecipients of prehospital transfusion were frequency matched to recipients by mechanism of injury, prehospital shock, severity of limb amputation, head injury, and torso hemorrhage. Cox regression was stratified by matched groups and also adjusted for age, injury year, transport team, tourniquet use, and time to MEDEVAC rescue. Results Of 502 patients (median age, 25 years [interquartile range, 22 to 29 years]; 98% male), 3 of 55 prehospital transfusion recipients (5%) and 85 of 447 nonrecipients (19%) died within 24 hours of MEDEVAC rescue (between-group difference, −14% [95% CI, −21% to −6%]; P = .01). By day 30, 6 recipients (11%) and 102 nonrecipients (23%) died (between-group difference, −12% [95% CI, −21% to −2%]; P = .04). For the 386 patients without missing covariate data among the 400 patients within the matched groups, the adjusted hazard ratio for mortality associated with prehospital transfusion was 0.26 (95% CI, 0.08 to 0.84, P = .02) over 24 hours (3 deaths among 54 recipients vs 67 deaths among 332 matched nonrecipients) and 0.39 (95% CI, 0.16 to 0.92, P = .03) over 30 days (6 vs 76 deaths, respectively). Time to initial transfusion, regardless of location (prehospital or during hospitalization), was associated with reduced 24-hour mortality only up to 15 minutes after MEDEVAC rescue (median, 36 minutes after injury; adjusted hazard ratio, 0.17 [95% CI, 0.04 to 0.73], P = .02; there were 2 deaths among 62 recipients vs 68 deaths among 324 delayed transfusion recipients or nonrecipients). Conclusions and Relevance Among medically evacuated US military combat casualties in Afghanistan, blood product transfusion prehospital or within minutes of injury was associated with greater 24-hour and 30-day survival than delayed transfusion or no transfusion. The findings support prehospital transfusion in this setting.
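
As a hedged illustration (not the study code), the crude mortality proportions and unadjusted between-group differences quoted above follow directly from the reported counts, and the adjusted hazard ratios are often read as percent reductions in hazard:

```python
# Illustrative arithmetic only, recomputed from counts quoted in the abstract.
def pct(deaths, n):
    return 100.0 * deaths / n

# 24-hour and 30-day mortality: prehospital transfusion recipients vs nonrecipients.
d24 = pct(3, 55) - pct(85, 447)     # ~ -14 percentage points
d30 = pct(6, 55) - pct(102, 447)    # ~ -12 percentage points
print(f"24-hour difference: {d24:.0f} points; 30-day difference: {d30:.0f} points")

# An adjusted hazard ratio (HR) below 1 corresponds to a percent reduction in hazard.
for hr in (0.26, 0.39):
    print(f"HR {hr:.2f} -> about {100 * (1 - hr):.0f}% lower mortality hazard")
```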


Shock | 2015

Predictors of the Onset of Hemodynamic Decompensation During Progressive Central Hypovolemia: Comparison of the Peripheral Perfusion Index, Pulse Pressure Variability, and Compensatory Reserve Index.

Jud C. Janak; Jeffrey T. Howard; Goei KA; Weber R; Muniz GW; Carmen Hinojosa-Laborde; Victor A. Convertino

Introduction: As technological advances allow for the development of more sophisticated measurement of the mechanisms that contribute to compensation for loss of circulating blood volume such as hemorrhage, it is important to compare the discriminative ability of these new measures to standard vital signs and other new physiologic metrics of interest. The purpose of this study was to compare the discriminative ability of the following three measures to predict the onset of hemodynamic decompensation: peripheral perfusion index (PPI), pulse pressure variability (PPV), and the compensatory reserve index (CRI). Materials and Methods: There were 51 healthy participants who underwent a progressive simulated hemorrhage to induce central hypovolemia by lower body negative pressure (LBNP). The least-squares means and 95% confidence intervals for each measure were reported by LBNP level and stratified by tolerance status (high tolerance vs. low tolerance). Generalized estimating equations were used to perform repeated measures logistic regression analysis by regressing the onset of hemodynamic decompensation on each of the vital signs of interest. These probabilities were used to calculate sensitivity, specificity, and receiver-operating characteristic area under the curve (ROC AUC) for PPI, PPV, and CRI. Results: Compared with both PPV (ROC AUC = 0.79) and PPI (0.56), the CRI (0.90) had superior discriminative ability (P ≤ 0.0001) to predict the onset of hemodynamic decompensation. This included higher sensitivity (0.86 vs. 0.78 and 0.71) and specificity (0.78 vs. 0.69 and 0.29) for the CRI compared with PPV and PPI, respectively. Further, CRI was the only measure with mean predicted probabilities of the onset of hemodynamic decompensation that progressively increased as the level of simulated hemorrhage increased. Discussion: There are two potential rationales for why the CRI had superior discriminative ability to predict hemodynamic decompensation. First, the CRI more accurately predicted the onset of hemodynamic decompensation at all levels of simulated hemorrhage, but especially at lower levels of hemorrhage. Second, the CRI was better able to differentiate high versus low tolerant participants. Conclusion: Consistent with previous research, the CRI had superior discriminative ability to predict the onset of hemodynamic decompensation. For those patients at greatest risk for developing impending circulatory shock, identifying the most sensitive and specific measures of the onset of hemodynamic decompensation is critical for both the early recognition and implementation of life-saving interventions.
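
The modelling approach described above (repeated-measures logistic regression via generalized estimating equations, followed by ROC analysis of the predicted probabilities) can be sketched as follows. The data are synthetic, the single-predictor formula is a simplification of the study's per-measure models, and the exchangeable working correlation is an assumption rather than something stated in the abstract.

```python
# Simplified sketch with synthetic data; not the study's dataset or code.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n_subjects, n_levels = 51, 6

subject = np.repeat(np.arange(n_subjects), n_levels)
level = np.tile(np.arange(n_levels), n_subjects)            # LBNP stage (0 = baseline)
cri = np.clip(1.0 - 0.15 * level + rng.normal(0, 0.10, subject.size), 0, 1)
# Probability of decompensation onset rises as CRI falls (synthetic mechanism).
p = 1.0 / (1.0 + np.exp(-(2.5 - 6.0 * cri)))
decomp = rng.binomial(1, p)

df = pd.DataFrame({"subject": subject, "cri": cri, "decomp": decomp})

# GEE logistic regression with repeated measures clustered within subject.
model = sm.GEE.from_formula("decomp ~ cri", groups="subject", data=df,
                            family=sm.families.Binomial(),
                            cov_struct=sm.cov_struct.Exchangeable())
fit = model.fit()
prob = fit.predict(df)

print(fit.summary())
print("ROC AUC for the CRI-based model:", round(roc_auc_score(df["decomp"], prob), 2))
```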


Circulation | 2015

Retrospective Analysis of Long-Term Outcomes After Combat Injury: A Hidden Cost of War.

Ian J. Stewart; Jonathan A. Sosnov; Jeffrey T. Howard; Jean A. Orman; Raymond Fang; Benjamin D. Morrow; David Zonies; Mary Bollinger; Caroline Tuman; Brett A. Freedman; Kevin K. Chung

Background— During the conflicts in Iraq and Afghanistan, 52,087 service members have been wounded in combat. The long-term sequelae of these injuries have not been carefully examined. We sought to determine the relation between markers of injury severity and the subsequent development of hypertension, coronary artery disease, diabetes mellitus, and chronic kidney disease. Methods and Results— Retrospective cohort study of critically injured US military personnel wounded in Iraq or Afghanistan from February 1, 2002 to February 1, 2011. Patients were then followed until January 18, 2013. Chronic disease outcomes were assessed by International Classification of Diseases, 9th edition codes and causes of death were confirmed by autopsy. From 6011 admissions, records were excluded because of missing data or if they were for an individual’s second admission. Patients with a disease diagnosis of interest before the injury date were also excluded, yielding a cohort of 3846 subjects for analysis. After adjustment for other factors, each 5-point increment in the injury severity score was associated with a 6%, 13%, 13%, and 15% increase in incidence rates of hypertension, coronary artery disease, diabetes mellitus, and chronic kidney disease, respectively. Acute kidney injury was associated with a 66% increase in rates of hypertension and nearly 5-fold increase in rates of chronic kidney disease. Conclusions— In Iraq and Afghanistan veterans, the severity of combat injury was associated with the subsequent development of hypertension, coronary artery disease, diabetes mellitus, and chronic kidney disease.
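
The "6%, 13%, 13%, and 15% per 5-point increment" figures are rate ratios; under the usual assumption that the association is log-linear in injury severity score (ISS), they compound multiplicatively over larger ISS differences. A small hedged illustration (rate ratios taken from the abstract; the 20-point comparison is hypothetical):

```python
# Hedged illustration, not from the paper: compounding per-5-point rate ratios
# over a larger ISS difference, assuming a log-linear association.
per_5_point_irr = {
    "hypertension": 1.06,
    "coronary artery disease": 1.13,
    "diabetes mellitus": 1.13,
    "chronic kidney disease": 1.15,
}

delta_iss = 20  # hypothetical comparison, e.g. ISS 25 vs ISS 5
for outcome, irr in per_5_point_irr.items():
    print(f"{outcome}: rate ratio ~{irr ** (delta_iss / 5):.2f} "
          f"for a {delta_iss}-point ISS difference")
```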


Experimental Biology and Medicine | 2017

The physiology of blood loss and shock: New insights from a human laboratory model of hemorrhage:

Alicia M. Schiller; Jeffrey T. Howard; Victor A. Convertino

The ability to quickly diagnose hemorrhagic shock is critical for favorable patient outcomes. Therefore, it is important to understand the time course and involvement of the various physiological mechanisms that are active during volume loss and that have the ability to stave off hemodynamic collapse. This review provides new insights about the physiology that underlies blood loss and shock in humans through the development of a simulated model of hemorrhage using lower body negative pressure. In this review, we present controlled experimental results through utilization of the lower body negative pressure human hemorrhage model that provide novel insights on the integration of physiological mechanisms critical to the compensation for volume loss. We provide data obtained from more than 250 human experiments to classify human subjects into two distinct groups: those who have a high tolerance and can compensate well for reduced central blood volume (e.g. hemorrhage) and those with low tolerance with poor capacity to compensate. We include the conceptual introduction of arterial pressure and cerebral blood flow oscillations, reflex-mediated autonomic and neuroendocrine responses, and respiration that function to protect adequate tissue oxygenation through adjustments in cardiac output and peripheral vascular resistance. Finally, unique time course data are presented that describe mechanistic events associated with the rapid onset of hemodynamic failure (i.e. decompensatory shock). Impact Statement Hemorrhage is the leading cause of death in both civilian and military trauma. The work submitted in this review is important because it advances the understanding of mechanisms that contribute to the total integrated physiological compensations for inadequate tissue oxygenation (i.e. shock) that arise from hemorrhage. Unlike an animal model, we introduce the utilization of lower body negative pressure as a noninvasive model that allows for the study of progressive reductions in central blood volume similar to those reported during actual hemorrhage in conscious humans to the onset of hemodynamic decompensation (i.e. early phase of decompensatory shock), and is repeatable in the same subject. Understanding the fundamental underlying physiology of human hemorrhage helps to test paradigms of critical care medicine, and identify and develop novel clinical practices and technologies for advanced diagnostics and therapeutics in patients with life-threatening blood loss.


Shock | 2016

Specificity of Compensatory Reserve and Tissue Oxygenation as Early Predictors of Tolerance to Progressive Reductions in Central Blood Volume.

Jeffrey T. Howard; Jud C. Janak; Carmen Hinojosa-Laborde; Victor A. Convertino

ABSTRACT We previously reported that measurements of muscle oxygen saturation (SmO2) and the compensatory reserve index (CRI) provided earlier indication of reduced central blood volume than standard vital signs (e.g., blood pressure, heart rate, arterial oxygen saturation). In the present study, we hypothesized that the CRI would provide greater sensitivity and specificity to detect progressive decrease in central circulating blood volume compared with SmO2. Continuous noninvasive measures of CRI (calculated from feature changes in the photoplethysmographic arterial waveforms) were collected from 55 healthy volunteer subjects before and during stepwise lower body negative pressure (LBNP) to the onset of hemodynamic decompensation. Near infrared spectroscopy was used on the forearm to obtain deep SmO2, hydrogen ion concentration ([H+]), and hemoglobin volume (HbT; decreases reflect vasoconstriction). CRI decreased by 97% in a linear fashion across progressive blood volume loss, with no clinically significant alterations in vital signs. The receiver operating characteristic (ROC) area under the curve (AUC) for the CRI was 0.91, with a sensitivity of 0.87 and specificity of 0.80, when predicting decompensation at progressive levels of LBNP. In comparison, SmO2, [H+], and HbT had significantly lower ROC AUC, sensitivity and specificity values for detecting the same outcome. Consistent with our hypothesis, CRI detected central hypovolemia with significantly greater specificity than measures of tissue metabolism. Single measurement of CRI may enable more accurate triage, while CRI monitoring may allow for earlier detection of casualty deterioration.
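
Sensitivity and specificity, the quantities compared here for CRI, SmO2, [H+], and HbT, come straight from a 2x2 table of predicted versus actual decompensation. The sketch below uses hypothetical counts chosen only to reproduce the CRI values quoted above; it is not the study's data.

```python
# Hypothetical counts for illustration; not the study's data.
def sens_spec(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # detected decompensators / all decompensators
    specificity = tn / (tn + fp)   # correctly cleared subjects / all non-decompensators
    return sensitivity, specificity

sens, spec = sens_spec(tp=26, fn=4, tn=20, fp=5)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")  # 0.87 and 0.80
```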


American Journal of Physiology-regulatory Integrative and Comparative Physiology | 2016

Comparison of compensatory reserve during lower-body negative pressure and hemorrhage in nonhuman primates.

Carmen Hinojosa-Laborde; Jeffrey T. Howard; Jane Mulligan; Greg Grudic; Victor A. Convertino

Compensatory reserve was measured in baboons (n = 13) during hemorrhage (Hem) and lower-body negative pressure (LBNP) using a machine-learning algorithm developed to estimate compensatory reserve by detecting reductions in central blood volume during LBNP. The algorithm calculates compensatory reserve index (CRI) from normovolemia (CRI = 1) to cardiovascular decompensation (CRI = 0). The hypothesis was that Hem and LBNP will elicit similar CRI values and that CRI would have higher specificity than stroke volume (SV) in predicting decompensation. Blood was removed in four steps: 6.25%, 12.5%, 18.75%, and 25% of total blood volume. Four weeks after Hem, the same animals were subjected to four levels of LBNP that was matched on the basis of their central venous pressure. Data (mean ± 95% confidence interval) indicate that CRI decreased (P < 0.001) from baseline during Hem (0.69 ± 0.10, 0.57 ± 0.09, 0.36 ± 0.10, 0.16 ± 0.08, and 0.08 ± 0.03) and LBNP (0.76 ± 0.05, 0.66 ± 0.08, 0.36 ± 0.13, 0.23 ± 0.11, and 0.14 ± 0.09). CRI was not different between Hem and LBNP (P = 0.20). Linear regression analysis between Hem CRI and LBNP CRI revealed a slope of 1.03 and a correlation coefficient of 0.96. CRI exhibited greater specificity than SV in both Hem (92.3 vs. 82.1) and LBNP (94.8 vs. 83.1) and greater ROC AUC in Hem (0.94 vs. 0.84) and LBNP (0.94 vs. 0.92). These data support the hypothesis that Hem and LBNP elicited the same CRI response, suggesting that measurement of compensatory reserve is superior to SV as a predictor of cardiovascular decompensation.
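
As a rough check, the level-by-level agreement between hemorrhage and LBNP can be examined with an ordinary least-squares fit. The sketch below uses only the group-mean CRI values quoted in the abstract, whereas the published regression was presumably fit to the per-animal data, so it will not exactly reproduce the reported slope of 1.03 and correlation of 0.96.

```python
# Sketch using the group-mean CRI values quoted in the abstract; numbers will
# differ from the published per-animal regression.
import numpy as np

cri_hem  = np.array([0.69, 0.57, 0.36, 0.16, 0.08])   # hemorrhage steps
cri_lbnp = np.array([0.76, 0.66, 0.36, 0.23, 0.14])   # matched LBNP steps

slope, intercept = np.polyfit(cri_lbnp, cri_hem, deg=1)
r = np.corrcoef(cri_lbnp, cri_hem)[0, 1]
print(f"slope={slope:.2f}, intercept={intercept:.2f}, r={r:.2f}")
```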


Journal of Trauma-injury Infection and Critical Care | 2018

Reexamination of a Battlefield Trauma Golden Hour Policy

Jeffrey T. Howard; Russ S. Kotwal; Alexis R. Santos-Lazada; Matthew J. Martin; Zsolt T. Stockinger

BACKGROUND Most combat casualties who die do so in the prehospital setting. Efforts directed toward alleviating prehospital combat trauma death, known as killed in action (KIA) mortality, have the greatest opportunity for eliminating preventable death. METHODS Four thousand five hundred forty-two military casualties injured in Afghanistan from September 11, 2001, to March 31, 2014, were included in this retrospective analysis to evaluate proposed explanations for observed KIA reduction after a mandate by Secretary of Defense Robert M. Gates that transport of injured service members occur within 60 minutes. Using inverse probability weighting to account for selection bias, data were analyzed using multivariable logistic regression and simulation analysis to estimate the effects of (1) gradual improvement, (2) damage control resuscitation, (3) harm from inadequate resources, (4) change in wound pattern, and (5) transport time on KIA mortality. RESULTS The effect of gradual improvement measured as a time trend was not significant (adjusted odds ratio [AOR], 0.99; 95% confidence interval [CI], 0.94–1.03; p = 0.58). For casualties with military Injury Severity Score of 25 or higher, the odds of KIA mortality were 83% lower for casualties who needed and received prehospital blood transfusion (AOR, 0.17; 95% CI, 0.06–0.51; p = 0.002); 33% lower for casualties receiving initial treatment by forward surgical teams (AOR, 0.67; 95% CI, 0.58–0.78; p < 0.001); 70%, 74%, and 87% lower for casualties with dominant injuries to head (AOR, 0.30; 95% CI, 0.23–0.38; p < 0.001), abdomen (AOR, 0.26; 95% CI, 0.19–0.36; p < 0.001), and extremities (AOR, 0.13; 95% CI, 0.09–0.17; p < 0.001); 35% lower for casualties categorized with blunt injuries (AOR, 0.65; 95% CI, 0.46–0.92; p = 0.01); and 39% lower for casualties transported within one hour (AOR, 0.61; 95% CI, 0.51–0.74; p < 0.001). Results of simulations in which transport times had not changed after the mandate indicate that KIA mortality would have been 1.4% higher than observed, equating to 135 more KIA deaths (95% CI, 105–164). CONCLUSION Reduction in KIA mortality is associated with early treatment capabilities, blunt mechanism, select body locations of injury, and rapid transport. LEVEL OF EVIDENCE Therapy, level III.
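
The percent reductions quoted above are simply 100 × (1 − AOR) for each adjusted odds ratio. A minimal sketch of that conversion, using only the point estimates from the abstract rather than the study data:

```python
# Sketch, not the study code: mapping adjusted odds ratios (AORs) to the quoted
# percent reductions in the odds of KIA mortality.
aors = {
    "prehospital blood transfusion": 0.17,
    "forward surgical team treatment": 0.67,
    "dominant head injury": 0.30,
    "dominant abdominal injury": 0.26,
    "dominant extremity injury": 0.13,
    "blunt mechanism": 0.65,
    "transport within 60 minutes": 0.61,
}
for factor, aor in aors.items():
    print(f"{factor}: {100 * (1 - aor):.0f}% lower odds of KIA mortality")
```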


Journal of Trauma-injury Infection and Critical Care | 2016

Rhabdomyolysis among critically ill combat casualties: Associations with acute kidney injury and mortality.

Ian J. Stewart; Tarra I. Faulk; Jonathan A. Sosnov; Michael S. Clemens; Joel Elterman; James D. Ross; Jeffrey T. Howard; Raymond Fang; David Zonies; Kevin K. Chung

BACKGROUND Rhabdomyolysis has been associated with poor outcomes in patients with traumatic injury, especially in the setting of acute kidney injury (AKI). However, rhabdomyolysis has not been systematically examined in a large cohort of combat casualties injured in the wars in Iraq and Afghanistan. METHODS We conducted a retrospective study of casualties injured during combat operations in Iraq and Afghanistan who were initially admitted to the intensive care unit from February 1, 2002, to February 1, 2011. Information on age, sex, Abbreviated Injury Scale (AIS) score, Injury Severity Score (ISS), mechanism of injury, shock index, creatine kinase, and serum creatinine were collected. These variables were examined via multivariate logistic and Cox regression analyses to determine factors independently associated with rhabdomyolysis, AKI, and death. RESULTS Of 6,011 admissions identified, a total of 2,109 patients met inclusion criteria and were included for analysis. Rhabdomyolysis, defined as creatine kinase greater than 5,000 U/L, was present in 656 subjects (31.1%). Risk factors for rhabdomyolysis identified on multivariable analysis included injuries to the abdomen and extremities, increased ISS, male sex, explosive mechanism of injury, and shock index greater than 0.9. After adjustment, patients with rhabdomyolysis had a greater than twofold increase in the odds of AKI. In the analysis for mortality, rhabdomyolysis was significantly associated with death until AKI was added, at which point it lost statistical significance. CONCLUSION We found that rhabdomyolysis is associated with the development of AKI in combat casualties. While rhabdomyolysis was strongly associated with mortality on the univariate model and in conjunction with both ISS and age, it was not associated with mortality after the inclusion of AKI. This suggests that the effect of rhabdomyolysis on mortality may be mediated by AKI. LEVEL OF EVIDENCE Prognostic and epidemiologic study, level III.
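
Two of the screening variables used in this analysis are simple derived values: the shock index (heart rate divided by systolic blood pressure, flagged here when above 0.9) and rhabdomyolysis defined as creatine kinase above 5,000 U/L. A minimal sketch with hypothetical vitals (not patient data):

```python
# Minimal sketch with hypothetical values; thresholds are those used in the abstract.
def shock_index(heart_rate_bpm, systolic_bp_mmhg):
    return heart_rate_bpm / systolic_bp_mmhg

def has_rhabdomyolysis(creatine_kinase_u_per_l, threshold=5000):
    return creatine_kinase_u_per_l > threshold

print(shock_index(120, 95) > 0.9)   # True  (index ~1.26, above the 0.9 cutoff)
print(has_rhabdomyolysis(7800))     # True  (CK above the 5,000 U/L cutoff)
```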

Collaboration


Dive into Jeffrey T. Howard's collaborations.

Top Co-Authors

Ian J. Stewart, Uniformed Services University of the Health Sciences
Kevin K. Chung, Uniformed Services University of the Health Sciences
Jonathan A. Sosnov, Uniformed Services University of the Health Sciences
Benjamin D. Morrow, Uniformed Services University of the Health Sciences
Brett A. Freedman, Landstuhl Regional Medical Center
Carmen Hinojosa-Laborde, University of Texas Health Science Center at San Antonio
Caroline Tuman, Landstuhl Regional Medical Center
Russ S. Kotwal, Uniformed Services University of the Health Sciences