
Publication


Featured research published by Jordan A. Weinberg.


JAMA | 2015

Transfusion of Plasma, Platelets, and Red Blood Cells in a 1:1:1 vs a 1:1:2 Ratio and Mortality in Patients With Severe Trauma: The PROPPR Randomized Clinical Trial

John B. Holcomb; Barbara C. Tilley; Sarah Baraniuk; Erin E. Fox; Charles E. Wade; Jeanette M. Podbielski; Deborah J. del Junco; Karen J. Brasel; Eileen M. Bulger; Rachael A. Callcut; Mitchell J. Cohen; Bryan A. Cotton; Timothy C. Fabian; Kenji Inaba; Jeffrey D. Kerby; Peter Muskat; Terence O’Keeffe; Sandro Rizoli; Bryce R.H. Robinson; Thomas M. Scalea; Martin A. Schreiber; Deborah M. Stein; Jordan A. Weinberg; Jeannie Callum; John R. Hess; Nena Matijevic; Christopher N. Miller; Jean-Francois Pittet; David B. Hoyt; Gail D. Pearson

IMPORTANCE Severely injured patients experiencing hemorrhagic shock often require massive transfusion. Earlier transfusion with higher blood product ratios (plasma, platelets, and red blood cells), defined as damage control resuscitation, has been associated with improved outcomes; however, there have been no large multicenter clinical trials. OBJECTIVE To determine the effectiveness and safety of transfusing patients with severe trauma and major bleeding using plasma, platelets, and red blood cells in a 1:1:1 ratio compared with a 1:1:2 ratio. DESIGN, SETTING, AND PARTICIPANTS Pragmatic, phase 3, multisite, randomized clinical trial of 680 severely injured patients who arrived at 1 of 12 level I trauma centers in North America directly from the scene and were predicted to require massive transfusion between August 2012 and December 2013. INTERVENTIONS Blood product ratios of 1:1:1 (338 patients) vs 1:1:2 (342 patients) during active resuscitation in addition to all local standard-of-care interventions (uncontrolled). MAIN OUTCOMES AND MEASURES Primary outcomes were 24-hour and 30-day all-cause mortality. Prespecified ancillary outcomes included time to hemostasis, blood product volumes transfused, complications, incidence of surgical procedures, and functional status. RESULTS No significant differences were detected in mortality at 24 hours (12.7% in 1:1:1 group vs 17.0% in 1:1:2 group; difference, -4.2% [95% CI, -9.6% to 1.1%]; P = .12) or at 30 days (22.4% vs 26.1%, respectively; difference, -3.7% [95% CI, -10.2% to 2.7%]; P = .26). Exsanguination, which was the predominant cause of death within the first 24 hours, was significantly decreased in the 1:1:1 group (9.2% vs 14.6% in 1:1:2 group; difference, -5.4% [95% CI, -10.4% to -0.5%]; P = .03). More patients in the 1:1:1 group achieved hemostasis than in the 1:1:2 group (86% vs 78%, respectively; P = .006). 
Despite the 1:1:1 group receiving more plasma (median of 7 U vs 5 U, P < .001) and platelets (12 U vs 6 U, P < .001) and similar amounts of red blood cells (9 U) over the first 24 hours, no differences between the 2 groups were found for the 23 prespecified complications, including acute respiratory distress syndrome, multiple organ failure, venous thromboembolism, sepsis, and transfusion-related complications. CONCLUSIONS AND RELEVANCE Among patients with severe trauma and major bleeding, early administration of plasma, platelets, and red blood cells in a 1:1:1 ratio compared with a 1:1:2 ratio did not result in significant differences in mortality at 24 hours or at 30 days. However, more patients in the 1:1:1 group achieved hemostasis and fewer experienced death due to exsanguination by 24 hours. Even though there was an increased use of plasma and platelets transfused in the 1:1:1 group, no other safety differences were identified between the 2 groups. TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT01545232.


Journal of Trauma-injury Infection and Critical Care | 2009

The relationship of blood product ratio to mortality: survival benefit or survival bias?

Christopher W. Snyder; Jordan A. Weinberg; Gerald McGwin; Sherry M. Melton; Richard L. George; Donald A. Reiff; James M. Cross; Jennifer Hubbard-Brown; Loring W. Rue; Jeffrey D. Kerby

BACKGROUND Recent studies show an apparent survival advantage associated with the administration of higher cumulative ratios of fresh frozen plasma (FFP) to packed red blood cells (PRBC). It remains unclear how temporal factors and survival bias may influence these results. The objective of this study was to evaluate the temporal relationship between blood product ratios and mortality in massively transfused trauma patients. METHODS Patients requiring massive transfusion (>10 units of PRBC within 24 hours of admission) between 2005 and 2007 were identified (n = 134). In-hospital mortality was compared between patients receiving high (>1:2) versus low (<1:2) FFP:PRBC ratios with a regression model, using the FFP:PRBC ratio as a fixed value at 24 hours (method I) and as a time-varying covariate (method II). RESULTS The FFP:PRBC ratio for all patients was low early and increased over time. Sixty-eight percent of total blood products were given and 54% of deaths occurred during the first 6 hours. Using method I, patients receiving a high FFP:PRBC ratio (mean, 1:1.3) by 24 hours had a 63% lower risk of death (RR, 0.37; 95% CI, 0.22-0.64) compared with those receiving a low ratio (mean, 1:3.7). However, this association was no longer statistically significant (RR, 0.84; 95% CI, 0.47-1.50) when the timing of component product transfusion was taken into account (method II). CONCLUSIONS Similar to previous studies, an association between higher FFP:PRBC ratios at 24 hours and improved survival was observed. However, after adjustment for survival bias in the analysis, the association was no longer statistically significant. Prospective trials are necessary to evaluate whether hemostatic resuscitation is clinically beneficial.


Journal of Trauma-injury Infection and Critical Care | 2008

Transfusions in the less severely injured: does age of transfused blood affect outcomes?

Jordan A. Weinberg; Gerald McGwin; Marisa B. Marques; Samuel A. Cherry; Donald A. Reiff; Jeffrey D. Kerby; Loring W. Rue

BACKGROUND Prior studies have demonstrated that transfusion of older stored blood is associated with an increased risk of multiple organ failure, infection, and death. These reports were primarily comprised of severely injured patients, and it remains unknown whether this phenomenon is observed in relatively less injured patients. The purpose of this study was to evaluate the association between the age of stored blood and the morbidity and mortality in a mild to moderately injured patient cohort. METHODS Blunt trauma patients with Injury Severity Score <25 admitted to a Trauma Intensive Care Unit during 7.5 years who received no blood during the first 48 hours of hospitalization were selected for inclusion. Patients who died within 48 hours of admission were excluded from analysis. Odds ratios (ORs) and 95% confidence intervals (CIs) were calculated for the association between morbidity or mortality and the age and amount of blood transfused (>48 hours postadmission), adjusted for age, sex, injury severity, thoracic injury, mechanical ventilation, and transfusion volume. RESULTS During 7.5 years, 1,624 patients met the study criteria. The mean Injury Severity Score was 14.4. Receipt of blood stored beyond 2 weeks was associated with mortality (OR 1.12 [CI 1.02-1.23]), renal failure (OR 1.18 [CI 1.07-1.29]), and pneumonia (OR 1.10 [CI 1.04-1.17]). No such associations were identified, however, concerning the transfusion of blood with a lesser storage age. CONCLUSION In a mild to moderately injured intensive care unit patient cohort, the receipt of blood stored beyond 2 weeks was independently associated with mortality, renal failure, and pneumonia. The deleterious effect of older blood on patient outcome does not appear to be limited to the severely injured.


Journal of Trauma-injury Infection and Critical Care | 2011

Identifying risk for massive transfusion in the relatively normotensive patient: utility of the prehospital shock index.

Marianne J. Vandromme; Russell Griffin; Jeffrey D. Kerby; Gerald McGwin; Loring W. Rue; Jordan A. Weinberg

BACKGROUND In the prehospital environment, the failure of medical providers to recognize latent physiologic derangement in patients with compensated shock may risk undertriage. We hypothesized that the shock index (SI; heart rate divided by systolic blood pressure [SBP]), when used in the prehospital setting, could facilitate the identification of such patients. The objective of this study was to assess the association between the prehospital SI and the risk of massive transfusion (MT) in relatively normotensive blunt trauma patients. METHODS Admissions to a Level I trauma center between January 2000 and October 2008 with blunt mechanism of injury and prehospital SBP >90 mm Hg were identified. Patients were categorized by SI, calculated for each patient from prehospital vital signs. Risk ratios (RRs) and 95% confidence intervals (CIs) for requiring MT (>10 red blood cell units within 24 hours of admission) were calculated using SI >0.5 to 0.7 (normal range) as the referent for all comparisons. RESULTS A total of 8,111 patients were identified, of whom 276 (3.4%) received MT. Compared with patients with normal SI, there was no significantly increased risk for MT for patients with an SI of ≤0.5 (RR, 1.41; 95% CI, 0.90-2.21) or >0.7 to 0.9 (RR, 1.06; 95% CI, 0.77-1.45). However, a significantly increased risk for MT was observed for patients with SI >0.9. Specifically, patients with SI >0.9 to 1.1 were observed to have a 1.5-fold increased risk for MT (RR, 1.61; 95% CI, 1.13-2.31). Further increases in SI were associated with incrementally higher risks for MT, with a more than fivefold increase in patients with SI >1.1 to 1.3 (RR, 5.57; 95% CI, 3.74-8.30) and an eightfold risk in patients with SI >1.3 (RR, 8.13; 95% CI, 4.60-14.36). CONCLUSION Prehospital SI >0.9 identifies patients at risk for MT who would otherwise be considered relatively normotensive under current prehospital triage protocols. The risk for MT rises substantially with elevation of SI above this level. Further evaluation of SI in the context of trauma system triage protocols is warranted to determine whether triage precision might be augmented among blunt trauma patients with SBP >90 mm Hg.
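The triage rule in the abstract reduces to a single ratio and a threshold. As a minimal sketch (the function names are illustrative, not from the study; the SI > 0.9 cutoff and SBP > 90 mm Hg criterion are taken from the abstract):

```python
def shock_index(heart_rate: float, systolic_bp: float) -> float:
    """Shock index: heart rate divided by systolic blood pressure."""
    return heart_rate / systolic_bp

def mt_risk_flag(heart_rate: float, systolic_bp: float) -> bool:
    """Flag a relatively normotensive patient (SBP > 90 mm Hg) as being
    at elevated risk of massive transfusion, per the study's SI > 0.9 finding."""
    return systolic_bp > 90 and shock_index(heart_rate, systolic_bp) > 0.9

# A patient with HR 110 and SBP 100 looks normotensive by SBP alone,
# but SI = 1.1 places them in the elevated-risk stratum.
print(shock_index(110, 100))  # 1.1
print(mt_risk_flag(110, 100))  # True
print(mt_risk_flag(80, 120))   # False (SI ≈ 0.67, within the referent range)
```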


Journal of The American College of Surgeons | 2010

Lactate Is a Better Predictor than Systolic Blood Pressure for Determining Blood Requirement and Mortality: Could Prehospital Measures Improve Trauma Triage?

Marianne J. Vandromme; Russell Griffin; Jordan A. Weinberg; Loring W. Rue; Jeffrey D. Kerby

BACKGROUND Standard hemodynamic evaluation of patients in shock may underestimate severity of hemorrhage given physiologic compensation. Blood lactate (BL) is an important adjunct in characterizing shock, and point-of-care devices are currently available for use in the prehospital (PH) setting. The objective of this study was to determine if BL levels have better predictive value when compared with systolic blood pressure (SBP) for identifying patients with an elevated risk of significant transfusion and mortality in a hemodynamically indeterminate cohort. STUDY DESIGN We selected trauma patients admitted to a Level I trauma center over a 9-year period with SBP between 90 and 110 mm Hg. The predictive capability of initial emergency department (ED) BL for requiring ≥6 units packed RBCs within 24 hours postinjury and for mortality was compared with PH-SBP and ED-SBP by comparing the estimated area under the receiver operating characteristic curve (AUC). RESULTS We identified 2,413 patients with ED-SBP and 787 patients with PH-SBP and ED-BL. ED-BL was statistically better than PH-SBP (p = 0.0025) and ED-SBP (p < 0.0001) in predicting patients who will need ≥6 U packed RBCs within 24 hours postinjury (AUC: ED-BL, 0.72 vs PH-SBP, 0.61; ED-BL, 0.76 vs ED-SBP, 0.60). ED-BL was also a better predictor than both PH-SBP (p = 0.0235) and ED-SBP (p < 0.0001) for mortality (AUC: ED-BL, 0.74 vs PH-SBP, 0.60; ED-BL, 0.76 vs ED-SBP, 0.61). CONCLUSIONS ED-BL is a better predictor than SBP for identifying patients requiring significant transfusion and mortality in this cohort with indeterminate SBP. These findings suggest that point-of-care BL measurements could improve trauma triage and better identify patients for enrollment in interventional trials. Further studies using BL measurement in the PH environment are warranted.
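The comparison above hinges on the area under the ROC curve for each predictor. A self-contained sketch of how AUC can be computed via the rank-based Mann-Whitney formulation (the patient data below are invented purely for illustration, not from the study):

```python
def auc(scores, labels):
    """Area under the ROC curve as a Mann-Whitney U statistic: the
    probability that a randomly chosen positive case scores higher
    than a randomly chosen negative case (ties count as 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Invented six-patient example: 1 = required significant transfusion.
# Lactate is scored as-is (higher = worse); SBP is negated so that
# a higher score likewise means a worse expected outcome.
outcome = [1, 1, 0, 0, 1, 0]
lactate = [6.2, 4.8, 1.9, 2.4, 5.1, 2.0]
sbp = [98, 104, 101, 96, 99, 108]

print(auc(lactate, outcome))            # 1.0: lactate separates perfectly here
print(auc([-s for s in sbp], outcome))  # ≈0.56: much weaker discrimination
```

An AUC of 0.5 corresponds to a predictor no better than chance, which is why the study's ED-BL values of 0.72-0.76 against SBP values of 0.60-0.61 represent a meaningful difference.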


Journal of Trauma-injury Infection and Critical Care | 2010

Duration of red cell storage influences mortality after trauma.

Jordan A. Weinberg; Gerald McGwin; Marianne J. Vandromme; Marisa B. Marques; Sherry M. Melton; Donald A. Reiff; Jeffrey D. Kerby; Loring W. Rue

BACKGROUND Although previous studies have identified an association between the transfusion of relatively older red blood cells (RBCs) (storage ≥ 14 days) and adverse outcomes, they are difficult to interpret because the majority of patients received a combination of old and fresh RBC units. To overcome this limitation, we compared in-hospital mortality among patients who received exclusively old versus fresh RBC units during the first 24 hours of hospitalization. METHODS Patients admitted to a Level I trauma center between January 2000 and May 2009 who received ≥ 1 unit of exclusively old (≥ 14 days) vs. fresh (< 14 days) RBCs during the first 24 hours of hospitalization were identified. Risk ratios (RRs) and 95% confidence intervals (CIs) were calculated for the association between mortality and RBC age, adjusted for patient age, Injury Severity Score, gender, receipt of fresh frozen plasma or platelets, RBC volume, brain injury, and injury mechanism (blunt or penetrating). RESULTS One thousand six hundred forty-seven patients met the study inclusion criteria. Among patients who were transfused 1 or 2 RBC units, no difference in mortality with respect to RBC age was identified (adjusted RR, 0.97; 95% CI, 0.72-1.32). Among patients who were transfused 3 or more RBC units, receipt of old versus fresh RBCs was associated with a significantly increased risk of mortality, with an adjusted RR of 1.57 (95% CI, 1.14-2.15). No difference was observed concerning the mean number of old versus fresh units transfused to patients who received 3 or more units (6.05 vs. 5.47, respectively; p = 0.11). CONCLUSION In trauma patients undergoing transfusion of 3 or more RBC units within 24 hours of hospital arrival, receipt of relatively older blood was associated with a significantly increased mortality risk. Reservation of relatively fresh RBC units for the acutely injured may be advisable.


Scandinavian Journal of Trauma, Resuscitation and Emergency Medicine | 2009

Blood transfusion in the critically ill: does storage age matter?

Marianne J. Vandromme; Gerald McGwin; Jordan A. Weinberg

Morphologic and biochemical changes occur during red cell storage prior to product expiry, and these changes may hinder erythrocyte viability and function following transfusion. Despite a relatively large body of literature detailing the metabolic and structural deterioration that occurs during red cell storage, evidence for a significant detrimental clinical effect related to the transfusion of older blood is relatively less conclusive, limited primarily to observations in retrospective studies. Nonetheless, the implication that the transfusion of old, but not outdated, blood may have negative clinical consequences demands attention. In this report, the current understanding of the biochemical and structural changes that occur during storage, known collectively as the storage lesion, is described, and the clinical evidence concerning the detrimental consequences associated with the transfusion of relatively older red cells is critically reviewed. Although the growing body of literature demonstrating the deleterious effects of relatively old blood is compelling, it is notable that all of these reports have been retrospective, and most of these studies have evaluated patients who received a mixture of red cell units of varying storage age. Until prospective studies have been completed and produce confirmative results, it would be premature to recommend any modification of current transfusion practice regarding storage age.

In 1917, Francis Peyton Rous and J.R. Turner identified that a citrate-glucose solution allowed for the preservation of a whole blood unit for up to five days, thus facilitating the formative practice of blood banking [1]. Later, Loutit and Mollison of Great Britain developed the first anticoagulant of the modern era, known as acid-citrate-dextrose (ACD) [1]. ACD extended the shelf life of refrigerated blood to 21 days and remained in widespread use until the 1960s, when it was replaced by citrate-phosphate-dextrose (CPD) and citrate-phosphate-dextrose-adenine (CPDA) solutions that increased shelf life to 35 days and 42 days, respectively. More recently, additive solutions containing saline, adenine, and dextrose have been developed to augment red cell survival following transfusion, although without any direct increase in storage duration [1, 2].


Journal of Trauma-injury Infection and Critical Care | 2008

Comparison of intravenous ethanol versus diazepam for alcohol withdrawal prophylaxis in the trauma ICU: Results of a randomized trial

Jordan A. Weinberg; Louis J. Magnotti; Peter E. Fischer; Norma M. Edwards; Thomas J. Schroeppel; Timothy C. Fabian; Martin A. Croce

BACKGROUND Although benzodiazepines are the recommended first-line therapy for the prevention of alcohol withdrawal syndrome (AWS), the administration of intravenous ethanol as an alternative prophylactic agent persists in many surgical ICUs. Advocates of this therapy argue that ethanol provides effective prophylaxis against AWS without the excessive sedation observed with benzodiazepine therapy. No study to date, however, has compared the two therapies with regard to their sedative effects. The purpose of this study was to prospectively evaluate the efficacy of intravenous ethanol compared with benzodiazepines for the prevention of AWS with particular emphasis on the sedative effects of each therapy. METHODS During a 15-month period, trauma patients admitted to the ICU with a history of chronic daily alcohol consumption greater than or equal to five beverage equivalents per day were prospectively randomized to one of two 4-day prophylactic regimens: intravenous ethanol infusion (EtOH) versus scheduled-dose diazepam (BENZO). Patients were evaluated with the Riker sedation-agitation scale, a 7-point instrument for the subjective assessment of both sedation (1 = unarousable) and agitation (7 = dangerous agitation). According to protocol, regimens were titrated to achieve and maintain a Riker score of 4 (calm and cooperative). Deviation from a score of 4 during the course of treatment was compared between groups. RESULTS Fifty patients met study criteria and were randomized after obtainment of informed consent (EtOH, n = 26; BENZO, n = 24). Overall, the EtOH group had a significantly greater proportion of patients who deviated from a score of 4 during the course of treatment (p = 0.020). In both groups, the majority of deviation from a score of 4 reflected periods of under-sedation rather than over-sedation. 
One patient in the EtOH group failed treatment, requiring diazepam and haloperidol for control of AWS symptoms as per protocol, whereas no patient in the BENZO group failed treatment (p = NS). CONCLUSION Concerning the prophylaxis of AWS, intravenous ethanol offers no advantage over diazepam with respect to efficacy or adverse sedative effects. The purported benefit of intravenous ethanol as a prophylactic agent against AWS was not evident.


Journal of Trauma-injury Infection and Critical Care | 2008

The evolution of blunt splenic injury: resolution and progression.

Stephanie A. Savage; Ben L. Zarzaur; Louis J. Magnotti; Jordan A. Weinberg; George O. Maish; Tiffany K. Bee; Gayle Minard; Thomas J. Schroeppel; Martin A. Croce; Timothy C. Fabian

BACKGROUND Nonoperative management of blunt splenic injury (BSI) has become the standard of care for hemodynamically stable patients. Successful nonoperative management raises two related questions: (1) what is the time course for splenic healing and (2) when may patients safely return to usual activities? There is little evidence to guide surgeon recommendations regarding return to full activities. Our hypothesis was that time to healing is related to severity of BSI. METHODS The trauma registry at a level I trauma center was queried for patients diagnosed with a BSI managed nonoperatively between 2002 and 2007. Follow-up abdominal computed tomography scans were reviewed with attention to progression to healing of BSI. Kaplan-Meier curves were compared for mild (American Association for the Surgery of Trauma grades I-II) and severe (grades III-V) BSI. RESULTS Six hundred thirty-seven patients (63.9% mild spleen injury and 36.1% severe injury) with a BSI were eligible for analysis. Fifty-one patients had documented healing as inpatients. Ninety-seven patients discharged with BSI had outpatient computed tomography scans. Nine had worsening of BSI as outpatients and two (1 mild and 1 severe) required intervention (2 splenectomies). Thirty-three outpatients were followed to complete healing. Mild injuries had faster mean time to healing compared with severe (12.5 vs. 37.2 days, p < 0.001). Most healing occurred within 2 months but approximately 20% of each group had not healed after 3 months. CONCLUSION Although mild BSIs heal faster than severe BSIs, nearly 10% of all the BSIs followed as outpatients worsened. Close observation of patients with BSI should continue until healing can be confirmed.


Biochemical Journal | 2012

Erythrocyte storage increases rates of NO and nitrite scavenging: implications for transfusion-related toxicity.

Ryan Stapley; Benjamin Y. Owusu; Angela Brandon; Marianne V. Cusick; Cilina Rodriguez; Marisa B. Marques; Jeffrey D. Kerby; Scott R. Barnum; Jordan A. Weinberg; Jack R. Lancaster; Rakesh P. Patel

Storage of erythrocytes in blood banks is associated with biochemical and morphological changes to RBCs (red blood cells). It has been suggested that these changes have potential negative clinical effects characterized by inflammation and microcirculatory dysfunction which add to other transfusion-related toxicities. However, the mechanisms linking RBC storage and toxicity remain unclear. In the present study we tested the hypothesis that storage of leucodepleted RBCs results in cells that inhibit NO (nitric oxide) signalling more so than younger cells. Using competition kinetic analyses and protocols that minimized contributions from haemolysis or microparticles, our data indicate that the consumption rates of NO increased ~40-fold and NO-dependent vasodilation was inhibited 2-4-fold comparing 42-day-old with 0-day-old RBCs. These results are probably due to the formation of smaller RBCs with an increased surface area-to-volume ratio as a consequence of membrane loss during storage. The potential for older RBCs to affect NO formation via deoxygenated RBC-mediated nitrite reduction was also tested. RBC storage did not affect deoxygenated RBC-dependent stimulation of nitrite-induced vasodilation. However, stored RBCs did increase the rates of nitrite oxidation to nitrate in vitro. Significant loss of whole-blood nitrite was also observed in stable trauma patients after transfusion with 1 RBC unit, with the decrease in nitrite occurring after transfusion with RBCs stored for >25 days, but not with younger RBCs. Collectively, these data suggest that increased rates of reactions between intact RBCs and NO and nitrite may contribute to mechanisms that lead to storage-lesion-related transfusion risk.

Collaboration


Dive into Jordan A. Weinberg's collaborations.

Top Co-Authors

Martin A. Croce (University of Tennessee Health Science Center)
Timothy C. Fabian (University of Tennessee Health Science Center)
Louis J. Magnotti (University of Tennessee Health Science Center)
John P. Sharpe (University of Tennessee Health Science Center)
Jeffrey D. Kerby (University of Alabama at Birmingham)
Thomas J. Schroeppel (University of Tennessee Health Science Center)
Loring W. Rue (University of Alabama at Birmingham)
Gerald McGwin (University of Alabama at Birmingham)
Marianne J. Vandromme (University of Alabama at Birmingham)
Russell Griffin (University of Alabama at Birmingham)