Publication


Featured research published by Marianne J. Vandromme.


Journal of Trauma-Injury Infection and Critical Care | 2011

Identifying risk for massive transfusion in the relatively normotensive patient: utility of the prehospital shock index.

Marianne J. Vandromme; Russell Griffin; Jeffrey D. Kerby; Gerald McGwin; Loring W. Rue; Jordan A. Weinberg

BACKGROUND In the prehospital environment, the failure of medical providers to recognize latent physiologic derangement in patients with compensated shock may risk undertriage. We hypothesized that the shock index (SI; heart rate divided by systolic blood pressure [SBP]), when used in the prehospital setting, could facilitate the identification of such patients. The objective of this study was to assess the association between the prehospital SI and the risk of massive transfusion (MT) in relatively normotensive blunt trauma patients.

METHODS Admissions to a Level I trauma center between January 2000 and October 2008 with blunt mechanism of injury and prehospital SBP >90 mm Hg were identified. Patients were categorized by SI, calculated for each patient from prehospital vital signs. Risk ratios (RRs) and 95% confidence intervals (CIs) for requiring MT (>10 red blood cell units within 24 hours of admission) were calculated using SI >0.5 to 0.7 (normal range) as the referent for all comparisons.

RESULTS A total of 8,111 patients were identified, of whom 276 (3.4%) received MT. Compared with patients with normal SI, there was no significantly increased risk for MT for patients with an SI of ≤0.5 (RR, 1.41; 95% CI, 0.90-2.21) or >0.7 to 0.9 (RR, 1.06; 95% CI, 0.77-1.45). However, a significantly increased risk for MT was observed for patients with SI >0.9. Specifically, patients with SI >0.9 to 1.1 had a 1.6-fold increased risk for MT (RR, 1.61; 95% CI, 1.13-2.31). Further increases in SI were associated with incrementally higher risks for MT, with a more than fivefold increase in patients with SI >1.1 to 1.3 (RR, 5.57; 95% CI, 3.74-8.30) and an eightfold risk in patients with SI >1.3 (RR, 8.13; 95% CI, 4.60-14.36).

CONCLUSION Prehospital SI >0.9 identifies patients at risk for MT who would otherwise be considered relatively normotensive under current prehospital triage protocols. The risk for MT rises substantially with elevation of SI above this level. Further evaluation of SI in the context of trauma system triage protocols is warranted to determine whether triage precision might be augmented among blunt trauma patients with SBP >90 mm Hg.
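The SI arithmetic in this abstract is simple enough to sketch. Below is a minimal illustration in Python; the band boundaries and risk ratios are taken from the RESULTS above, while the function names and banding structure are my own:

```python
def shock_index(heart_rate, systolic_bp):
    """Shock index (SI): heart rate divided by systolic blood pressure."""
    if systolic_bp <= 0:
        raise ValueError("systolic blood pressure must be positive")
    return heart_rate / systolic_bp

def mt_risk_ratio(si):
    """Map an SI value to the study's reported risk ratio for massive
    transfusion, relative to the referent band (SI >0.5 to 0.7)."""
    bands = [
        (float("-inf"), 0.5, 1.41),  # <=0.5 (not statistically significant)
        (0.5, 0.7, 1.00),            # referent ("normal range")
        (0.7, 0.9, 1.06),            # not statistically significant
        (0.9, 1.1, 1.61),
        (1.1, 1.3, 5.57),
        (1.3, float("inf"), 8.13),
    ]
    for low, high, rr in bands:
        if low < si <= high:
            return rr
    raise ValueError("SI out of range")

# A patient with HR 110 and SBP 100 is "relatively normotensive"
# (SBP >90 mm Hg) yet has SI = 1.1, which falls in an elevated-risk band.
si = shock_index(110, 100)
rr = mt_risk_ratio(si)
```

The point the abstract makes is visible here: a patient who passes an SBP-only triage filter can still land above the SI >0.9 threshold.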


Journal of the American College of Surgeons | 2010

Lactate Is a Better Predictor than Systolic Blood Pressure for Determining Blood Requirement and Mortality: Could Prehospital Measures Improve Trauma Triage?

Marianne J. Vandromme; Russell Griffin; Jordan A. Weinberg; Loring W. Rue; Jeffrey D. Kerby

BACKGROUND Standard hemodynamic evaluation of patients in shock may underestimate severity of hemorrhage given physiologic compensation. Blood lactate (BL) is an important adjunct in characterizing shock, and point-of-care devices are currently available for use in the prehospital (PH) setting. The objective of this study was to determine whether BL levels have better predictive value than systolic blood pressure (SBP) for identifying patients with an elevated risk of significant transfusion and mortality in a hemodynamically indeterminate cohort.

STUDY DESIGN We selected trauma patients admitted to a Level I trauma center over a 9-year period with SBP between 90 and 110 mm Hg. The predictive capability of initial emergency department (ED) BL for needing ≥6 units of packed RBCs within 24 hours postinjury and for mortality was compared with that of PH-SBP and ED-SBP by comparing the estimated area under the receiver operating characteristic curve (AUC).

RESULTS We identified 2,413 patients with ED-SBP and 787 patients with PH-SBP and ED-BL. ED-BL was statistically better than PH-SBP (p = 0.0025) and ED-SBP (p < 0.0001) in predicting which patients would need ≥6 U of packed RBCs within 24 hours postinjury (AUC: ED-BL, 0.72 vs PH-SBP, 0.61; ED-BL, 0.76 vs ED-SBP, 0.60). ED-BL was also a better predictor than both PH-SBP (p = 0.0235) and ED-SBP (p < 0.0001) for mortality (AUC: ED-BL, 0.74 vs PH-SBP, 0.60; ED-BL, 0.76 vs ED-SBP, 0.61).

CONCLUSIONS ED-BL is a better predictor than SBP for identifying patients at risk of significant transfusion and mortality in this cohort with indeterminate SBP. These findings suggest that point-of-care BL measurements could improve trauma triage and better identify patients for enrollment in interventional trials. Further studies using BL measurement in the PH environment are warranted.
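The AUC comparison reported here can be illustrated with the rank-based (Mann-Whitney) formulation of the area under the ROC curve: the probability that a randomly chosen case scores higher than a randomly chosen non-case. A self-contained Python sketch; the lactate values below are invented for illustration and are not the study's data:

```python
def roc_auc(case_scores, noncase_scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of (case, non-case) pairs in which the case scores
    higher, counting ties as half."""
    wins = 0.0
    for c in case_scores:
        for n in noncase_scores:
            if c > n:
                wins += 1.0
            elif c == n:
                wins += 0.5
    return wins / (len(case_scores) * len(noncase_scores))

# Invented emergency-department lactate values (mmol/L) for patients who
# did vs. did not go on to require significant transfusion.
lactate_transfused = [4.8, 6.1, 3.9, 5.5]
lactate_not_transfused = [1.8, 2.4, 3.9, 2.1, 1.5]
auc = roc_auc(lactate_transfused, lactate_not_transfused)  # 0.975 here
```

An AUC of 0.5 means the predictor is no better than chance, which is why values such as 0.72 vs. 0.60 represent a meaningful gap.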


Journal of Trauma-Injury Infection and Critical Care | 2010

Duration of red cell storage influences mortality after trauma.

Jordan A. Weinberg; Gerald McGwin; Marianne J. Vandromme; Marisa B. Marques; Sherry M. Melton; Donald A. Reiff; Jeffrey D. Kerby; Loring W. Rue

BACKGROUND Although previous studies have identified an association between the transfusion of relatively older red blood cells (RBCs) (storage ≥14 days) and adverse outcomes, they are difficult to interpret because the majority of patients received a combination of old and fresh RBC units. To overcome this limitation, we compared in-hospital mortality among patients who received exclusively old versus fresh RBC units during the first 24 hours of hospitalization.

METHODS Patients admitted to a Level I trauma center between January 2000 and May 2009 who received ≥1 unit of exclusively old (≥14 days) vs. fresh (<14 days) RBCs during the first 24 hours of hospitalization were identified. Risk ratios (RRs) and 95% confidence intervals (CIs) were calculated for the association between mortality and RBC age, adjusted for patient age, Injury Severity Score, gender, receipt of fresh frozen plasma or platelets, RBC volume, brain injury, and injury mechanism (blunt or penetrating).

RESULTS One thousand six hundred forty-seven patients met the study inclusion criteria. Among patients who were transfused 1 or 2 RBC units, no difference in mortality with respect to RBC age was identified (adjusted RR, 0.97; 95% CI, 0.72-1.32). Among patients who were transfused 3 or more RBC units, receipt of old versus fresh RBCs was associated with a significantly increased risk of mortality, with an adjusted RR of 1.57 (95% CI, 1.14-2.15). No difference was observed in the mean number of old versus fresh units transfused to patients who received 3 or more units (6.05 vs. 5.47, respectively; p = 0.11).

CONCLUSION In trauma patients undergoing transfusion of 3 or more RBC units within 24 hours of hospital arrival, receipt of relatively older blood was associated with a significantly increased mortality risk. Reservation of relatively fresh RBC units for the acutely injured may be advisable.
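The risk ratios with 95% confidence intervals quoted throughout these abstracts come from a standard 2x2 calculation (here the crude, unadjusted version; the study itself reports adjusted estimates). A hedged Python sketch; the counts below are hypothetical, not the study's data:

```python
import math

def risk_ratio_ci(events_exposed, n_exposed, events_control, n_control):
    """Crude risk ratio with a 95% CI from the standard error of ln(RR)
    (the Katz log method)."""
    risk_exposed = events_exposed / n_exposed
    risk_control = events_control / n_control
    rr = risk_exposed / risk_control
    # Standard error of the log risk ratio
    se_log_rr = math.sqrt(
        1 / events_exposed - 1 / n_exposed
        + 1 / events_control - 1 / n_control
    )
    lower = math.exp(math.log(rr) - 1.96 * se_log_rr)
    upper = math.exp(math.log(rr) + 1.96 * se_log_rr)
    return rr, lower, upper

# Hypothetical counts: 30 deaths among 200 recipients of old blood
# vs. 15 deaths among 200 recipients of fresh blood.
rr, lower, upper = risk_ratio_ci(30, 200, 15, 200)  # RR = 2.0, CI ~ (1.11, 3.60)
```

A CI that excludes 1.0, as in the adjusted RR of 1.57 (1.14-2.15) above, is what the abstracts mean by a "significantly increased" risk.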


Scandinavian Journal of Trauma, Resuscitation and Emergency Medicine | 2009

Blood transfusion in the critically ill: does storage age matter?

Marianne J. Vandromme; Gerald McGwin; Jordan A. Weinberg

Morphologic and biochemical changes occur during red cell storage prior to product expiry, and these changes may hinder erythrocyte viability and function following transfusion. Despite a relatively large body of literature detailing the metabolic and structural deterioration that occurs during red cell storage, evidence for a significant detrimental clinical effect related to the transfusion of older blood is relatively less conclusive, limited primarily to observations in retrospective studies. Nonetheless, the implication that the transfusion of old, but not outdated, blood may have negative clinical consequences demands attention. In this report, the current understanding of the biochemical and structural changes that occur during storage, known collectively as the storage lesion, is described, and the clinical evidence concerning the detrimental consequences associated with the transfusion of relatively older red cells is critically reviewed. Although the growing body of literature demonstrating the deleterious effects of relatively old blood is compelling, it is notable that all of these reports have been retrospective, and most of these studies have evaluated patients who received a mixture of red cell units of varying storage age. Until prospective studies have been completed and produce confirmative results, it would be premature to recommend any modification of current transfusion practice regarding storage age.

In 1917, Francis Peyton Rous and J. R. Turner identified that a citrate-glucose solution allowed for the preservation of a whole blood unit for up to five days, thus facilitating the formative practice of blood banking[1]. Later, Loutit and Mollison of Great Britain developed the first anticoagulant of the modern era, known as acid-citrate-dextrose (ACD)[1]. ACD extended the shelf life of refrigerated blood to 21 days and remained in widespread use until the 1960s, when it was replaced by citrate-phosphate-dextrose (CPD) and citrate-phosphate-dextrose-adenine (CPDA) solutions that increased shelf life to 35 days and 42 days, respectively. More recently, additive solutions containing saline, adenine, and dextrose have been developed to augment red cell survival following transfusion, although without any direct increase in storage duration[1, 2].


Journal of Surgical Education | 2009

Proficiency-Based Laparoscopic and Endoscopic Training With Virtual Reality Simulators: A Comparison of Proctored and Independent Approaches

Christopher W. Snyder; Marianne J. Vandromme; Sharon L. Tyra; Mary T. Hawn

BACKGROUND Virtual reality (VR) simulators for laparoscopy and endoscopy may be valuable tools for resident education. However, the cost of such training in terms of trainee and instructor time may vary depending upon whether an independent or proctored approach is employed.

METHODS We performed a randomized controlled trial to compare independent and proctored methods of proficiency-based VR simulator training. Medical students were randomized to independent or proctored training groups. Groups were compared with respect to the number of training hours and task repetitions required to achieve expert-level proficiency on laparoscopic and endoscopic simulators. Cox regression modeling was used to compare time to proficiency between groups, with adjustment for appropriate covariates.

RESULTS Thirty-six medical students (18 independent, 18 proctored) were enrolled. Achievement of overall simulator proficiency required a median of 11 hours of training (range, 6-21 hours). Laparoscopic and endoscopic proficiency were achieved after a median of 11 (range, 6-32) and 10 (range, 5-27) task repetitions, respectively. The number of repetitions required to achieve proficiency was similar between groups. After adjustment for covariates, trainees in the independent group achieved simulator proficiency with significantly fewer hours of training (hazard ratio, 2.62; 95% confidence interval, 1.01-6.85; p = 0.048).

CONCLUSIONS Our study quantifies the cost, in instructor and trainee hours, of proficiency-based laparoscopic and endoscopic VR simulator training, and suggests that proctored instruction does not offer any advantages to trainees. The independent approach may be preferable for surgical residency programs desiring to implement VR simulator training.


Journal of Trauma-Injury Infection and Critical Care | 2009

Management of colon wounds in the setting of damage control laparotomy: a cautionary tale.

Jordan A. Weinberg; Russell Griffin; Marianne J. Vandromme; Sherry M. Melton; Richard L. George; Donald A. Reiff; Jeffrey D. Kerby; Loring W. Rue

BACKGROUND Although colon wounds are commonly treated in the setting of damage control laparotomy (DCL), a paucity of data exists to guide management. The purpose of this study was to evaluate our experience with the management of colonic wounds in the context of DCL, using colonic wound outcomes after routine, single laparotomy (SL) as a benchmark.

METHODS Consecutive patients during a 7-year period with full-thickness or devitalizing colon injury were identified. Early deaths (<48 hours) were excluded. Colon-related complications (abscess, suture or staple leak, and stomal ischemia) were compared between those managed in the setting of DCL and those managed by SL, both overall and as stratified by procedure (primary repair, resection and anastomosis, and resection and colostomy).

RESULTS One hundred fifty-seven patients met study criteria: 101 had undergone SL and 56 had undergone DCL. Comparison of DCL patients with SL patients was notable for a significant difference in colon-related complications (30% vs. 12%, p < 0.005) and suture/staple leak in particular (12% vs. 3%, p < 0.05). Stratification by procedure revealed a significant difference in colon-related complications among those who underwent resection and anastomosis (DCL: 39% vs. SL: 18%, p < 0.05), whereas no differences were observed in those who underwent primary repair or resection and colostomy.

CONCLUSIONS Management of colonic wounds in the setting of DCL is associated with a relatively high incidence of complications. The excessive incidence of leak overall, and the morbidity particular to resection and anastomosis, give us pause. Although stoma construction is not without its own complications in the setting of DCL, it may be the safer alternative.


Biochemical Journal | 2009

Sodium nitrite therapy attenuates the hypertensive effects of HBOC-201 via nitrite reduction

Cilina Rodriguez; Dario A. Vitturi; Jin He; Marianne J. Vandromme; Angela Brandon; Anne Hutchings; Loring W. Rue; Jeffrey D. Kerby; Rakesh P. Patel

Hypertension secondary to scavenging of NO remains a limitation in the use of HBOCs (haemoglobin-based oxygen carriers). Recent studies suggest that nitrite reduction to NO by deoxyhaemoglobin supports NO signalling. In the present study we tested whether nitrite would attenuate HBOC-mediated hypertension using HBOC-201 (Biopure), a bovine cross-linked, low-oxygen-affinity haemoglobin. In a similar way to unmodified haemoglobin, deoxygenated HBOC-201 reduced nitrite to NO at rates directly proportional to the extent of deoxygenation. The functional importance of HBOC-201-dependent nitrite reduction was demonstrated using isolated aortic rings and a murine model of trauma, haemorrhage and resuscitation. In the former, HBOC-201 inhibited NO-donor- and nitrite-dependent vasodilation when oxygenated. However, deoxygenated HBOC-201 failed to affect nitrite-dependent vasodilation but still inhibited NO-donor-dependent vasodilation, consistent with a model in which nitrite reduction by deoxyHBOC-201 counters NO scavenging. Finally, resuscitation using HBOC-201, after trauma and haemorrhage, resulted in mild hypertension (approximately 5-10 mmHg). Administration of a single bolus of nitrite (30-100 nmol) at the onset of HBOC-201 resuscitation prevented this hypertension. Nitrite had no effect on mean arterial pressure during resuscitation with LR (lactated Ringer's solution), suggesting a role for nitrite-HBOC reactions in attenuating HBOC-mediated hypertension. Taken together, these data support the concept that nitrite can be used as an adjunct therapy to prevent HBOC-dependent hypertension.


Critical Care | 2008

Progesterone in traumatic brain injury: time to move on to phase III trials

Marianne J. Vandromme; Sherry M. Melton; Jeffrey D. Kerby

Several candidate neuroprotective agents have been shown in preclinical testing to improve outcomes following traumatic brain injury (TBI). Xiao and colleagues have performed an in-hospital, double-blind, randomized, controlled clinical trial of progesterone in the treatment of patients sustaining TBI, evaluating safety and long-term clinical outcomes. These data, combined with the results of the previously published ProTECT trial, show progesterone to be safe and potentially efficacious in the treatment of TBI. Larger phase III trials will be necessary to verify these results prior to clinical implementation. Clinical trials networks devoted to the study of TBI are vital to the timely clinical testing of these candidate agents and need to be supported.


Journal of Trauma-Injury Infection and Critical Care | 2011

Intubation patterns and outcomes in patients with computed tomography-verified traumatic brain injury.

Marianne J. Vandromme; Sherry M. Melton; Russell Griffin; Gerald McGwin; Jordan A. Weinberg; Michael Minor; Loring W. Rue; Jeffrey D. Kerby

BACKGROUND Studies evaluating traumatic brain injury (TBI) patients have shown an association between prehospital (PH) intubation and worse outcomes. However, previous studies have used surrogates, e.g., Glasgow Coma Scale (GCS) score ≤8 and Abbreviated Injury Scale (AIS) score ≥3, which may overestimate the true presence of TBI. This study evaluated the impact of PH intubation in patients with PH GCS score ≤8 and radiographically proven TBI.

METHODS Trauma patients routed to a Level I trauma center over a 3-year period with blunt injury and PH GCS score ≤8 were included. PH and in-hospital records were linked, and head computed tomography scans were assigned a Marshall Score (MS). Patients with TBI (MS >1) were categorized into groups based on intubation status (PH, emergency department [ED], and no intubation). Comparisons were made using analysis of variance and χ² statistics. Mortality differences, crude and adjusted risk ratios (RRs), and 95% confidence intervals (CIs) were calculated using proportional hazards modeling.

RESULTS Of 334 patients with PH GCS score ≤8, 149 (50%) had TBI by MS. Among the TBI patients, 42.7% were PH intubated, 47.7% were ED intubated, and 9.4% were not intubated during the initial resuscitation. Intubated patients had a lower ED GCS score (PH: 4.1 and ED: 5.9 vs. 14.0; p < 0.0001) compared with patients not intubated. PH-intubated patients also had a higher mean Injury Severity Score (38.0 vs. 33.7 vs. 23.5, p < 0.001) compared with ED-intubated and nonintubated patients. None of the nonintubated patients had an MS >2. Mortality was 46.9% among TBI patients who required PH intubation and 41.4% among ED-intubated patients. The crude RR of mortality for PH compared with ED intubation was 1.13 (95% CI, 0.68-1.89) and remained nonsignificant (RR, 0.68; 95% CI, 0.36-1.19) when adjusted for key markers of injury severity.

CONCLUSIONS Patients with PH GCS score ≤8 and proven TBI had a high overall rate of intubation (>90%). PH intubation appears to be a marker of more severe injury and conveyed no increased risk of mortality over ED intubation.


Journal of Trauma-Injury Infection and Critical Care | 2010

Computed tomography identification of latent pseudoaneurysm after blunt splenic injury: pathology or technology?

Jordan A. Weinberg; Mark E. Lockhart; Abhishek D. Parmar; Russell Griffin; Sherry M. Melton; Marianne J. Vandromme; Gerald McGwin; Loring W. Rue

BACKGROUND Serial computed tomography (CT) imaging of blunt splenic injury can identify the latent formation of splenic artery pseudoaneurysms (PSAs), potentially contributing to improved success in nonoperative management. However, it remains unclear whether the delayed appearance of such PSAs is truly pathophysiologic or attributable to imaging quality and timing. The objective of this study was to evaluate the influence of recent advancements in imaging technology on the incidence of latent PSA.

METHODS Consecutive patients with blunt splenic injury over 4.5 years were identified from our trauma registry. Follow-up CT was performed for all but low-grade injuries 24 to 48 hours after initial CT. Incidences of both early and latent PSA formation were reviewed and compared with respect to imaging technology (4-slice vs. ≥16-slice).

RESULTS A total of 411 patients were selected for nonoperative management of blunt splenic injury. Of these, 135 had imaging performed with 4-slice CT, and 276 had imaging performed with ≥16-slice CT. Mean follow-up was 75 days (range, 1-1178 days), and 362 patients (88%) had follow-up beyond 7 days. Comparing 4-slice CT with ≥16-slice CT, there were no significant differences in the incidence of early PSA (3.7% vs. 4.7%; p = 0.91) or latent PSA (2.2% vs. 2.9%; p = 0.90). In both groups, latent PSAs accounted for approximately 38% of all PSAs observed. Splenic injury grade on initial CT was not associated with latent PSA (p = 0.54). Overall, the failure rate of nonoperative management was 7.3%. Overall mortality was 4.6%. No mortalities were related to splenic or other intra-abdominal injury.

CONCLUSIONS The incidences of both early and latent PSA have remained remarkably stable despite advances in CT technology. This suggests that latent PSA is not a result of imaging technique but perhaps a true pathophysiologic phenomenon. Injury grade is unhelpful for predicting latent PSA formation.

Collaboration


Dive into Marianne J. Vandromme's collaborations.

Top Co-Authors

Jeffrey D. Kerby, University of Alabama at Birmingham
Jordan A. Weinberg, University of Tennessee Health Science Center
Loring W. Rue, University of Alabama at Birmingham
Gerald McGwin, University of Alabama at Birmingham
Russell Griffin, University of Alabama at Birmingham
Sherry M. Melton, University of Alabama at Birmingham
Christopher W. Snyder, University of Alabama at Birmingham
Sharon L. Tyra, University of Alabama at Birmingham
Donald A. Reiff, University of Alabama at Birmingham