Publication


Featured research published by David A. Spain.


Critical Care Medicine | 1999

Enteral tube feeding in the intensive care unit: factors impeding adequate delivery.

Stephen A. McClave; Leslie K. Sexton; David A. Spain; Joyce L. Adams; Nancy A. Owens; Mary Beth Sullins; Barbara S. Blandford; Harvy L. Snider

OBJECTIVE To evaluate the factors that affect the delivery of enteral tube feeding. DESIGN Prospective study. SETTING Medical intensive care units (ICU) and coronary care units at two university-based hospitals. PATIENTS Forty-four medical ICU/coronary care unit patients (mean age, 57.8 yrs; 70% male) who were to receive nothing by mouth and were placed on enteral tube feeding. INTERVENTIONS Rate of enteral tube feeding ordered, actual volume delivered, patient position, residual volume, flush volume, presence of blue food coloring in oropharynx, and stool frequency were recorded every 4 hrs. Duration and reason for cessation of enteral tube feeding were documented. MEASUREMENTS AND MAIN RESULTS Physicians ordered a daily mean volume of enteral tube feeding that was 65.6% of goal requirements, but an average of only 78.1% of the volume ordered was actually infused. Thus, patients received a mean volume of enteral tube feeding for all 339 days of infusion that was 51.6% of goal (range, 15.1% to 87.1%). Only 14% of patients reached > or = 90% of goal feeding (for a single day) within 72 hrs of the start of enteral tube feeding infusion. Of 24 patients weighed before and after, 54% were noted to lose weight on enteral tube feeding. Declining albumin levels through the enteral tube feeding period correlated significantly with decreasing percent of goal calories infused (p = .042; r2 = .13). Diarrhea occurred in 23 patients (52.3%) for a mean 38.2% of enteral tube feeding days. In >1490 bedside evaluations, patients were observed to be in the supine position only 0.45% of the time, residual volume of >200 mL was found 2.8% of the time, and blue food coloring was found in the oropharynx 5.1% of the time. Despite this, cessation of enteral tube feeding occurred in 83.7% of patients for a mean 19.6% of the potential infusion time. Sixty-six percent of the enteral tube feeding cessations were judged to be attributable to avoidable causes. CONCLUSIONS The current manner in which enteral tube feeding is delivered in the ICU results in grossly inadequate nutritional support. Barely one half of patient caloric requirements are met because of underordering by physicians and reduced delivery through frequent and often inappropriate cessation of feedings.


Journal of the American College of Surgeons | 2009

Massive Transfusion Protocols: The Role of Aggressive Resuscitation Versus Product Ratio in Mortality Reduction

Daniel J. Riskin; Thomas C. Tsai; Loren Riskin; Tina Hernandez-Boussard; Mary-Anne Purtill; Paul M. Maggio; David A. Spain; Susan I. Brundage

BACKGROUND Exsanguinating hemorrhage necessitating massive blood product transfusion is associated with high mortality rates. Recent data suggest that altering the fresh frozen plasma to packed red blood cell ratio (FFP:PRBC) results in significant mortality reductions. Our purpose was to evaluate mortality and blood product use in the context of a newly initiated massive transfusion protocol (MTP). STUDY DESIGN In July 2005, our American College of Surgeons-verified Level I trauma center implemented an MTP supporting a 1:1.5 FFP:PRBC ratio, improved communications, and enhanced systems flow to optimize rapid blood product availability. During the 4 years surrounding protocol implementation, we reviewed data on trauma patients directly admitted through the emergency department and requiring 10 or more units PRBCs during the first 24 hours. RESULTS For the 2 years before and subsequent to MTP initiation, there were 4,223 and 4,414 trauma activations, of which 40 and 37 patients, respectively, met study criteria. The FFP:PRBC ratios were identical, at 1:1.8 and 1:1.8 (p = 0.97). Despite no change in FFP:PRBC ratio, mortality decreased from 45% to 19% (p = 0.02). Other significant findings included decreased mean time to first product: cross-matched RBCs (115 to 71 minutes; p = 0.02), FFP (254 to 169 minutes; p = 0.04), and platelets (418 to 241 minutes; p = 0.01). CONCLUSIONS MTP implementation is associated with mortality reductions that have been ascribed principally to increased plasma use and decreased FFP:PRBC ratios. Our study found a significant reduction in mortality despite unchanged FFP:PRBC ratios and equivalent overall mean numbers of transfusions. Our data underscore the importance of expeditious product availability and emphasize that massive transfusion is a complex process in which product ratio and time to transfusion represent only the beginning of understanding.


Annals of Surgery | 2000

Evolution in the Management of Hepatic Trauma: A 25-Year Perspective

J. David Richardson; Glen A. Franklin; James K. Lukan; Eddy H. Carrillo; David A. Spain; Frank B. Miller; Mark A. Wilson; Hiram C. Polk; Lewis M. Flint

Objective: To define the changes in demographics of liver injury during the past 25 years and to document the impact of treatment changes on death rates. Summary Background Data: No study has presented a long-term review of a large series of hepatic injuries, documenting the effect of treatment changes on outcome. A 25-year review from a concurrently collected database of liver injuries documented changes in treatment and outcome. Methods: A database of hepatic injuries from 1975 to 1999 was studied for changes in demographics, treatment patterns, and outcome. Factors potentially responsible for outcome differences were examined. Results: A total of 1,842 liver injuries were treated. Blunt injuries have dramatically increased; the proportion of major injuries is approximately 16% annually. Nonsurgical therapy is now used in more than 80% of blunt injuries. The death rates from both blunt and penetrating trauma have improved significantly through each successive decade of the study. The improved death rates are due to decreased death from hemorrhage. Factors responsible include fewer major venous injuries requiring surgery, improved outcome with vein injuries, better results with packing, and effective arterial hemorrhage control with arteriographic embolization. Conclusions: The treatment and outcome of liver injuries have changed dramatically in 25 years. Multiple modes of therapy are available for hemorrhage control, which has improved outcome.


Critical Care Medicine | 2005

Poor validity of residual volumes as a marker for risk of aspiration in critically ill patients.

Stephen A. McClave; James K. Lukan; James A. Stefater; Cynthia C. Lowen; Stephen W. Looney; Paul J. Matheson; Kevin Gleeson; David A. Spain

Background and Aims: Elevated residual volumes (RV), considered a marker for the risk of aspiration, are used to regulate the delivery of enteral tube feeding. We designed this prospective study to validate such use. Methods: Critically ill patients undergoing mechanical ventilation in the medical, coronary, or surgical intensive care units in a university-based tertiary care hospital, placed on intragastric enteral tube feeding through nasogastric or percutaneous endoscopic gastrostomy tubes, were included in this study. Patients were fed Probalance (Nestlé USA) to provide 25 kcal/kg per day (to which 109 yellow microscopic beads and 4.5 mL of blue food coloring per 1,500 mL were added). Patients were randomized to one of two groups based on management of RV: cessation of enteral tube feeding for RV >400 mL in study patients or for RV >200 mL in controls. Acute Physiology and Chronic Health Evaluation (APACHE) III, bowel function score, and aspiration risk score were determined. Bedside evaluations were done every 4 hrs for 3 days to measure RV, to detect blue food coloring, to check patient position, and to collect secretions from the trachea and oropharynx. Aspiration/regurgitation events were defined by the detection of yellow color in tracheal/oropharyngeal samples by fluorometry. Analysis was done by analysis of variance, Spearman’s correlation, Student’s t-test, Tukey’s method, and Cochran-Armitage test. Results: Forty patients (mean age, 44.6 yrs; range, 18–88 yrs; 70% male; mean APACHE III score, 40.9 [range, 12–85]) were evaluated (21 on nasogastric, 19 on percutaneous endoscopic gastrostomy feeds) and entered into the study. Based on 1,118 samples (531 oral, 587 tracheal), the mean frequency of regurgitation per patient was 31.3% (range, 0% to 94%), with a mean RV for all regurgitation events of 35.1 mL (range, 0–700 mL). The mean frequency of aspiration per patient was 22.1% (range, 0% to 94%), with a mean RV for all aspiration events of 30.6 mL (range, 0–700 mL). The median RV for both regurgitation and aspiration events was 5 mL. Over a wide range of RV, increasing from 0 mL to >400 mL, the frequency of regurgitation and aspiration did not change appreciably. Aspiration risk and bowel function scores did not correlate with the incidence of aspiration or regurgitation. Blue food coloring was detected on only three of the 1,118 (0.27%) samples. RV was ≤50 mL on 84.1% and >400 mL on 1.4% of bedside evaluations. Sensitivities for detecting aspiration per designated RV were as follows: 400 mL = 1.5%; 300 mL = 2.3%; 200 mL = 3.0%; and 150 mL = 4.5%. Low RV did not assure the absence of events, because the frequency of aspiration was 23.0% when RV was <150 mL. Raising the designated RV for cessation of enteral tube feeding from 200 mL to 400 mL did not increase the risk, because the frequency of aspiration was no different between controls (21.6%) and study patients (22.6%). The frequency of regurgitation was significantly less for patients with percutaneous endoscopic gastrostomy tubes compared with those with nasogastric tubes (20.3% vs. 40.7%, respectively; p = .046). There was no correlation between the incidence of pneumonia and the frequency of regurgitation or aspiration. Conclusions: Blue food coloring should not be used as a clinical monitor. Converting nasogastric tubes to percutaneous endoscopic gastrostomy tubes may be a successful strategy to reduce the risk of aspiration. No appropriate designated RV level to identify aspiration could be derived as a result of poor sensitivity over a wide range of RV. Study results do not support the conventional use of RV as a marker for the risk of aspiration.


Journal of Parenteral and Enteral Nutrition | 2002

North American Summit on Aspiration in the Critically Ill Patient: Consensus Statement

Stephen A. McClave; Mark T. DeMeo; Mark H. DeLegge; James A. DiSario; Daren K. Heyland; James P. Maloney; Norma A. Metheny; Frederick A. Moore; James S. Scolapio; David A. Spain; Gary P. Zaloga

Aspiration is the leading cause of pneumonia in the intensive care unit and the most serious complication of enteral tube feeding (ETF). Although aspiration is common, the clinical consequences are variable because of differences in the nature of the aspirated material and individual host responses. A number of defense mechanisms normally present in the upper aerodigestive system that protect against aspiration become compromised by clinical events that occur frequently in the critical care setting, subjecting the patient to increased risk. The true incidence of aspiration has been difficult to determine in the past because of vague definitions, poor assessment monitors, and varying levels of clinical recognition. Standardization of terminology is an important step in helping to define the problem, design appropriate research studies, and develop strategies to reduce risk. Traditional clinical monitors of glucose oxidase strips and blue food coloring (BFC) should no longer be used. A modified approach to use of gastric residual volumes and identification of clinical factors that predispose to aspiration allow for risk stratification and an algorithmic approach to the management of the critically ill patient on ETF. Although the patient with confirmed aspiration should be monitored for clinical consequences and receive supportive pulmonary care, ETF may be continued when accompanied by appropriate steps to reduce risk of further aspiration. Management strategies for treating aspiration pneumonia are based on degree of diagnostic certainty, time of onset, and host factors.


Journal of Trauma-Injury Infection and Critical Care | 2001

Penetrating colon injuries requiring resection: Diversion or primary anastomosis? An AAST prospective multicenter study

Demetrios Demetriades; James Murray; Linda Chan; Carlos A. Ordoñez; Douglas M. Bowley; Kimberly Nagy; Edward E. Cornwell; George C. Velmahos; Nestor Munoz; Costas Hatzitheofilou; C. W. Schwab; Aurelio Rodriguez; Carol Cornejo; Kimberly A. Davis; Nicholas Namias; David H. Wisner; Rao R. Ivatury; Ernest E. Moore; Jose Acosta; Kimball I. Maull; Michael H. Thomason; David A. Spain; Richard P. Gonzalez; John R. Hall; Harvey Sugarman

BACKGROUND The management of colon injuries that require resection is an unresolved issue because the existing practices are derived mainly from class III evidence. Because of the inability of any single trauma center to accumulate enough cases for meaningful statistical analysis, a multicenter prospective study was performed to compare primary anastomosis with diversion and identify the risk factors for colon-related abdominal complications. METHODS This was a prospective study from 19 trauma centers and included patients with colon resection because of penetrating trauma, who survived at least 72 hours. Multivariate logistic regression analysis was used to compare outcomes in patients with primary anastomosis or diversion and identify independent risk factors for the development of abdominal complications. RESULTS Two hundred ninety-seven patients fulfilled the criteria for inclusion and analysis. Overall, 197 patients (66.3%) were managed by primary anastomosis and 100 (33.7%) by diversion. The overall colon-related mortality was 1.3% (four deaths in the diversion group, no deaths in the primary anastomosis group, p = 0.012). Colon-related abdominal complications occurred in 24% of all patients (primary repair, 22%; diversion, 27%; p = 0.373). Multivariate analysis including all potential risk factors with p values < 0.2 identified three independent risk factors for abdominal complications: severe fecal contamination, transfusion of > or = 4 units of blood within the first 24 hours, and single-agent antibiotic prophylaxis. The type of colon management was not found to be a risk factor. Comparison of primary anastomosis with diversion using multivariate analysis adjusting for the above three identified risk factors or the risk factors previously described in the literature (shock at admission, delay > 6 hours to operating room, penetrating abdominal trauma index > 25, severe fecal contamination, and transfusion of > 6 units blood) showed no statistically significant difference in outcome. Similarly, multivariate analysis and comparison of the two methods of colon management in high-risk patients showed no difference in outcome. CONCLUSION The surgical method of colon management after resection for penetrating trauma does not affect the incidence of abdominal complications, irrespective of associated risk factors. Severe fecal contamination, transfusion of > or = 4 units of blood within the first 24 hours, and single-agent antibiotic prophylaxis are independent risk factors for abdominal complications. In view of these findings, the reduced quality of life, and the need for a subsequent operation in colostomy patients, primary anastomosis should be considered in all such patients.


Journal of Trauma-Injury Infection and Critical Care | 1999

Interventional techniques are useful adjuncts in nonoperative management of hepatic injuries.

Eddy H. Carrillo; David A. Spain; Christopher D. Wohltmann; Robert E. Schmieg; Phillip W. Boaz; Frank B. Miller; J. David Richardson; Thomas M. Scalea; S. Brotman; A. A. Meyer; R. I. Gross; S. N. Parks; John R. Hall; H. G. Cryer; R. J. Mullins

BACKGROUND Nonoperative management has become the standard of care for hemodynamically stable patients with complex liver trauma. The benefits of such treatment may be obviated, though, by complications such as arteriovenous fistulas, bile leaks, intrahepatic or perihepatic abscesses, and abnormal communications between the vascular system and the biliary tree (hemobilia and bilhemia). METHODS We reviewed the hospital charts of 135 patients with blunt liver trauma who were treated nonoperatively between July 1995 and December 1997. RESULTS Thirty-two patients (24%) developed complications that required additional interventional treatment. Procedures less invasive than celiotomy were often performed, including arteriography and selective embolization in 12 patients (37%), computed tomography-guided drainage of infected collections in 10 patients (31%), endoscopic retrograde cholangiopancreatography with endoscopic sphincterotomy and biliary endostenting in 8 patients (25%), and laparoscopy in 2 patients (7%). Overall, nonoperative interventional procedures were used successfully to treat these complications in 27 patients (85%). CONCLUSION In hemodynamically stable patients with blunt liver trauma, nonoperative management is the current treatment of choice. In patients with severe liver injuries, however, complications are common. Most untoward outcomes can be successfully managed nonoperatively using alternative therapeutic options. Early use of these interventional procedures is advocated in the initial management of the complications of severe blunt liver trauma.


American Journal of Surgery | 2001

A multicenter evaluation of whether gender dimorphism affects survival after trauma.

Christopher D. Wohltmann; Glen A. Franklin; Phillip W. Boaz; Fred A. Luchette; Paul A. Kearney; J. David Richardson; David A. Spain

BACKGROUND The frequency of women who have sustained severe injuries has increased over the past 30 years. The purpose of this study was to evaluate whether severely injured women have a survival advantage over men. To address this issue, we undertook a multicenter evaluation of the effects of gender dimorphism on survival in trauma patients. METHODS Patient information was collected from the databases of three level I trauma centers. We included all consecutive patients who were admitted to these centers over a 4-year period. We evaluated the effects of age, gender, mechanism of injury, pattern of injury, Abbreviated Injury Score (AIS), and Injury Severity Score (ISS) on survival. RESULTS A total of 20,261 patients were admitted to the three trauma centers. Women who were younger than 50 years of age (mortality rate 5%) experienced a survival advantage over men (mortality rate 7%) of equal age (odds ratio 1.27, P <0.002). This advantage was most notably found in the more severely injured (ISS >25) group (mortality rate 28% in women versus 33% in men). This difference was not attributable to mechanism of injury, severity of injury, or pattern of injury. CONCLUSIONS Severely injured women younger than 50 years of age have a survival advantage when compared with men of equal age and injury severity. Young men have a 27% greater chance of dying than women after trauma. We conclude that gender dimorphism affects the survival of patients after trauma.


Journal of Trauma-Injury Infection and Critical Care | 2005

Transfusions result in pulmonary morbidity and death after a moderate degree of injury.

Martin A. Croce; Elizabeth A. Tolley; Jeffrey A. Claridge; Timothy C. Fabian; Roxanne R. Roberts; David A. Spain; James G. Tyburski

BACKGROUND Prior studies have suggested that blood transfusion (Tx) is associated with infectious and respiratory complications in trauma patients. However, these studies are difficult to interpret because of small sample size, inclusion of severely injured patients in traumatic shock, and combination of a variety of unrelated low-morbidity/mortality infections, such as wound, catheter-related, and urinary tract infection as outcomes. To eliminate these confounding variables, this study evaluates the association between delayed Tx and serious, well-defined respiratory complications (ventilator-associated pneumonia [VAP] and acute respiratory distress syndrome [ARDS]) and death in a cohort of intensive care unit (ICU) admissions with less severe (Injury Severity Score [ISS] < 25) blunt trauma who received no Tx within the initial 48 hours after admission. METHODS Patients with blunt injury and ISS < 25 admitted to the ICU over a 7-year period were identified from the registry and excluded if within 48 hours from admission they received any Tx or if they died. VAP required quantitative bronchoalveolar lavage culture (> or = 10^5 colonies/mL), and ARDS required PaO2/FiO2 ratio < 200 mm Hg, no congestive heart failure, diffuse bilateral infiltrates, and peak airway pressure > 50 cm H2O for diagnosis. Outcomes were VAP, ARDS, and death. RESULTS Nine thousand one hundred twenty-six patients with blunt injury were admitted to the ICU, and 5,260 (58%) met study criteria (72% male). Means for age, ISS, and Glasgow Coma Scale score were 39, 12, and 14, respectively. There were 778 (15%) who received delayed Tx. Incidences of VAP, ARDS, and death were 5%, 1%, and 1%, respectively. Logistic regression analysis identified age, base excess, chest Abbreviated Injury Scale score, ISS, and any transfusion as significant predictors for VAP; chest Abbreviated Injury Scale score and transfusion as significant predictors for ARDS; and age and transfusion as significant predictors for death. CONCLUSION Delayed transfusion is independently associated with VAP, ARDS, and death in trauma patients regardless of injury severity. These data mandate a judicious transfusion policy after resuscitation and emphasize the need for safe and effective blood substitutes and transfusion alternatives.


Journal of Trauma-Injury Infection and Critical Care | 1996

Predicting the Need to Pack Early for Severe Intra-abdominal Hemorrhage

J. R. Garrison; J. David Richardson; A. S. Hilakos; David A. Spain; Mark A. Wilson; Frank B. Miller; Robert L. Fulton; D. E. Barker; M. F. Rotondo; David H. Wisner; D. V. Feliciano; S. M. Steinberg; Matthew J. Wall

OBJECTIVE To determine if the decision to pack for hemorrhage could be refined. MATERIALS AND METHODS Seventy consecutive trauma patients for whom packing was used to control hemorrhage were studied. The patients had liver injuries, abdominal vascular injuries, and bleeding retroperitoneal hematomas. Preoperative variables were analyzed and survivors compared with nonsurvivors. RESULTS Packing controlled hemorrhage in 37 (53%) patients. Significant differences (p < 0.05) between survivors and nonsurvivors were Injury Severity Score (29 vs. 38), initial pH (7.3 vs. 7.1), platelet count (229,000 vs. 179,000/mm3), prothrombin time (14 vs. 22 seconds), partial thromboplastin time (42 vs. 69 seconds), and duration of hypotension (50 vs. 90 minutes). Nonsurvivors received 20 units of packed red blood cells before packing compared to 13 units for survivors. CONCLUSION Patients who suffer severe injury, hypothermia, refractory hypotension, coagulopathy, and acidosis need early packing if they are to survive. Failure to control hemorrhage is related to severity of injury and delay in the use of pack tamponade. A specific protocol that mandates packing when parameters reach a critical limit should be considered.

Collaboration


Dive into David A. Spain's collaborations.

Top Co-Authors


Mark A. Wilson

University of Pittsburgh


J. David Richardson

University of Texas Health Science Center at San Antonio
