Edward H. Kincaid
Wake Forest University
Publications
Featured research published by Edward H. Kincaid.
The Annals of Thoracic Surgery | 2000
Edward H. Kincaid; Timothy J. Jones; William R. Brown; Dixon M. Moody; Dwight D. Deal; John W. Hammon
BACKGROUND Microembolization during cardiopulmonary bypass (CPB) can be detected in the brain as lipid deposits that create small capillary and arteriolar dilations (SCADs) with ischemic injury and neuronal dysfunction. SCAD density is increased with the use of cardiotomy suction to scavenge shed blood. Our purpose was to determine whether various methods of processing shed blood during CPB decrease cerebral lipid microembolic burden. METHODS After hypothermic CPB (70 minutes), brain tissue from two groups of mongrel dogs (28 to 35 kg) was examined for the presence of SCADs. In the arterial filter (AF) group (n = 12), shed blood was collected in a cardiotomy suction reservoir and reinfused through the arterial circuit. Three different arterial line filters (Pall LeukoGuard, Pall StatPrime, Bentley Duraflo) were used alone and in various combinations. In the cell saver (CS) group (n = 12), shed blood was collected in a cell saver with intermittent processing (Medtronic autoLog model) or a continuous-action cell saver (Fresenius Continuous Auto Transfusion System) and reinfused with and without leukocyte filtration through the CPB circuit. RESULTS Mean SCAD density (SCAD/cm2) in the CS group was lower than in the AF group (11 +/- 3 vs 24 +/- 5, p = 0.02). There were no significant differences in SCAD density with leukocyte filtration or with the various arterial line filters. Mean SCAD density for the continuous-action cell saver was 8 +/- 2 versus 13 +/- 5 for the intermittent-action device. CONCLUSIONS Use of a cell saver to scavenge shed blood during CPB decreases cerebral lipid microembolization.
Journal of The American College of Surgeons | 1998
Edward H. Kincaid; Preston R. Miller; J. Wayne Meredith; Naeem Rahman; Michael C. Chang
BACKGROUND In trauma patients, the admission value of arterial base deficit stratifies injury severity, predicts complications, and is correlated with arterial lactate concentration. In theory, elevated base deficit and lactate concentrations after shock are related to oxygen transport imbalance at the cellular level. The purpose of this study was to test the hypothesis that an elevated base deficit in trauma patients is indicative of impaired systemic oxygen utilization and portends poor outcomes. METHODS This study was a retrospective analysis of a prospectively collected database. The study population included all patients admitted to the trauma intensive care unit at a Level 1 trauma center during a 12-month period who were monitored with a pulmonary artery catheter and serial measurements of lactate and base deficit, and who achieved a normal arterial lactate concentration (< 2.2 mmol/L) with resuscitation. The patients were divided into those who maintained a persistently high base deficit (> or = 4 mmol/L) and those who achieved a low base deficit (< 4 mmol/L) during resuscitation. RESULTS One hundred patients (mortality 20%) were monitored with a pulmonary artery catheter and achieved a normal arterial lactate concentration. The mean age (+/-SD) of the group was 37+/-17 years and the Injury Severity Score was 25+/-11. Subgroup analysis revealed that patients with a persistently high base deficit (n=26) had higher rates of multiple organ failure (35% versus 5%, p < 0.001) and death (50% versus 9%, p < 0.00001) compared with patients who achieved a low base deficit. Patients with a persistently high base deficit also had lower oxygen consumption (126+/-40 mL/m2 versus 156+/-30 mL/m2, p=0.01 at 48 hours) and a lower oxygen utilization coefficient (0.20+/-0.05 versus 0.24+/-0.03, p=0.01 at 48 hours) compared with patients with a low base deficit.
At 48 hours, both oxygen consumption (r=-0.44, p=0.002) and oxygen utilization (r=-0.46, p=0.001) had a significant negative correlation with base deficit. CONCLUSIONS In trauma patients, a persistently high arterial base deficit is associated with altered oxygen utilization and an increased risk of multiple organ failure and mortality. Serial monitoring of base deficit may be useful in assessing the adequacy of oxygen transport and resuscitation.
Chest | 2009
Todd Miano; Marc G. Reichert; Timothy T. Houle; Drew A. MacGregor; Edward H. Kincaid; David L. Bowton
BACKGROUND Stress ulcer prophylaxis (SUP) using ranitidine, a histamine H2 receptor antagonist, has been associated with an increased risk of ventilator-associated pneumonia. The proton pump inhibitor (PPI) pantoprazole is also commonly used for SUP. PPI use has been linked to an increased risk of community-acquired pneumonia. The objective of this study was to determine whether SUP with pantoprazole increases pneumonia risk compared with ranitidine in critically ill patients. METHODS The cardiothoracic surgery database at our institution was used to identify retrospectively all patients who had received SUP with pantoprazole or ranitidine, without crossover between agents. From January 1, 2004, to March 31, 2007, 887 patients were identified, with 53 patients excluded (pantoprazole, 30 patients; ranitidine, 23 patients). Our analysis compared the incidence of nosocomial pneumonia in 377 patients who received pantoprazole with 457 patients who received ranitidine. RESULTS Nosocomial pneumonia developed in 35 of the 377 patients (9.3%) who received pantoprazole, compared with 7 of the 457 patients (1.5%) who received ranitidine (odds ratio [OR], 6.6; 95% confidence interval [CI], 2.9 to 14.9). Twenty-three covariates were used to estimate the probability of receiving pantoprazole as measured by propensity score (C-index, 0.77). Using this score, pantoprazole and ranitidine patients were stratified according to their probability of receiving pantoprazole. After propensity-adjusted multivariable logistic regression, pantoprazole treatment was found to be an independent risk factor for nosocomial pneumonia (OR, 2.7; 95% CI, 1.1 to 6.7; p = 0.034). CONCLUSION The use of pantoprazole for SUP was associated with a higher risk of nosocomial pneumonia compared with ranitidine. This relationship warrants further study in a randomized controlled trial.
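The unadjusted odds ratio and confidence interval above follow directly from the reported 2x2 counts. As a quick check, a minimal sketch (small rounding differences from the published figures are expected):

```python
import math

# 2x2 table from the reported counts: pneumonia vs. no pneumonia
# under pantoprazole (35/377) and ranitidine (7/457).
a, b = 35, 377 - 35   # pantoprazole: events, non-events
c, d = 7, 457 - 7     # ranitidine: events, non-events

odds_ratio = (a * d) / (b * c)

# 95% Wald confidence interval, computed on the log-odds scale.
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.1f}, 95% CI {lo:.1f} to {hi:.1f}")
```

This reproduces the reported OR of 6.6 with a CI of roughly 2.9 to 15; the adjusted OR of 2.7 comes from the propensity-adjusted regression, which requires the patient-level covariates.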
Journal of Trauma-injury Infection and Critical Care | 1998
Preston R. Miller; Edward H. Kincaid; J. Wayne Meredith; Michael C. Chang
BACKGROUND The gastric intramucosal pH (pHi) and gastric mucosal-arterial CO2 gap (GAP) estimate visceral perfusion and predict outcome. Threshold values of these variables for use during resuscitation, however, remain poorly defined. The purpose of this study was to develop clinically derived cutoffs for both pHi and GAP for predicting death and multiple organ failure (MOF) in trauma patients. METHODS This was a cohort study of 114 consecutive trauma patients who had pHi determined at 24 hours after intensive care unit admission. The corresponding GAP for each of these values of pHi was obtained through chart review. Receiver operating characteristic curves were constructed for both pHi and GAP with respect to death and MOF. These curves were used to determine the value of each variable that maximized the sum of sensitivity and specificity in predicting outcome. Chi-square tests and odds ratios were used to determine if significant differences in outcome occurred above and below these cutoff values. RESULTS Of 114 patients who had pHi determined at 24 hours after admission, 108 had corresponding GAP values available. The values of pHi and GAP that maximized sensitivity and specificity were 7.25 and 18 mm Hg, respectively. The odds ratio for pHi versus death was 4.6 and for pHi versus MOF was 4.3. The odds ratios for GAP versus death and MOF were 2.9 and 3.3, respectively. CONCLUSION In trauma patients, the ability to predict death and MOF is maximized at values of pHi less than 7.25 and GAP greater than 18 mm Hg. These values represent clinically derived cutoffs that should be useful for evaluating the adequacy of intestinal perfusion during resuscitation.
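The cutoff-selection step above (choosing the point on a receiver operating characteristic curve that maximizes sensitivity + specificity, i.e. the Youden index) can be sketched as follows; the GAP values below are synthetic illustrations, not the study's data:

```python
import numpy as np

def youden_cutoff(values, outcomes):
    """Return the cutoff maximizing sensitivity + specificity,
    treating values ABOVE the cutoff as predicting the outcome."""
    best_cut, best_j = None, -np.inf
    for cut in np.unique(values):
        pred = values > cut
        tp = np.sum(pred & outcomes)        # true positives
        fn = np.sum(~pred & outcomes)       # false negatives
        tn = np.sum(~pred & ~outcomes)      # true negatives
        fp = np.sum(pred & ~outcomes)       # false positives
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        if sens + spec > best_j:
            best_j, best_cut = sens + spec, cut
    return best_cut

# Hypothetical GAP readings (mm Hg): survivors centered lower than deaths.
rng = np.random.default_rng(0)
values = np.concatenate([rng.normal(12, 4, 80), rng.normal(24, 6, 20)])
outcomes = np.concatenate([np.zeros(80, bool), np.ones(20, bool)])
print(youden_cutoff(values, outcomes))
```

For pHi the same procedure applies with the direction reversed (values below the cutoff predict the outcome).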
The Annals of Thoracic Surgery | 2003
Edward H. Kincaid; Michelle L. Monroe; David L. Saliba; Neal D. Kon; Wesley G. Byerly; Marc G. Reichert
BACKGROUND We examined the effects of preoperative administration of enoxaparin (ENOX), a low-molecular-weight heparin, on bleeding indices and transfusion rates in patients undergoing coronary artery bypass grafting (CABG). METHODS Patients undergoing isolated CABG between 1997 and 2002 who received preoperative ENOX or a continuous infusion of unfractionated heparin (UFH) were divided into three groups: continuous UFH, ENOX last administered more than 12 hours before surgery (ENOX > 12), and ENOX administered less than 12 hours before surgery (ENOX < 12). Perioperative hemoglobin values, transfusion rates, and bleeding complications were compared. RESULTS A total of 69, 58, and 34 patients comprised the UFH, ENOX > 12, and ENOX < 12 groups, respectively. Preoperative demographics and hematologic data were similar among the groups. Compared with the UFH group, the ENOX < 12 group had significantly lower postoperative hemoglobin values (9.6 +/- 1.3 g/dL versus 10.4 +/- 1.2 g/dL, p < 0.05), higher transfusion rates (73.5% versus 50.7%, p < 0.05), and required more total packed red cells per patient (882 +/- 809 mL versus 472 +/- 626 mL, p < 0.05). A nonsignificant increase was noted in the risk of returning to the operating room for bleeding in patients who had received ENOX compared with patients receiving UFH (6.5% versus 2.9%). CONCLUSIONS The preoperative use of ENOX less than 12 hours before CABG is associated with lower postoperative hemoglobin values and higher rates of transfusion than continuous UFH.
Journal of Trauma-injury Infection and Critical Care | 2001
Edward H. Kincaid; Michael C. Chang; R. W. Letton; John G. Chen; J. Wayne Meredith
BACKGROUND The base deficit, an important indicator of physiologic derangement after severe injury in adults, has not been specifically examined in the pediatric trauma population. The purpose of this study was to assess the ability of the admission base deficit to predict injury severity and outcome in the pediatric trauma population. METHODS The study group included all patients in the National Trauma Data Bank over a 2-year period aged 0 to 12 years with a base deficit (0 to -30 mEq/L) recorded from the emergency department. Age, presence of a severe closed head injury, and base deficit were analyzed with respect to mortality and other indicators of injury severity. RESULTS A total of 515 patients constituted the study group. Base deficit less than -4 mEq/L (p < 0.001) and the presence of a closed head injury (odds ratio, 3.8; p < 0.05) were predictors of mortality. For the group, an admission base deficit of -8 mEq/L corresponded to a probability of mortality of 25%. Significant correlations were found between base deficit and emergency department systolic blood pressure, Injury Severity Score, and Revised Trauma Score. There was no relationship between age and mortality. CONCLUSION In injured children, the admission base deficit reflects injury severity and predicts mortality. The probability of mortality increases precipitously in children with a base deficit less than -8 mEq/L; such values should alert the clinician to the presence of potentially lethal injuries or uncompensated shock.
Shock | 1998
Edward H. Kincaid; Preston R. Miller; J. Wayne Meredith; Michael C. Chang
Inadequate splanchnic perfusion, detected as a low gastric intramucosal pH (pHi), in the face of normal systemic perfusion predicts an increased risk for multiple organ failure after trauma. Although the exact etiology of this low pHi is unknown, angiotensin II is thought to be an important regulator of gut perfusion during and after resuscitation from shock. The purpose of this study was to determine whether enalaprilat, an angiotensin-converting enzyme inhibitor, improves gut perfusion in critically injured patients. To test this hypothesis, 18 trauma patients monitored with a nasogastric tonometer and a pulmonary artery catheter were enrolled in a prospective study. A single dose of enalaprilat, 0.625 mg, was given as an i.v. bolus or a 4 h infusion following systemic resuscitation. Pre- and postdrug tonometric and hemodynamic data, including cardiac index, mean arterial pressure, right ventricular end-diastolic volume index, systemic vascular resistance index, and oxygen transport variables were compared using the paired t test. Results demonstrate that pHi was significantly improved after 4 h (7.13 ± .04 to 7.19 ± .03, p = .03) and after 24 h compared with baseline (7.14 ± .04 to 7.25 ± .04, p = .04). Overall, pHi increased in 12 of 18 patients. No significant differences were observed in any of the studied hemodynamic or systemic perfusion variables including mean arterial pressure (92 ± 4 to 87 ± 4, p = .24) and oxygen delivery (669 ± 33 to 675 ± 32, p = .82). In examining the determinants of pHi, the intramucosal-arterial Pco2 difference was improved after enalaprilat administration (27 ± 6 to 17 ± 3 mmHg, p = .04) while no difference was observed in arterial bicarbonate (19.5 ± .7 to 19.7 ± .8, p = .90). Additionally, the change in pHi observed with enalaprilat correlated with predrug intramucosal-arterial Pco2 difference (r = .74, r2 = .55, p = .0005).
These results demonstrate that enalaprilat improves gut perfusion as measured by gastric tonometry in critically injured patients, and that this effect appears to be independent of changes in systemic perfusion.
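The tonometric pHi discussed above is conventionally derived from arterial bicarbonate and the tonometrically measured regional PCO2 via the Henderson-Hasselbalch equation. A minimal sketch with illustrative numbers; the arterial PCO2 of 40 mm Hg is an assumption, not a value from the study:

```python
import math

def gastric_phi(hco3_mmol_l, regional_pco2_mmhg):
    """Gastric intramucosal pH via Henderson-Hasselbalch, assuming
    intramucosal HCO3- equals arterial HCO3- (the standard tonometry
    assumption); 0.03 is the CO2 solubility coefficient."""
    return 6.1 + math.log10(hco3_mmol_l / (0.03 * regional_pco2_mmhg))

# Arterial HCO3- of 19.5 mmol/L (as in the abstract) with an assumed
# arterial PCO2 of 40 mm Hg, at the pre- and post-drug PCO2 gaps.
for gap in (27, 17):
    print(f"gap {gap} mm Hg -> pHi {gastric_phi(19.5, 40 + gap):.2f}")
```

With these illustrative inputs, narrowing the gap from 27 to 17 mm Hg raises the computed pHi by roughly 0.07 units, which is why a reduced intramucosal-arterial PCO2 difference, at unchanged bicarbonate, accounts for the pHi improvement.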
Tissue Engineering Part A | 2009
Dong Joon Lee; Julie Steen; James E. Jordan; Edward H. Kincaid; Neal D. Kon; Anthony Atala; Joel L. Berry; James J. Yoo
Although calcification remains the main clinical concern associated with bioprosthetic heart valve replacement surgery, there is evidence that tissue deterioration leads to thromboembolism. In such instances, measures that prevent thrombosis may be beneficial. To minimize thrombosis, endothelialization of the valve surface before implantation has been proposed. In this study we aimed to define the optimal flow parameters for the endothelialization of decellularized heart valves using endothelial progenitor cell (EPC)-derived endothelial cells (ECs). We assessed the thrombogenic characteristics of the endothelialized heart valve surface using a bioreactor. EPC-derived ECs were seeded on decellularized porcine valve scaffolds. A computer-controlled bioreactor system was used to determine the optimal flow rates. Successful endothelialization was achieved by preconditioning the cell-seeded valves with stepwise increases in volume flow rate up to 2 L/min for 7 days. We show that decellularized valve scaffolds seeded with EPC-derived ECs improved the anti-thrombotic properties of the valve, whereas the scaffolds without ECs escalated the coagulation process. This study demonstrates that preconditioning of ECs seeded on valve matrices using a bioreactor system is necessary for achieving uniform endothelialization of valve scaffolds, which may reduce thrombotic activity after implantation in vivo.
Echocardiography-a Journal of Cardiovascular Ultrasound and Allied Techniques | 2010
Leanne Groban; David Sanders; Timothy T. Houle; Benjamin L. Antonio; Edi C. Ntuen; David A. Zvara; Neal D. Kon; Edward H. Kincaid
Background: The tissue Doppler‐derived surrogate for left ventricular diastolic pressure, E/e′, has been used to prognosticate outcome in a variety of cardiovascular conditions. In this study, we determined the relationship of intraoperative E/e′ to the use of inotropic support, duration of mechanical ventilation (MV), length of intensive care unit stay (ICU‐LOS), and total hospital stay (H‐LOS) in patients requiring cardiac surgery. The records of 245 consecutive patients were retrospectively reviewed to obtain 205 patients who had intraoperative transesophageal echocardiography examinations prior to coronary artery bypass grafting and/or valvular surgery. Cox proportional hazards and logistic regression models were used to analyze the relation between intraoperative E/e′ or LVEF and early postoperative morbidity (H‐LOS, ICU‐LOS, and MV) and the probability that a patient would require inotropic support. With adjustments for other predictors (female gender, hypertension, diabetes, history of myocardial infarction, emergency surgery, renal failure, procedure type, and length of aortic cross‐clamp time), an elevated E/e′ ratio (≥8) was significantly associated with an increased ICU‐LOS (49 versus 41 median h, P = 0.037) and need for inotropic support (P = 0.002) while baseline LVEF was associated with inotropic support alone (P < 0.0001). These data suggest that the tissue Doppler‐derived index of left ventricular diastolic filling pressure may be a useful indicator for predicting early morbid events after cardiac surgery, and may even provide information beyond that of baseline LVEF. Further, patients with elevated preoperative E/e′ may need more careful peri‐ and postoperative management than those patients with E/e′ <8. (Echocardiography 2010;27:131‐138)
Journal of Trauma-injury Infection and Critical Care | 2001
Edward H. Kincaid; J. Wayne Meredith; Michael C. Chang
BACKGROUND While the right ventricular end-diastolic volume index (RVEDVI) has been shown to be a better indicator of preload than cardiac filling pressures, optimal values during resuscitation from trauma are unknown. This study examines right ventricular stiffness as a guide to optimal values of RVEDVI. METHODS This was a prospective study of 19 critically injured patients monitored with a volumetric pulmonary artery catheter during resuscitation. Per resuscitation protocol, the target RVEDVI was > or = 120 mL/m2. Sequential fluid boluses of 500 to 1000 mL were administered to obtain at least four values of RVEDVI and right ventricular end-diastolic pressure (estimated by central venous pressure [CVP]). For each patient, nonlinear regression was used to construct the ventricular compliance curve based on the equation CVP = a * e^(k * RVEDVI), where k is the coefficient of chamber stiffness. RESULTS Overall, the derived compliance curves had excellent fit with the theoretical equation (mean R2, 0.95 +/- 0.04). Mean k was 0.043 +/- 0.012 (range, 0.029-0.067). For each patient, mean RVEDVI during resuscitation was significantly correlated with k (R2 = 0.75, p < 10^-5), indicating that chamber stiffness, measured during initial fluid administration, may be used to determine RVEDVI during the ensuing resuscitation. CONCLUSION In critically injured patients, bedside assessment of right ventricular compliance is possible and may help determine optimal values of RVEDVI during resuscitation.
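The compliance-curve fit described above can be approximated by log-linearizing CVP = a * e^(k * RVEDVI) and applying ordinary least squares. The study used true nonlinear regression; the log-linear fit and the paired readings below are a simplified sketch on hypothetical data:

```python
import numpy as np

# Hypothetical bedside readings (not the study's data): paired RVEDVI
# (mL/m2) and CVP (mm Hg) values recorded during sequential fluid boluses.
rvedvi = np.array([80.0, 95.0, 110.0, 125.0])
cvp = np.array([5.0, 8.0, 12.0, 19.0])

# Log-linearize: ln(CVP) = ln(a) + k * RVEDVI, then fit a straight line.
k, ln_a = np.polyfit(rvedvi, np.log(cvp), 1)
a = np.exp(ln_a)
print(f"a = {a:.2f} mm Hg, chamber stiffness k = {k:.4f} per mL/m2")
```

For these illustrative readings the fitted k is about 0.029, at the lower end of the 0.029 to 0.067 range reported for the study's patients; a stiffer ventricle (larger k) yields a steeper CVP rise per unit of volume loading.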