Carl E. Haisch
East Carolina University
Publications
Featured research published by Carl E. Haisch.
Transplantation | 2013
Matthew J. Everly; Lorita M. Rebellato; Carl E. Haisch; Miyuki Ozawa; K. Parker; Kimberly P. Briley; Paul G. Catrou; Paul Bolin; W. Kendrick; S. Kendrick; Robert C. Harland; Paul I. Terasaki
Background To date, limited information is available describing the incidence and impact of de novo donor-specific anti–human leukocyte antigen (HLA) antibodies (dnDSA) in the primary renal transplant patient. This report details the dnDSA incidence and actual 3-year post-dnDSA graft outcomes. Methods The study includes 189 consecutive nonsensitized, non-HLA-identical patients who received a primary kidney transplant between March 1999 and March 2006. Protocol testing for DSA via LABScreen single antigen beads (One Lambda) was done before transplantation and at 1, 3, 6, 9, and 12 months after transplantation, then annually and when clinically indicated. Results Of 189 patients, 47 (25%) developed dnDSA within 10 years. The 5-year posttransplantation cumulative incidence was 20%, with the largest proportion of patients developing dnDSA in the first posttransplantation year (11%). Young patients (18–35 years old at transplantation), deceased-donor transplant recipients, pretransplantation HLA (non-DSA)–positive patients, and patients with a DQ mismatch were the most likely to develop dnDSA. From DSA appearance, 9% of patients lost their graft at 1 year. Actual 3-year death-censored post-dnDSA graft loss was 24%. Conclusion We conclude that 11% of the patients without detectable DSA at transplantation will have detectable DSA at 1 year, and over the next 4 years, the incidence of dnDSA will increase to 20%. After dnDSA development, 24% of the patients will fail within 3 years. Given these findings, future trials are warranted to determine if treatment of dnDSA-positive patients can prevent allograft failure.
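The cumulative-incidence figures quoted in the abstract above come from standard survival-analysis methods with censoring. As an illustrative aside, a minimal Kaplan-Meier-style estimator can be sketched in plain Python; the function name and the toy data below are invented for illustration, and the study itself would have used dedicated statistical software:

```python
# Minimal Kaplan-Meier-style estimator of cumulative incidence from
# (time, event) pairs. Illustrative sketch only: all names and data here
# are hypothetical, not taken from the study above.

def cumulative_incidence(times, events):
    """Return [(t, cumulative incidence)] via 1 - Kaplan-Meier survival.

    times  : follow-up time for each patient
    events : 1 if the event (e.g. dnDSA detection) occurred, 0 if censored
    """
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    survival = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        d = 0  # events observed at time t
        c = 0  # patients censored at time t
        # Group all patients sharing the same follow-up time.
        while i < len(pairs) and pairs[i][0] == t:
            if pairs[i][1]:
                d += 1
            else:
                c += 1
            i += 1
        if d:
            survival *= 1.0 - d / n_at_risk
            curve.append((t, 1.0 - survival))
        n_at_risk -= d + c
    return curve

# Hypothetical toy data: 10 patients, follow-up in years.
times = [1, 1, 2, 3, 3, 4, 5, 5, 5, 5]
events = [1, 0, 1, 0, 1, 0, 0, 1, 0, 0]
print(cumulative_incidence(times, events))
```

Censored patients reduce the number at risk without counting as events, which is why the estimated incidence can keep rising even as fewer patients remain under observation.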
American Journal of Transplantation | 2007
Quanzong Mao; Paul I. Terasaki; Junchao Cai; Kimberly P. Briley; Paul G. Catrou; Carl E. Haisch; Lorita M. Rebellato
Longitudinal studies were conducted over a five‐year period for HLA antibodies on 493 sera tested from 54 kidney transplant patients. HLA single antigen beads were employed to establish donor specificity of the antibodies. Only 3 of 22 patients without antibodies rejected a graft in contrast to 17 out of 32 patients with posttransplant antibodies (p = 0.003). Using a serum creatinine value of 4.0 mg/dL as the cut‐off for a failed graft, 4 of 22 patients without antibodies failed compared to 21 of 32 with antibodies (p = 0.0006). Among patients with donor‐specific antibodies (DSA), 13 of 15 failed (p = 0.000004). Even among patients with non‐donor-specific antibodies (NDSA), 8 of 17 failed (p = 0.05). Among patients who could be identified as making de novo antibodies (since they developed antibodies while not having antibodies for more than six months after transplantation), 6 of 11 failed (p = 0.03). Sequential testing for HLA antibodies shows that antibodies appear prior to a rise in serum creatinine and subsequent graft failure. The very strong association between the production of HLA antibodies after transplantation and graft failure indicates the importance of monitoring for posttransplant HLA antibodies.
Transplantation | 1994
Judith M. Thomas; F. M. Carver; J. Kasten-Jolly; Carl E. Haisch; Lorita M. Rebellato; U. Gross; S. J. Vore; Francis T. Thomas
Infusing the DR-/dim fraction of bone marrow cells (BMC) from an allogeneic kidney donor into rabbit antithymocyte globulin-treated transplant recipients delivers a tolerogenic signal, leading to functional allograft tolerance in rhesus monkeys without additional drug therapy. Our updated results in an expanded series show a median 131-day graft survival of recipients given DR-/dim donor BMC with a 23% 1-year survival (P < 0.00001 vs. rabbit antithymocyte globulin controls). Removing DRbright cells from donor BMC appeared to have a significant effect (P < 0.05). We have further investigated the tolerogenic mechanism within the experimental framework of the veto hypothesis in this preclinical model. In limiting dilution assays, we demonstrated the donor specificity of clonal inactivation of CTL precursors (CTLp) after in vitro or in vivo exposure to DR-/dim donor BMC, confirming specific tolerance. Additionally, in vitro studies confirmed the allogeneic specificity of CTLp inactivation in 3-cell MLR assays; minimal bystander effects were seen on normal CTLp responses to third party stimulator cells, while CTLp responses to the BMC donor's cells were abrogated in the same cultures. BMC mediating the veto effect were found to be resistant to L-leucyl-L-leucine methyl ester (Leu-leu-OMe), which excluded BMC-mediated cytotoxicity by NK or lymphokine-activated killer cells, CTL, or activated macrophages. In contrast, veto activity was abolished if the BMC were pretreated with either high-dose UV-B light irradiation, mitomycin, or gamma-irradiation, indicating that BMC contained a UV-B-sensitive precursor of the veto effector, and that a proliferative step separated the two. Irradiation of DR-/dim donor BMC or administration of cyclophosphamide after infusion of nonirradiated BMC prevented the tolerogenic effect.
Only recipients given nonirradiated DR-/dim donor BMC demonstrated PBL chimerism, which was associated with functional deletion of antidonor CTLp and duration of graft survival. The Leu-leu-OMe resistance and the other properties of the allogeneic monkey CD3- CD2+ CD8+ BMC subpopulation that exhibits tolerance-promoting activity in vitro and in vivo lead us to postulate that a donor BMC-derived precursor population, possibly a dendritic cell population, may induce allogeneic unresponsiveness in this model.
Transplantation | 1997
Judith M. Thomas; David M. Neville; Juan L. Contreras; Devin E. Eckhoff; Gang Meng; Andrew L. Lobashevsky; Pei X. Wang; Zhi Q. Huang; Kathryn M. Verbanac; Carl E. Haisch; Francis T. Thomas
A major challenge in clinical transplantation today is to design a practical and effective protocol for tolerance induction compatible with cadaver organ transplantation. A preclinical rhesus monkey kidney allograft model using immediate peritransplant anti-CD3 immunotoxin (anti-CD3-IT) and donor bone marrow (DBM) is shown here to induce operational tolerance with prolonged graft survival in the absence of chronic immunosuppressive drugs. Bone marrow harvested from the kidney donor was depleted of mature alloantigen-presenting cells and T cells by removing DR(bright) cells and CD3(bright) cells, respectively. In outbred, major histocompatibility complex-incompatible donor-recipient pairs with high pretransplant mixed lymphocyte response and cytotoxic T lymphocyte precursor activity, four of six allografts survived for periods of 120 days to >1.5 years. Graft acceptance after peritransplant treatment followed robust elimination of both peripheral blood T cells and lymph node T cells. In most recipients given anti-CD3-IT and DBM infusion, anti-donor immunoglobulin G responses were completely inhibited. Microchimerism was observed in all recipients studied, including those not given DBM, but levels of microchimerism did not correlate with graft survival. Anti-CD3-IT induction in combination with modified DBM protocols such as the depletion of mature T cells and DR(bright) antigen-presenting cells may offer new opportunities to improve clinical tolerance protocols beyond those attempted in the clinic to date. Overall, these results with anti-CD3-IT show promise for development of cadaver transplant tolerance induction.
Transplantation | 2013
Maria Cecilia S. Freitas; Lorita M. Rebellato; Miyuki Ozawa; Anh Nguyen; Nori Sasaki; Matthew J. Everly; Kimberly P. Briley; Carl E. Haisch; Paul Bolin; K. Parker; W. Kendrick; S. Kendrick; Robert C. Harland; Paul I. Terasaki
Background Anti–HLA-DQ antibodies are the predominant HLA class II donor-specific antibodies (DSAs) after transplantation. Recently, de novo DQ DSA has been associated with worse allograft outcomes. The aim of this study was to determine the further complement-binding characteristics of the most harmful DQ DSA. Methods Single-antigen bead technology was used to screen 284 primary kidney transplant recipients for the presence of posttransplantation DQ DSA. Peak DSA sera of 34 recipients with only de novo DQ DSA and of 20 recipients with de novo DQ plus other DSAs were further analyzed by a modified single-antigen bead assay using immunoglobulin (Ig)-G subclass-specific reporter antibodies and a C1q-binding assay. Results Compared with recipients who did not have DSA, those with de novo persistent DQ-only DSA and with de novo DQ plus other DSAs had more acute rejection (AR) episodes (22%, P=0.005; and 36%, P=0.0009), increased risk of allograft loss (hazard ratio, 3.7, P=0.03; and hazard ratio, 11.4, P=0.001), and a lower 5-year allograft survival. De novo DQ-only recipients with AR had more IgG1/IgG3 combination and C1q-binding antibodies (51%, P=0.01; and 63%, P=0.001) than patients with no AR. Furthermore, the presence of C1q-binding de novo DQ DSA was associated with a 30% lower 5-year allograft survival (P=0.003). Conclusions The presence of de novo persistent, complement-binding DQ DSA negatively impacts kidney allograft outcomes. Therefore, early posttransplantation detection, monitoring, and removal of complement-binding DQ DSA might be crucial for improving long-term kidney transplantation outcomes.
Transplantation | 2002
Lauren Brasile; Bart M. Stubenitsky; Maurits H. Booster; Susanne L. Lindell; Dorian Araneda; Corinne Buck; John F. Bradfield; Carl E. Haisch; Gauke Kootstra
BACKGROUND The ability to effectively utilize kidneys damaged by severe (2 hr) warm ischemia (WI) could provide increased numbers of kidneys for transplantation. The present study was designed to examine the effect of restoring renal metabolism after severe WI insult during ex vivo warm perfusion using an acellular technology. After warm perfusion for 18 hr, kidneys were reimplanted and evaluated for graft function. METHODS Using a canine autotransplant model, kidneys were exposed to 120 min of WI. They were then either reimplanted immediately, hypothermically machine perfused (4 degrees C) for 18 hr with Belzer's solution, or transitioned to 18 hr of warm perfusion (32 degrees C) with an acellular perfusate before implantation. RESULTS Warm perfused kidneys with 120 min of WI provided life-sustaining function after transplantation, whereas the control kidneys immediately reimplanted or with hypothermic machine perfusion did not. The mean peak serum creatinine in the warm perfused kidneys was 3.7 mg/dl, with the mean peak occurring on day 2 and normalizing on day 9 posttransplant. CONCLUSIONS These results indicate that 18 hr of ex vivo warm perfusion of kidneys is feasible. Furthermore, recovery of renal function during warm perfusion is demonstrated, resulting in immediate function after transplantation. The use of ex vivo warm perfusion to recover function in severely ischemically damaged kidneys could provide the basis for increasing the number of transplantable kidneys.
American Journal of Transplantation | 2013
M. Taniguchi; Lorita M. Rebellato; Junchao Cai; J. Hopfield; Kimberly P. Briley; Carl E. Haisch; Paul G. Catrou; Paul Bolin; K. Parker; W. Kendrick; S. Kendrick; Robert C. Harland; Paul I. Terasaki
Reports have associated non‐HLA antibodies, specifically those against angiotensin II type‐1 receptor (AT1R), with antibody‐mediated kidney graft rejection. However, association of anti‐AT1R with graft failure had not been demonstrated. We tested anti‐AT1R and donor‐specific HLA antibodies (DSA) in pre‐ and posttransplant sera from 351 consecutive kidney recipients: 134 with biopsy‐proven rejection and/or lesions (abnormal biopsy group [ABG]) and 217 control group (CG) patients. The ABG's rate of anti‐AT1R was significantly higher than the CG's (18% vs. 6%, p < 0.001). Moreover, 79% of ABG patients with anti‐AT1R lost their grafts (vs. 0% in the CG), with anti‐AT1R levels increasing posttransplant in 58% of those failed grafts. With anti‐AT1R detectable before DSA, time to graft failure was 31 months—but 63 months with DSA detectable before anti‐AT1R. Patients with both anti‐AT1R and DSA had lower graft survival than those with DSA alone (log‐rank p = 0.007). Multivariate analysis showed that de novo anti‐AT1R was an independent predictor of graft failure in the ABG, alone (HR: 6.6), and in the entire population (HR: 5.4). In conclusion, this study found significant association of anti‐AT1R with graft failure. Further study is needed to establish causality between anti‐AT1R and graft failure and, thus, the importance of routine anti‐AT1R monitoring and therapeutic targeting.
Journal of Trauma-injury Infection and Critical Care | 2001
Randolph Wojcik; Mark D. Cipolle; Elizabeth Seislove; Thomas Wasser; Michael D. Pasquale; Carl E. Haisch
OBJECTIVE The objective of this study was to determine whether the preinjury condition of anticoagulation had an adverse impact on patients sustaining injury. METHODS A retrospective analysis was performed for prospectively collected registry data from 1995-2000 from all accredited trauma centers in Pennsylvania. The registry was queried for all trauma patients who had anticoagulation therapy as a preinjury condition (PIC). This group served as our experimental cohort. A control cohort (not having warfarin therapy as a PIC) was developed using case-matching techniques for age, sex, Glasgow Coma Scale (GCS), Injury Severity Score (ISS), A Severity Characterization of Trauma (ASCOT) score, and in the head injured patients, International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnoses. Head and non-head injured patients were evaluated separately. The cohorts were examined for 28-day mortality, intensive care unit length of stay (ICU-LOS), hospital length of stay (HOS-LOS), PICs, occurrences, discharge destinations, and functional status at discharge. Chi-square and Student's t tests were used to evaluate the data; p values < 0.05 were considered significant. RESULTS Two thousand nine hundred forty-two patients were available for analysis. The prevalence of PICs was significantly greater in the warfarin group for both the head and non-head injured populations (p < 0.003 and p < 0.0001, respectively). The incidence of occurrences in the non-head injured population was statistically higher for the warfarin patients (p < 0.001), but showed no difference in the head injured group regardless of warfarin use (p = 0.15). Functional status at discharge demonstrated no clinically significant difference between the warfarin and non-warfarin groups in both head and non-head injured populations.
There was no difference in discharge destination in the head injured population; however, in the non-head injured population a greater percentage of non-warfarin patients was discharged to home when compared with the warfarin patients. CONCLUSION Our data suggest that the PIC of anticoagulation with warfarin does not adversely impact mortality or LOS outcomes in both head and non-head injured patients. In non-head injured patients, however, the occurrence rates and discharge destination were different. More research needs to be done to determine whether this is related to anticoagulation or other reasons (i.e., number of PICs). These data should be used when weighing risk/benefit ratios of prescribing chronic anticoagulation.
American Journal of Transplantation | 2001
Lauren Brasile; Bart M. Stubenitsky; Maurits H. Booster; Dorian Araneda; Carl E. Haisch; Gauke Kootstra
A study was performed to determine the limiting factors to expanding the donor pool with warm ischemically (WI) damaged kidneys. Canine kidneys were damaged by 30 min of WI, and then either cold stored (CS) in ViaSpan (4 °C) for 18 h, or warm perfused with exsanguineous metabolic support (EMS) technology (32 °C) for 18 h, or subjected to combinations of both techniques. The kidneys were autotransplanted with contralateral nephrectomy. In kidneys with WI and CS alone, the mean peak serum creatinine value was 6.3 mg/dL and took 14 days to normalize. In contrast, kidneys where renal metabolism was resuscitated ex vivo during 18 h of warm perfusion demonstrated mild elevations in the serum chemistries (2.6 mg/dL). The damage in kidneys CS for 18 h was ameliorated with 3 h of subsequent warm perfusion and eliminated by 18 h of warm perfusion. In contrast, reversing the order with CS following WI and 18 h of warm perfusion resulted in a time‐dependent increase in damage. These results identify hypothermia as a major limiting factor to expanding indications for kidney donation. While hypothermia represents the foundation of preservation in the heart‐beating donor, its use in WI damaged organs appears to represent a limiting factor.
Transplantation | 1994
Kathryn M. Verbanac; F. M. Carver; Carl E. Haisch; Judith M. Thomas
We have studied the veto cell-mediated induction of transplant tolerance by allogeneic donor bone marrow cells and have achieved kidney allograft tolerance in a preclinical rhesus monkey model. Here we extend these studies to investigate the veto mechanism of CTLp suppression and the role of CD8 and TGF-beta in these events. Infusion of DR-/dim donor BMC into RATG-treated rhesus monkeys induced functional deletion of donor-specific CTLp and prolongation of kidney allograft survival, whereas depletion of the CD8+ subset from BMC ablated these effects. A role of CD8 in the veto effect was further implicated by rhesus MLR-induced CML experiments in which pretreatment of normal responder cells with MAb to MHC class I, the natural ligand of CD8, blocked the suppressive activity of allogeneic BMC. In addition, pretreatment of the BMC with anti-CD8 MAbs blocked strong veto activity significantly, suggesting that CD8 functions as an accessory or adhesion ligand. In contrast, anti-CD8 treatment significantly enhanced weak BMC-mediated veto activity, suggesting that CD8 might additionally serve as a signal transducer to increase veto activity, perhaps by the induction of cytokine release. The cytokine TGF-beta was studied because it has immunosuppressive properties that are shared by veto cells. Human TGF-beta, like BMC veto cells, inhibited MLR-induced CML in a dose-dependent manner, and anti-TGF-beta Ig relieved the BMC-mediated veto suppressive effect. Active TGF-beta was detected only in the supernatants of CML cultures containing BMC. Pretreatment of BMC with L-leucyl-L-leucine methyl ester (Leu-leu-OMe), which eliminates cytotoxic precursor and effector lymphocytes and monocytes, did not affect levels of active TGF-beta. In previous studies, the veto effect of BMC was also shown to be Leu-leu-OMe-resistant. Finally, treatment of isolated DR-/dim BMC cultures with anti-CD8 elicited TGF-beta secretion, whereas anti-CD2 or anti-CD3 had no effect.
When isolated after stimulation with anti-CD8, only the CD8+ subset of DR-/dim BMC produced detectable levels of active TGF-beta. In summary, these studies demonstrate that CD8 functions as an immunoregulatory molecule in veto effects by freshly isolated rhesus BMC and suggest that CD8-ligand interactions may induce low-level secretion of TGF-beta to mediate or facilitate the veto mechanism of CTLp inactivation in a paracrine manner.