Julie A. Hanson
University of Michigan
Publications
Featured research published by Julie A. Hanson.
Transplantation | 2000
Akinlolu Ojo; Herwig Ulf Meier-Kriesche; Julie A. Hanson; Alan B. Leichtman; Diane M. Cibrik; John C. Magee; Robert A. Wolfe; Lawrence Y. Agodoa; Bruce Kaplan
BACKGROUND Mycophenolate mofetil (MMF) has been shown to significantly decrease the number of acute rejection episodes in renal transplant recipients during the first year. A beneficial effect of MMF on long-term graft survival has been more difficult to demonstrate, despite the known impact of acute rejection on the development of chronic allograft nephropathy and experimental evidence that MMF may have a salutary effect on chronic allograft nephropathy independent of rejection. METHODS Data on 66,774 renal transplant recipients from the U.S. renal transplant scientific registry were analyzed. Patients who received a solitary renal transplant between October 1, 1988 and June 30, 1997 were studied. Cox proportional hazards regression was used to estimate relevant risk factors, and Kaplan-Meier analysis was performed for censored graft survival. RESULTS MMF decreased the relative risk for development of chronic allograft failure (CAF) by 27% (risk ratio [RR] 0.73, P<0.001). This effect was independent of its effect on acute rejection. Censored graft survival with MMF versus azathioprine was significantly improved by Kaplan-Meier analysis at 4 years (85.61% vs. 81.9%). The effect of an acute rejection episode on the risk of developing CAF appears to be increasing over time (RR=1.9, 1988-91; RR=2.9, 1992-94; RR=3.7, 1995-97). CONCLUSION MMF therapy decreases the risk of developing CAF. This improvement is only partly explained by the decrease in the incidence of acute rejection observed with MMF; it is also caused by an effect independent of acute rejection.
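The registry analyses above report Kaplan-Meier estimates of censored graft survival. As a rough illustration of how such an estimate is constructed (this is not the authors' code, and the follow-up times below are invented for illustration), here is a minimal Kaplan-Meier estimator:

```python
from collections import Counter

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times  -- follow-up time for each subject
    events -- 1 if graft failure was observed, 0 if censored
    Returns a list of (event_time, survival_probability) steps.
    """
    deaths = Counter(t for t, e in zip(times, events) if e == 1)
    exits = Counter(times)  # every subject leaves the risk set at their time
    surv, at_risk, curve = 1.0, len(times), []
    for t in sorted(set(times)):
        d = deaths.get(t, 0)
        if d:
            # multiply by conditional survival at this event time
            surv *= 1.0 - d / at_risk
            curve.append((t, surv))
        at_risk -= exits[t]  # remove failures and censored subjects
    return curve

# Hypothetical follow-up data (years), for illustration only.
times = [1, 2, 2, 3, 4, 4, 5, 5]
events = [1, 1, 0, 1, 0, 1, 0, 0]
print(kaplan_meier(times, events))
```

Censored subjects (events of 0) still count toward the risk set up to their exit time, which is what distinguishes this estimator from a naive proportion surviving.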
Transplantation | 2001
Akinlolu Ojo; Herwig Ulf Meier-Kriesche; Julie A. Hanson; Alan B. Leichtman; John C. Magee; Diane M. Cibrik; Robert A. Wolfe; Friedrich K. Port; Lawrence Y. Agodoa; Dixon B. Kaufman; Bruce Kaplan
Background. Simultaneous pancreas-kidney transplantation (SPK) ameliorates the progression of microvascular diabetic complications, but the procedure is associated with excess initial morbidity and an uncertain effect on patient survival when compared with solitary cadaveric or living donor renal transplantation. We evaluated mortality risks associated with SPK, solitary renal transplantation, and dialysis treatment in a national cohort of type 1 diabetics with end-stage nephropathy. Methods. A total of 13,467 adult type 1 diabetics enrolled on the renal and renal-pancreas transplant waiting list between 10/01/88 and 06/30/97 were followed until 06/30/98. Time-dependent mortality risks and life expectancy were calculated according to the treatment received subsequent to wait-list registration: SPK; cadaveric kidney only (CAD); living donor kidney only (LKD) transplantation; and dialysis [wait-listed, maintenance dialysis treatment (WLD)]. Results. Adjusted 10-year patient survival was 67% for SPK vs. 65% for LKD recipients (P =0.19) and 46% for CAD recipients (P <0.001). The excess initial mortality normally associated with renal transplantation and the risk of early infectious death were 2-fold higher in SPK recipients. The time to achieve an equal proportion of survivors as the WLD patients was 170, 95, and 72 days for SPK, CAD, and LKD recipients, respectively (P <0.001). However, the adjusted 5-year mortality risk (RR) using WLD as the reference and the expected remaining life years were 0.40, 0.45, and 0.75 and 23.4, 20.9, and 12.6 years for SPK, LKD, and CAD, respectively. There was no survival benefit in SPK recipients ≥50 years old (RR=1.38, P =0.81). Conclusions. Among patients with type 1 DM with end-stage nephropathy, SPK transplantation before the age of 50 years was associated with long-term improvement in survival compared with solitary cadaveric renal transplantation or dialysis.
Transplantation | 2000
Herwig Ulf Meier-Kriesche; A. O. Ojo; Julie A. Hanson; Diane M. Cibrik; Jeffrey D. Punch; Alan B. Leichtman; Bruce Kaplan
Background. Acute rejection (AR) remains a major risk factor for the development of chronic renal allograft failure (CAF), which is a major cause of late graft loss. With the introduction of several newer immunosuppressive agents (e.g., mycophenolate mofetil, tacrolimus, and Neoral), acute rejection rates have been steadily decreasing. However, the incidence of CAF has not decreased as dramatically as the incidence of acute rejection. One possible explanation is that the impact of AR on CAF is changing. The goal of this study was to analyze the relative impact of AR on the development of CAF by era. Methods. We evaluated 63,045 primary renal transplant recipients reported to the USRDS from 1988 to 1997. CAF was defined as graft loss after 6 months posttransplantation, censored for death, acute rejection, thrombosis, infection, surgical complications, or recurrent disease. A Cox proportional hazard model correcting for 15 possible confounding factors evaluated the relative impact of AR on CAF. The era effect (years 1988–1989, 1990–1991, 1992–1993, 1994–1995 and 1996–1997) was evaluated by an era versus AR interaction term. Results. An AR episode within the first 6 months after transplantation was the most important risk factor for subsequent CAF (RR=2.4, CI 2.3–2.5). Compared with the reference group (1988–89 with no rejection), having an AR episode in 1988–89, 1990–1991, 1992–1993, 1994–1995, and 1996–1997 conferred a 1.67, 2.35, 3.4, 4.98 and 5.2-fold relative risk for the subsequent development of CAF (P <0.001). Conclusions. Independently of known confounding variables, the impact of AR on CAF significantly increased from 1988 to 1997. This effect may in part explain the relative lack of improvement in long-term renal allograft survival, despite a decline in AR rates.
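The era-specific relative risks above come from an era × AR interaction term in a Cox model. A hedged sketch of the arithmetic involved: the relative risk for an AR episode in a given era is exp(main-effect coefficient + that era's interaction coefficient). The coefficients below are invented for illustration, not the fitted values from the study.

```python
import math

# Hypothetical Cox coefficients, for illustration only -- the published
# era-specific risks (1.67 ... 5.2) come from a fit to USRDS data.
BETA_AR = 0.51            # assumed main effect of an AR episode (reference era)
ERA_INTERACTION = {       # assumed AR x era interaction coefficients
    "1988-89": 0.00,
    "1990-91": 0.34,
    "1992-93": 0.71,
    "1994-95": 1.09,
    "1996-97": 1.14,
}

def era_specific_rr(era):
    """Relative risk of CAF after an AR episode in a given era:
    exp(main effect + era interaction)."""
    return math.exp(BETA_AR + ERA_INTERACTION[era])

for era in ERA_INTERACTION:
    print(f"{era}: RR = {era_specific_rr(era):.2f}")
```

Because the interaction enters additively on the log-hazard scale, a growing interaction coefficient over eras translates into a multiplicatively growing relative risk, which is the pattern the abstract reports.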
Transplantation | 2000
Herwig Ulf Meier-Kriesche; Akinlolu Ojo; Julie A. Hanson; Diane M. Cibrik; Kathleen D. Lake; Lawrence Y. Agodoa; Alan B. Leichtman; Bruce Kaplan
The results of renal transplantation have improved steadily over the last 10 years. This improvement is in large part attributable to improvements in immunosuppressive pharmacotherapy. Several phase III clinical trials have demonstrated decreased acute rejection rates with relatively little, if any, increase in infectious complications in a broad cross-section of patients (1–12). It is possible, however, that in certain subpopulations of patients, such as elderly renal transplant recipients, the risk/benefit ratio of acute rejection versus infection may be different from that of the general renal transplant population. If this were the case, the appropriate immunosuppressive regimen for that population should be reconsidered. Renal transplantation is a relatively safe option for renal replacement therapy in elderly patients with end-stage renal disease (ESRD). In 1976, Tersigni et al. (13) reported a series of nine ESRD patients above the age of 60 years successfully treated with renal transplantation. Shortly thereafter, Wedel et al. (14) published a series of 41 geriatric renal transplant recipients and pointed out that the risk to these patients was not graft loss from rejection but, rather, death with a functioning graft. These authors also reported an increased risk for serious infectious complications in the older age group. With the advent of cyclosporine and more selective immunosuppression, the results of renal transplantation improved in high-risk groups, such as elderly patients. In 1989, Pirsch et al. (15) concluded that cadaveric renal transplantation with cyclosporine immunosuppression was a safe and effective therapeutic modality in elderly ESRD patients. Subsequent studies have reinforced that concept (16) but have also reinforced the idea that elderly patients have a degree of immune incompetence (17) and require less aggressive immunosuppressive therapy (17).
That theory was supported by the continuing observation of lower rejection rates (17, 18) and a lower incidence of chronic rejection (18, 19), but a higher risk of infections (19, 20), noted in elderly transplant recipients. Reinforcement of the practice of transplantation in elderly patients came from a study that showed significantly greater survival probability in ESRD patients over the age of 60 who received transplants as opposed to matched patients who remained on dialysis (9). Recent data demonstrate very similar 5-year graft (54–74%) and patient survival (52–74%), confirming both the improvements made and the concept that most patients in this high-risk group die with functioning grafts (19). In this study, posttransplant morbidity was attributed primarily to infectious complications and an increased prevalence of malignancy (19). We demonstrated previously, in a single-center study, that intensification of immunosuppressive therapy in elderly transplant recipients increased infectious complications without decreasing the incidence of acute rejection or improving graft survival (21). Given the above data, it is possible that older renal transplant patients' vulnerability to immunosuppression may be very different from that of younger patients. To test this hypothesis, we analyzed a large group of renal transplant recipients with regard to their balance of acute rejection versus death due to infection.
Transplantation | 2000
Herwig Ulf Meier-Kriesche; Akinlolu Ojo; Diane M. Cibrik; Julie A. Hanson; Alan B. Leichtman; John C. Magee; Friedrich K. Port; Bruce Kaplan
BACKGROUND The elderly are the fastest growing segment of the end stage renal disease (ESRD) population. Older renal transplant recipients experience fewer acute rejection episodes than do younger patients. Despite this, death censored graft survival is no better in these older transplant recipients than in younger recipients. We examined the United States Renal Data System (USRDS) database to determine whether recipient age itself has an independent effect on the development of chronic allograft failure (CAF). METHODS We analyzed 59,509 patients from the files of the USRDS. To determine whether age was an independent risk factor for CAF, the population was analyzed separately for Caucasians, African-Americans, and other ethnic groups. All renal transplant recipients from 1988 to 1997 were examined. Both univariate and multivariate analyses were performed using chronic allograft failure as the outcome of interest. RESULTS Actuarial 8-year censored graft survival was significantly decreased in the older age groups: 67% for ages 18-49 vs. 61.8% for ages 50-64 vs. 50.7% for ages 65+ (P<0.001). In the multivariate analysis, recipient age was a strong and independent risk factor for the development of chronic allograft failure in Caucasians (RR 1.29 for ages 50-64, RR 1.67 for ages older than 65). These findings were reinforced by an analysis that was restricted to living donor transplants without acute rejection. CONCLUSION In Caucasians, increased recipient age is an independent risk factor for the development of chronic renal allograft failure.
Transplantation | 2000
A. O. Ojo; Herwig Ulf Meier-Kriesche; Gary Friedman; Julie A. Hanson; Diane M. Cibrik; Alan B. Leichtman; Bruce Kaplan
INTRODUCTION Fabry's disease is an X-linked error of glycosphingolipid metabolism. Clinical manifestations of the disease are secondary to accumulation of glycosphingolipids in various tissues. Renal failure and vascular complications are common. There are conflicting reports regarding the outcomes of patients with Fabry's disease after renal transplantation. METHODS We reviewed the United States Renal Data System Registry database from 1988 to 1998 and found 93 patients with Fabry's disease who had received a renal transplant. Case-matched patients were identified to serve as controls. RESULTS Patients with Fabry's disease demonstrated equivalent 5-year patient and graft survival compared with controls (83% and 75%, respectively, for those with Fabry's disease vs. 82% and 67% for controls). CONCLUSION Despite their high risk for cardiovascular complications, patients with Fabry's disease have excellent outcomes after renal transplantation.
Transplantation | 2001
Herwig Ulf Meier-Kriesche; Akinlolu Ojo; Julie A. Hanson; Bruce Kaplan
BACKGROUND Hepatitis occurs frequently in patients with end-stage renal disease. In 1997, 0.7% of patients receiving a renal transplant were positive for hepatitis C antibodies. Concern has been raised as to whether these patients are at an increased mortality risk after renal transplantation compared with patients who are hepatitis C antibody negative. To help answer this question, we analyzed data from the United States Renal Data System from October of 1988 through June of 1998. METHODS Primary study endpoints were patient death and death censored graft loss. Secondary study endpoints included cardiovascular, infectious, malignant, and infection-related death. Kaplan-Meier survival estimates as well as Cox proportional hazard models were used to evaluate the impact of hepatitis C antibody status on the study endpoints. RESULTS A total of 73,707 patients were analyzed. Patient survival by Kaplan-Meier analysis was higher in hepatitis C-positive patients, whereas death censored graft survival trended lower in the very long term. By the Cox model, hepatitis C-positive adjusted patient survival was slightly superior to that of hepatitis C-negative patients. CONCLUSIONS Renal transplant recipients who are hepatitis C antibody positive do not have an increased risk of death after transplantation compared with hepatitis C-negative recipients. The current policy of transplanting hepatitis C-positive patients without active liver disease seems to incur no excess mortality risk.
Journal of the American Geriatrics Society | 2002
Herwig Ulf Meier-Kriesche; Diane M. Cibrik; Akinlolu Ojo; Julie A. Hanson; John C. Magee; Steven M. Rudich; Allan B. Leichtman; Bruce Kaplan
OBJECTIVES Donor age is a known risk factor for chronic allograft failure (CAF) in renal transplant recipients. We have recently shown that advanced recipient age is also a risk factor for CAF. To investigate the interaction between donor and recipient age, we analyzed 40,289 primary solitary Caucasian adult renal transplants registered at the United States Renal Data System (USRDS) from 1988 to 1997. DESIGN CAF was defined as allograft loss beyond 6 months posttransplantation, censored for death, recurrent disease, acute rejection, thrombosis, noncompliance, infection, or technical problems. Cox proportional hazards models were used to investigate the risk of allograft loss secondary to CAF. All models were corrected for 15 covariates including donor and recipient demographics, ischemic time, and human leukocyte antigen match. Donor and recipient age were categorized, and the relative risk for allograft loss of the interaction between the obtained categorical covariates was evaluated. SETTING Retrospective data analysis using the USRDS. PARTICIPANTS All primary Caucasian renal transplant recipients from 1988 to 1997. RESULTS Recipients of kidneys from donors aged 55 and older had a 110% increased risk of CAF (relative risk (RR) = 2.1, 95% confidence interval (CI) = 1.9-2.3, P< .001), and recipients aged 65 and older had a 90% increased risk for CAF (RR = 1.9, 95% CI = 1.61-2.1, P< .001), compared with the youngest reference groups. In addition, there was an additive and, in the long term, synergistic interaction between donor and recipient age in determining allograft loss. CONCLUSIONS Donor and recipient age had independent, equivalently detrimental effects on renal allograft survival. An overall additive and, in the long term (beyond 36 months posttransplant), synergistic deleterious effect on renal allograft survival was observed for the interaction of donor and recipient age.
Transplantation | 2001
Herwig Ulf Meier-Kriesche; A. O. Ojo; Sean F. Leavey; Julie A. Hanson; Alan B. Leichtman; John C. Magee; Diane M. Cibrik; Bruce Kaplan
Background. Despite the known differences in immunological reactivity between males and females, no differences in graft survival have been described among renal transplant recipients with regard to gender. To address this paradox, we analyzed data from 73,477 primary renal transplants collected in the US Renal Data System database. Methods. Logistic regression and Cox proportional hazard models were used to investigate the primary study end points, graft loss secondary to acute rejection (AR) or chronic allograft failure (CAF). CAF was defined as graft loss beyond 6 months, not attributable to death, recurrent disease, acute rejection, thrombosis, infection, noncompliance, or technical problems. The models adjusted for 15 covariates including immunosuppressive regimen, and donor and recipient characteristics. Results. The overall 8-year graft and patient survivals were significantly better in female renal transplant recipients compared with male recipients. However, graft survival censored for death was not significantly different by gender. By multivariate analysis, females had a 10% increased odds of AR (OR=1.10, CI 1.02–1.12) but, conversely, a 10% lower risk of graft loss secondary to CAF (RR=0.9, CI 0.85–0.96). The risk for CAF increased significantly with increasing age for both males and females, but this effect was greater for males than for females (P <0.001). Conclusion. Although female renal transplant recipients have a similar death censored graft survival compared with males, there are important differences in immunological behavior. Females have a higher risk of AR while having a decreased risk of graft loss secondary to CAF.
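The abstract reports an odds ratio for AR (from logistic regression) and a relative risk for CAF (from the Cox model). For readers unfamiliar with the distinction, a minimal sketch of both measures computed from a 2×2 table; the counts below are invented for illustration and are not the study's data:

```python
def odds_ratio(a, b, c, d):
    """OR from a 2x2 table: a=exposed events, b=exposed non-events,
    c=unexposed events, d=unexposed non-events."""
    return (a / b) / (c / d)

def risk_ratio(a, b, c, d):
    """RR from the same table: event risk in exposed vs. unexposed."""
    return (a / (a + b)) / (c / (c + d))

# Hypothetical counts: AR episodes by recipient sex, for illustration only.
a, b = 330, 2670   # female recipients: AR, no AR
c, d = 300, 2700   # male recipients:   AR, no AR
print(round(odds_ratio(a, b, c, d), 2))   # -> 1.11
print(round(risk_ratio(a, b, c, d), 2))   # -> 1.1
```

When the event is uncommon, as here, the two measures are close; they diverge as the event becomes frequent, which is why logistic-regression odds ratios should not be read directly as relative risks.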
Infection and Immunity | 2000
Daniel L. Clemans; Richard J. Bauer; Julie A. Hanson; Monte V. Hobbs; Joseph W. St. Geme; Carl F. Marrs; Janet R. Gilsdorf
ABSTRACT Nontypeable Haemophilus influenzae (NTHi) causes repeated respiratory infections in patients with chronic lung diseases. These infections are characterized by a brisk inflammatory response which results in the accumulation of polymorphonuclear cells in the lungs and is dependent on the expression and secretion of proinflammatory cytokines. We hypothesize that multiple NTHi molecules, including lipooligosaccharide (LOS), mediate cellular interactions with respiratory epithelial cells, leading to the production of proinflammatory cytokines. To address this hypothesis, we exposed 9HTEo− human tracheal epithelial cells to NTHi and compared the resulting profiles of cytokine gene expression and secretion using multiprobe RNase protection assays and enzyme-linked immunosorbent assays (ELISA), respectively. Dose-response experiments demonstrated a maximum stimulation of most cytokines tested, using a ratio of 100 NTHi bacterial cells to 1 9HTEo− tracheal epithelial cell. Compared with purified LOS, NTHi bacterial cells stimulated 3.6- and 4.5-fold increases in epithelial cell expression of interleukin-8 (IL-8) and IL-6 genes, respectively. Similar results were seen with epithelial cell macrophage chemotactic protein 1, IL-1α, IL-1β, and tumor necrosis factor alpha expression. Polymyxin B completely inhibited LOS stimulation but only partially reduced NTHi whole cell stimulation. Taken together, these results suggest that multiple bacterial molecules including LOS contribute to the NTHi stimulation of respiratory epithelial cell cytokine production. Moreover, no correlation was seen between NTHi adherence to epithelial cells mediated by hemagglutinating pili, Hia, HMW1, HMW2, and Hap and epithelial cytokine secretion. These data suggest that bacterial molecules beyond previously described NTHi cell surface adhesins and LOS play a role in the induction of proinflammatory cytokines from respiratory epithelial cells.