Publications


Featured research published by Herwig-Ulf Meier-Kriesche.


American Journal of Transplantation | 2004

Lack of improvement in renal allograft survival despite a marked decrease in acute rejection rates over the most recent era.

Herwig-Ulf Meier-Kriesche; Jesse D. Schold; Titte R. Srinivas; Bruce Kaplan

Acute rejection is known to have a strong impact on graft survival. Many studies suggest that very low acute rejection rates can be achieved with current immunosuppressive protocols. We wanted to investigate how acute rejection rates have evolved on a national level in the U.S. and how this has impacted graft survival in the most recent era of kidney transplantation. For this purpose, we analyzed data provided by the Scientific Registry of Transplant Recipients regarding all adult first renal transplants between 1995 and 2000.


American Journal of Transplantation | 2004

Long‐Term Renal Allograft Survival: Have we Made Significant Progress or is it Time to Rethink our Analytic and Therapeutic Strategies?

Herwig-Ulf Meier-Kriesche; Jesse D. Schold; Bruce Kaplan

Impressive renal allograft survival improvement between 1988 and 1995 has been described using projections of half‐lives based on limited actual follow‐up. Now, with sufficient follow‐up available, we aimed to calculate real half‐lives.


American Journal of Transplantation | 2005

Immunosuppression: Evolution in practice and trends, 1993-2003

Herwig-Ulf Meier-Kriesche; S. Li; Rainer W. G. Gruessner; John J. Fung; R. T. Bustami; Mark L. Barr; Alan B. Leichtman

Over the last 10 years, there have been important changes in immunosuppression management and strategies for solid‐organ transplantation, characterized by the use of new immunosuppressive agents and regimens. An organ‐by‐organ review of OPTN/SRTR data showed several important trends in immunosuppression practice. There is an increasing trend toward the use of induction therapy with antibodies, which was used for most kidney, pancreas after kidney (PAK), simultaneous pancreas‐kidney (SPK) and pancreas transplant alone (PTA) recipients in 2004 (72–81%) and for approximately half of all intestine, heart and lung recipients. The highest usage of the tacrolimus/mycophenolate mofetil combination as discharge regimen was reported for SPK (72%) and PAK (64%) recipients. Maintenance of the original discharge regimen through the first 3 years following transplantation varied significantly by organ and drug. The usage of calcineurin inhibitors for maintenance therapy was characterized by a clear transition from cyclosporine to tacrolimus. Corticosteroids were administered to the majority of patients; however, steroid‐avoidance and steroid‐withdrawal protocols have become increasingly common. The percentage of patients treated for acute rejection during the first year following transplantation has continued to decline, reaching 13% for those who received a kidney in 2003, 48% of which cases were treated with antibodies.


American Journal of Transplantation | 2011

Long-Term Renal Allograft Survival in the United States: A Critical Reappraisal

Kenneth E. Lamb; Sundus A. Lodhi; Herwig-Ulf Meier-Kriesche

Renal allograft survival has increased tremendously over past decades; this has been mostly attributed to improvements in first‐year survival. This report describes the evolution of renal allograft survival in the United States where a total of 252 910 patients received a single‐organ kidney transplant between 1989 and 2009. Half‐lives were obtained from the Kaplan–Meier and Cox models. Graft half‐life for deceased‐donor transplants was 6.6 years in 1989, increased to 8 years in 1995, then after the year 2000 further increased to 8.8 years by 2005. More significant improvements were made in higher risk transplants like ECD recipients where the half‐lives increased from 3 years in 1989 to 6.4 years in 2005. In low‐risk populations like living‐donor‐recipients half‐life did not change with 11.4 years in 1989 and 11.9 years in 2005. First‐year attrition rates show dramatic improvements across all subgroups; however, attrition rates beyond the first year show only small improvements and are somewhat more evident in black recipients. The significant progress that has occurred over the last two decades in renal transplantation is mostly driven by improvements in short‐term graft survival but long‐term attrition is slowly improving and could lead to bigger advances in the future.
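The graft half‐lives quoted above are the times at which the estimated survival curve crosses 50%. A minimal sketch of that calculation, using linear interpolation between yearly survival estimates; the curve below is illustrative only, not actual SRTR data:

```python
def half_life(times, survival):
    """Return the time at which survival first drops to 0.5,
    linearly interpolating between adjacent estimates.
    Returns None if the curve never reaches 50% within follow-up."""
    points = list(zip(times, survival))
    for (t0, s0), (t1, s1) in zip(points, points[1:]):
        if s0 >= 0.5 > s1:
            # interpolate between the two bracketing points
            return t0 + (s0 - 0.5) * (t1 - t0) / (s0 - s1)
    return None

# Illustrative survival estimates at two-year intervals (not real data)
years = [0, 2, 4, 6, 8, 10]
surv = [1.0, 0.88, 0.74, 0.60, 0.47, 0.36]
print(half_life(years, surv))  # crosses 50% between years 6 and 8
```

When follow‐up is too short for the curve to reach 50%, half‐lives must be projected from early attrition rates, which is exactly the limitation the paper's "critical reappraisal" addresses by waiting for real follow‐up.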


Transplantation | 2002

The impact of body mass index on renal transplant outcomes: a significant independent risk factor for graft failure and patient death.

Herwig-Ulf Meier-Kriesche; J.A Arndorfer; Bruce Kaplan

INTRODUCTION Renal transplant recipients with elevated body mass index (BMI) have been shown to have inferior patient survival as compared to patients with lower BMI. However, previous studies could not establish a link between increased BMI and decreased death censored graft survival. Obesity in nontransplant patients has been associated with hypertension, hyperlipidemia, type II diabetes, proteinuria and glomerulopathy. Given this evidence, it is possible that renal transplant recipients with an elevated BMI may have worse long-term graft survival. To investigate this hypothesis, we retrospectively analyzed 51,927 primary, adult renal transplants registered in the USRDS. METHODS BMI at date of transplant was calculated for all patients as BMI = body weight (in kg) / stature (height, in meters) squared. BMI values were further categorized into 11 categories: below 18, from 18 to 36 at 2-unit increments, and above 36 kg/m2. Primary study end points were graft and patient survival. Secondary study end points were death censored graft survival, chronic allograft failure, delayed graft function, and acute rejection (AR). Cox proportional hazard and logistic regression models investigated the link between categorized BMI and the study end points correcting for potential confounding variables. RESULTS BMI showed a very strong association with outcomes after renal transplantation. The extremes of very high and very low BMI were associated with significantly worse patient and graft survival. The same was true for death censored graft survival and chronic allograft failure. Elevated BMI was also associated with an increased risk for delayed graft function while lower BMI was significantly protective. Acute rejection did not show any significant association with BMI. CONCLUSIONS BMI has a very strong association with outcomes after renal transplantation independent of most of the known risk factors for patient and graft survival. The extremes of very high and very low BMI before renal transplantation are important risk factors for patient and graft survival. It is important to note that elevated BMI was significantly associated with worse graft survival independent of patient survival. Whether prospective weight adjustment before renal transplantation can favorably affect posttransplant risk needs to be assessed by further studies.
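The BMI formula and the 11-category scheme from the methods section can be sketched as follows; the category boundaries (below 18, 2-unit bins from 18 to 36, above 36 kg/m2) come from the abstract, while the function names and the assignment of the exact boundary values to bins are illustrative assumptions:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """BMI = body weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def bmi_category(value: float) -> str:
    """Assign one of the 11 categories used in the study:
    <18, nine 2-unit bins from 18 to 36, then >36 kg/m2.
    Boundary handling (e.g. exactly 36) is an assumption."""
    if value < 18:
        return "<18"
    if value >= 36:
        return ">36"
    lower = 18 + 2 * int((value - 18) // 2)  # bin floor: 18, 20, ..., 34
    return f"{lower}-{lower + 2}"

print(bmi(80, 1.75))                 # roughly 26.1 kg/m2
print(bmi_category(bmi(80, 1.75)))   # falls in the 26-28 bin
```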


Transplantation | 1998

Immunosuppressive effects and safety of a sirolimus/cyclosporine combination regimen for renal transplantation

Barry D. Kahan; Jeanette M. Podbielski; Kimberly L. Napoli; Stephen M. Katz; Herwig-Ulf Meier-Kriesche; Charles T. Van Buren

Background. Sirolimus, a novel immunosuppressant that inhibits cytokine-driven cell proliferation and maturation, prolongs allograft survival in animal models. After a phase I trial in stable renal transplant recipients documented that cyclosporine and sirolimus have few overlapping toxicities, we conducted an open-label, single-center, phase I/II dose-escalation trial to examine the safety and efficacy of this drug combination. Methods. Forty mismatched living-donor renal transplant recipients were sequentially assigned to receive escalating initial doses of sirolimus (0.5–7.0 mg/m2/day), in addition to courses of prednisone and a concentration-controlled regimen of cyclosporine. We conducted surveillance for drug-induced side effects among sirolimus-treated patients and compared their incidence of acute rejection episodes as well as mean laboratory values with those of a historical cohort of 65 consecutive, immediately precedent, demographically similar recipients treated with the same concentration-controlled regimen of cyclosporine and tapering doses of prednisone. Results. The addition of sirolimus reduced the overall incidence of acute allograft rejection episodes to 7.5% from 32% in the immediately precedent cyclosporine/prednisone-treated patients. At 18- to 47-month follow-up periods, both treatment groups displayed similar rates of patient and graft survival, as well as morbid complications. Although sirolimus-treated patients displayed comparatively lower platelet and white blood cell counts and higher levels of serum cholesterol and triglycerides, sirolimus did not augment the nephrotoxic or hypertensive proclivities of cyclosporine. The degree of change in the laboratory values was more directly associated with whole blood trough drug concentrations than with doses of sirolimus. Conclusions. Sirolimus potentiates the immunosuppressive effects of a cyclosporine-based regimen by reducing the rate of acute rejection episodes.


Transplantation | 2003

Decreased renal function is a strong risk factor for cardiovascular death after renal transplantation

Herwig-Ulf Meier-Kriesche; Rajendra Baliga; Bruce Kaplan

Background. Chronic kidney disease is thought to be a potential risk factor for cardiovascular death. In renal-allograft recipients, cardiovascular disease is the most significant cause of death. The purpose of this study was to investigate if renal function has a significant role in determining the risk for cardiovascular death in renal-allograft recipients. Methods. We analyzed 58,900 adult patients registered in the United States Renal Data System who received a primary renal transplant between 1988 and 1998 and who had at least 1 year of graft survival. The primary study endpoint was death from a cardiovascular event beyond 1 year of transplantation. Secondary endpoints were death caused by infections and malignancy-related deaths. Cox proportional-hazard models were used to estimate the effect of renal function on cardiovascular death, infectious death, and malignancy-related death while correcting for potential confounding variables, such as donor and recipient age, gender, race, cause of end-stage renal disease, length of dialysis before transplantation, year of transplantation, donor source and age, delayed graft function, and immunosuppressive regimen. Results. Serum creatinine values at 1 year after transplantation were strongly associated with the risk for cardiovascular death. Above a serum creatinine value of 1.5 mg/dL, there was a significant and progressive increase in the risk for cardiovascular death. The risk of cardiovascular death was significantly higher when patients who lost allograft function were included in the analysis. There was an association between worsening renal function and infectious death, but there was no association between renal function and malignancy-related death. Conclusion. Serum creatinine at 1 year is strongly associated with the incidence of cardiovascular death independent of known risk factors.


American Journal of Transplantation | 2004

Kidney Transplantation Halts Cardiovascular Disease Progression in Patients with End-Stage Renal Disease

Herwig-Ulf Meier-Kriesche; Jesse D. Schold; Titte R. Srinivas; Alan I. Reed; Bruce Kaplan

Morbidity and mortality from cardiovascular disease have a devastating impact on patients with chronic kidney disease (CKD) and end‐stage renal disease. Renal function decline in itself is thought to be a strong risk factor for cardiovascular disease (CVD). In this study, we investigated the hypothesis that the elevated CV mortality in kidney transplant patients is due to the preexisting CVD burden and that restoring renal function by a kidney transplant might over time lower the risk for CVD. We analyzed 60 141 first‐kidney‐transplant patients registered in the USRDS from 1995 to 2000 for the primary endpoint of cardiac death by transplant vintage and compared these rates to all 66 813 adult kidney wait listed patients by wait listing vintage, covering the same time period. The CVD rates peaked during the first 3 months following transplantation and decreased subsequently by transplant vintage when censoring for transplant loss. This trend could be shown in living and deceased donor transplants and even in patients with end‐stage renal disease secondary to diabetes. In contrast, the CVD rates on the transplant waiting list increased sharply and progressively by wait listing vintage. Despite the many mechanisms that may be in play, the enduring theme underlying rapid progression of atherosclerosis and cardiovascular disease in renal failure is the loss of renal function. The data presented in this paper thus suggest that the development or progression of these lesions could be ameliorated by restoring renal function with a transplant.


American Journal of Transplantation | 2011

Solid Organ Allograft Survival Improvement in the United States: The Long‐Term Does Not Mirror the Dramatic Short‐Term Success

Sundus A. Lodhi; Kenneth E. Lamb; Herwig-Ulf Meier-Kriesche

Organ survival in the short‐term period post‐transplant has improved dramatically over the past few decades. Whether this has translated to a long‐term survival benefit remains unclear. This study quantifies the progression of nonrenal solid organ transplant outcomes from 1989 to 2009 in liver, lung, heart, intestine and pancreas transplants. Long‐term graft survival was analyzed using data on adult solid organ transplant recipients from the UNOS/SRTR database and is reported as organ half‐life and yearly attrition rates. Liver, lung, heart, intestine and pancreas half‐lives have improved from 5.8 to 8.5, 1.7 to 5.2, 8.8 to 11, 2.1 to 3.6 and 10.5 to 16.7 years, respectively. Long‐term attrition rates have not shown the same consistent improvement, with the yearly attrition rate 5–10 years post‐transplant for liver, lung, heart and pancreas changing from 4.7 to 4.3, 10.9 to 10.1, 6.4 to 5.1 and 3.3 to 4.2, respectively. Attrition rates for intestine and pancreas transplantation alone display more variability due to smaller sample size but exhibit similar trends of improved first‐year attrition and relatively stagnant long‐term attrition rates. With first‐year survival and attrition rates almost at a pinnacle, further progress in long‐term survival will come from targeting endpoints beyond first‐year rejection and survival rates.


The New England Journal of Medicine | 2016

Belatacept and Long-Term Outcomes in Kidney Transplantation

Flavio Vincenti; Lionel Rostaing; Joseph Grinyo; Kim Rice; Steven Steinberg; Luis Gaite; Marie-Christine Moal; Guillermo A. Mondragon-Ramirez; Jatin Kothari; Martin S. Polinsky; Herwig-Ulf Meier-Kriesche; Stephane Munier; Christian P. Larsen

BACKGROUND In previous analyses of BENEFIT, a phase 3 study, belatacept-based immunosuppression, as compared with cyclosporine-based immunosuppression, was associated with similar patient and graft survival and significantly improved renal function in kidney-transplant recipients. Here we present the final results from this study. METHODS We randomly assigned kidney-transplant recipients to a more-intensive belatacept regimen, a less-intensive belatacept regimen, or a cyclosporine regimen. Efficacy and safety outcomes for all patients who underwent randomization and transplantation were analyzed at year 7 (month 84). RESULTS A total of 666 participants were randomly assigned to a study group and underwent transplantation. Of the 660 patients who were treated, 153 of the 219 patients treated with the more-intensive belatacept regimen, 163 of the 226 treated with the less-intensive belatacept regimen, and 131 of the 215 treated with the cyclosporine regimen were followed for the full 84-month period; all available data were used in the analysis. A 43% reduction in the risk of death or graft loss was observed for both the more-intensive and the less-intensive belatacept regimens as compared with the cyclosporine regimen (hazard ratio with the more-intensive regimen, 0.57; 95% confidence interval [CI], 0.35 to 0.95; P=0.02; hazard ratio with the less-intensive regimen, 0.57; 95% CI, 0.35 to 0.94; P=0.02), with equal contributions from the lower rates of death and graft loss. The mean estimated glomerular filtration rate (eGFR) increased over the 7-year period with both belatacept regimens but declined with the cyclosporine regimen. The cumulative frequencies of serious adverse events at month 84 were similar across treatment groups. CONCLUSIONS Seven years after transplantation, patient and graft survival and the mean eGFR were significantly higher with belatacept (both the more-intensive regimen and the less-intensive regimen) than with cyclosporine. 
(Funded by Bristol-Myers Squibb; ClinicalTrials.gov number, NCT00256750.).

Collaboration


Dive into Herwig-Ulf Meier-Kriesche's collaborations.

Top Co-Authors

Titte R. Srinivas

Medical University of South Carolina


Yves Vanrenterghem

Katholieke Universiteit Leuven
