Joke I. Roodnat
Erasmus University Rotterdam
Publications
Featured research published by Joke I. Roodnat.
Transplantation | 2001
Joke I. Roodnat; Paul G.H. Mulder; Jacqueline Rischen-Vos; I. C. van Riemsdijk; T. van Gelder; Robert Zietse; Jan N. M. IJzermans; W. Weimar
Background. Proteinuria is associated with an increased risk of renal failure. Moreover, proteinuria is associated with an increased death risk in patients with diabetes mellitus or hypertension and even in the general population. Methods. One year after renal transplantation, we studied the influence of the presence of proteinuria on the risk of either graft failure or death in all 722 recipients of a kidney graft in our center who survived at least 1 year with a functioning graft. Proteinuria was analyzed both as a categorical variable (presence versus absence) and as a continuous variable (quantification of 24 hr urine). Other variables included in this analysis were: donor/recipient age and gender, original disease, race, number of HLA-A and HLA-B mismatches, previous transplants, postmortal or living related transplantation, and transplantation year. At 1 year after transplantation, we included: proteinuria, serum cholesterol, serum creatinine, blood pressure, and the use of antihypertensive medication. Results. In the Cox proportional hazards analysis, proteinuria at 1 year after transplantation (both as a categorical and continuous variable) was an important and independent variable influencing all endpoints. The influence of proteinuria as a categorical variable on graft failure censored for death showed no interaction with any of the other variables. There was an adverse effect of the presence of proteinuria on the graft failure rate (RR=2.03). The influence of proteinuria as a continuous variable showed interaction with original disease. The presence of glomerulonephritis, hypertension, and systemic diseases as the original disease significantly increased the risk of graft failure with an increasing amount of proteinuria at 1 year. The influence of proteinuria as a categorical variable on the rate ratio for patient failure was significant, and there was no interaction with any of the other significant variables (RR=1.98). 
The death risk was almost twice as high for patients with proteinuria at 1 year compared with patients without proteinuria. The influence of proteinuria as a continuous variable was also significant and also without interaction with other variables. The death risk increased with increasing amounts of proteinuria at 1 year. Both the risks for cardiovascular and for noncardiovascular death were increased. Conclusion. Proteinuria after renal transplantation increases both the risk for graft failure and the risk for death.
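The rate ratios above come from a Cox proportional hazards model, in which a coefficient translates into a multiplicative effect on the hazard. As a minimal illustration (only the categorical RR = 1.98 comes from the abstract; the per-gram coefficient below is a hypothetical value for demonstration):

```python
import math

def rate_ratio(beta, x):
    """Cox-model rate ratio for covariate value x relative to x = 0:
    RR = exp(beta * x). For a 0/1 categorical covariate, x is 0 or 1."""
    return math.exp(beta * x)

# Categorical proteinuria (present vs. absent): the abstract reports
# RR = 1.98, i.e. beta = ln(1.98).
beta_cat = math.log(1.98)
print(rate_ratio(beta_cat, 1))  # death risk almost doubled

# Continuous proteinuria: the risk multiplies with each additional gram
# per 24 h. The coefficient here is illustrative, not from the study.
beta_per_gram = 0.15
print(rate_ratio(beta_per_gram, 3.0))  # risk multiplier at 3 g/24 h
```

This also shows why the categorical and continuous analyses in the abstract can both be significant yet interact differently with other covariates: they encode the same exposure on different scales.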
American Journal of Transplantation | 2004
Marika A. Artz; Johannes M. M. Boots; Gerry Ligtenberg; Joke I. Roodnat; Maarten H. L. Christiaans; Pieter F. Vos; Philip Moons; George F. Borm; Luuk B. Hilbrands
Long‐term use of cyclosporine after renal transplantation results in nephrotoxicity and an increased cardiovascular risk profile. Tacrolimus may be more favorable in this respect. In this randomized controlled study in 124 renal transplant patients, the effects of conversion from cyclosporine to tacrolimus on renal function, cardiovascular risk factors, and perceived side‐effects were investigated after a follow‐up of 2 years. After conversion from cyclosporine to tacrolimus renal function remained stable, whereas continuation of cyclosporine was accompanied by a rise in serum creatinine from 142 ± 48 μmol/L to 157 ± 62 μmol/L (p < 0.05 comparing both groups). Conversion to tacrolimus resulted in a sustained reduction in systolic and diastolic blood pressure, and a sustained improvement in the serum lipid profile, leading to a reduction in the Framingham risk score from 5.7 ± 4.3 to 4.8 ± 5.3 (p < 0.05). Finally, conversion to tacrolimus resulted in decreased scores for occurrence of and distress due to side‐effects. In conclusion, conversion from cyclosporine to tacrolimus in stable renal transplant patients is beneficial with respect to renal function, cardiovascular risk profile, and side‐effects. Therefore, for most renal transplant patients tacrolimus will be the drug of choice when long‐term treatment with a calcineurin inhibitor is indicated.
Journal of The American Society of Nephrology | 2003
Marika A. Artz; Johannes M. M. Boots; Gerry Ligtenberg; Joke I. Roodnat; Maarten H. L. Christiaans; Pieter F. Vos; Henk J. Blom; Fred Sweep; P.N.M. Demacker; Luuk B. Hilbrands
Cyclosporine is considered to contribute to the high cardiovascular morbidity and mortality in patients after renal transplantation. Tacrolimus may be more favorable in this respect, but controlled data are scarce. In this prospective randomized study in 124 stable renal transplant patients, the effects of conversion from cyclosporine to tacrolimus on cardiovascular risk factors and renal function were investigated. Follow-up was 6 mo. Statistical analysis was performed by ANOVA for repeated measurements. The serum creatinine level decreased from 137 +/- 30 micromol/L to 131 +/- 29 micromol/L (P < 0.01). Three months after conversion from cyclosporine to tacrolimus, mean BP significantly decreased from 104 +/- 13 to 99 +/- 12 mmHg (P < 0.001). Serum LDL cholesterol decreased from 3.48 +/- 0.80 to 3.11 +/- 0.74 mmol/L (P < 0.001), and serum apolipoprotein B decreased from 1018 +/- 189 to 935 +/- 174 mg/L (P < 0.001). Serum triglycerides decreased from 2.11 +/- 1.12 to 1.72 +/- 0.94 mmol/L (P < 0.001). In addition, both rate and extent of LDL oxidation were reduced. The fibrinogen level decreased from 3638 +/- 857 to 3417 +/- 751 mg/L (P < 0.05). Plasma homocysteine concentration did not change. Three months after conversion, plasma fasting glucose level temporarily increased from 5.4 +/- 1.3 mmol/L to 5.8 +/- 1.9 mmol/L (P < 0.05). Conversion to tacrolimus resulted in a significant reduction of the Framingham risk score. In conclusion, conversion from cyclosporine to tacrolimus in stable renal transplant patients has a beneficial effect on renal function, BP, serum concentration and atherogenic properties of serum lipids, and fibrinogen.
Transplantation | 1999
Joke I. Roodnat; Robert Zietse; Paul G.H. Mulder; Jacqueline Rischen-Vos; T. van Gelder; Jan N. M. IJzermans; W. Weimar
BACKGROUND The growing number of patients awaiting a kidney transplant raises questions about allocation of kidneys to the elderly and about the use of elderly donors. In all reported studies analyzing the influence of age on the outcome after renal transplantation, age is investigated as a categorical variable. METHODS We studied age both as a categorical (Kaplan-Meier) and as a continuous (Cox) variable in a total of 509 cyclosporine-treated recipients of a primary cadaveric kidney graft who underwent transplantation between July 1983 and July 1997. For the Kaplan-Meier analysis, the population was divided into three comparably sized age groups: 17-43 years (n=171), 44-55 years (n=169), and 56-75 years (n=169). RESULTS Patient survival was better and graft survival censored for death was worse in the younger patients. Overall graft survival (end point was death or graft failure) was not significantly influenced by age. In the Cox proportional hazards analysis, transplantation year turned out to be an important, independent variable influencing all end points. Because the influence was not linear, three periods were defined in which the relative risk remained stable: 1983-1990, 1991-1993, and 1994-1997. In the second period, the relative risk for transplant failure or death was 49% of that in the first period. In the third period, the relative risk had decreased to 22% of that in the first period. Recipient age and donor age were significant predictors of overall transplant failure. There was no interaction between these variables and transplantation year. Within each transplantation period, an increase in recipient age by 1 year increased the relative risk for overall graft failure by only 1.44%. The influence of donor age followed a J-shaped curve with a minimum at 30 years. The influence of increasing either recipient or donor age was counteracted by the improving results over time. 
CONCLUSION Considering the improving results over time, there are, at this moment, no arguments for an age restriction for kidney transplant recipients or donors.
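The abstract above contrasts Kaplan-Meier analysis (age as a categorical variable) with Cox regression (age as a continuous variable). As a rough illustration of the former, here is a minimal product-limit estimator in plain Python; the data are a toy cohort, not the study's dataset:

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    times: follow-up time per patient; events: 1 = endpoint observed
    (graft failure/death), 0 = censored. Returns [(time, S(time)), ...]."""
    event_times = sorted({t for t, e in zip(times, events) if e == 1})
    surv, curve = 1.0, []
    for t in event_times:
        # patients still under observation just before time t
        at_risk = sum(1 for ti in times if ti >= t)
        # events occurring exactly at time t
        deaths = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        surv *= 1.0 - deaths / at_risk  # conditional survival at t
        curve.append((t, surv))
    return curve

# Toy cohort: events at t = 1, 2, 4; one patient censored at t = 3.
# Survival drops to 0.75 after t=1, ~0.5 after t=2, and 0 after t=4.
print(kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1]))
```

Censored patients (event = 0) contribute to the at-risk counts until their censoring time but never trigger a drop in the curve, which is exactly the property that makes the method suitable for transplant follow-up data.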
Transplant International | 2006
Jeroen Aalten; Maarten H. L. Christiaans; Hans de Fijter; Ronald J. Hené; Jaap Homan van der Heijde; Joke I. Roodnat; Janto Surachno; Andries J. Hoitsma
To determine short‐ and long‐term patient and graft survival in obese [body mass index (BMI) ≥ 30 kg/m2] and nonobese (BMI < 30 kg/m2) renal transplant patients, we retrospectively analyzed our national database. Patients 18 years or older receiving a primary transplant after 1993 were included. A total of 1871 patients were included in the nonobese group and 196 in the obese group. In the obese group there were significantly more females (52% vs. 38.6%, P < 0.01) and patients were significantly older [52 years (43–59) vs. 48 years (37–58); P < 0.05]. Patient survival and graft survival were significantly decreased in obese renal transplant recipients (1‐ and 5‐year patient survival were respectively 94% vs. 97% and 81% vs. 89%, P < 0.01; 1‐ and 5‐year graft survival were respectively 86% vs. 92% and 71% vs. 80%, P < 0.01). Initial BMI was an independent predictor for patient death and graft failure. This large retrospective study shows that both graft and patient survival are significantly lower in obese renal transplant recipients.
Transplantation | 2011
Ellen K. Hoogeveen; Jeroen Aalten; Kenneth J. Rothman; Joke I. Roodnat; Marko J.K. Mallat; George F. Borm; Willem Weimar; Andries J. Hoitsma; Johan W. de Fijter
Background. Cardiovascular disease is both a major threat to the life expectancy of kidney transplant recipients and an important determinant of late allograft loss. Obesity is an important risk factor for cardiovascular disease. Methods. We investigated the relation between both pretransplant and 1-year posttransplant body mass index (BMI) with patient and renal graft survival in a cohort of 1810 adult patients. Sixty-one percent of all patients were men; median age (interquartile range [IQR]) was 46 years (35–56 years); median (IQR) pretransplant BMI was 23.0 kg/m2 (20.8–25.6 kg/m2); 1 year after transplantation, the median (IQR) BMI had increased 1.6 kg/m2 (0.3–3.2 kg/m2) and median (IQR) follow-up time was 8.3 years (5.3–12.0 years). We categorized BMI as follows: less than or equal to 20, more than 20 to less than or equal to 25 (normal), more than 25 to less than or equal to 30, and more than 30 (obesity) kg/m2. Results. Using a Cox proportional hazards model, after adjustment for cardiovascular risk factors, the relative risks (95% confidence intervals) of death and death-censored graft failure during all follow-up for pretransplant obesity compared with normal BMI were 1.22 (0.86–1.74) and 1.34 (1.02–1.77), respectively; for obesity 1 year after transplantation compared with normal BMI, it was 1.39 (1.05–1.86) and 1.39 (1.10–1.74), respectively; and for change in BMI (per 5 kg/m2 increment) during the first year after transplantation, it was 1.23 (1.01–1.50) and 1.18 (1.01–1.38), respectively. Conclusions. One year posttransplant BMI and BMI increment are more strongly related to death and graft failure than pretransplant BMI among kidney transplant recipients. Patients with BMI more than 30 kg/m2 compared with a normal BMI have approximately 20% to 40% higher risk for death and graft failure.
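The BMI classes used in this analysis are easy to express as a small helper. A sketch in plain Python (the function names are ours; the cut-offs and median values are those stated in the abstract):

```python
def bmi(weight_kg, height_m):
    """Body mass index = weight (kg) / height (m) squared."""
    return weight_kg / height_m ** 2

def bmi_category(b):
    """BMI (kg/m^2) classes from the abstract:
    <=20, >20-25 (normal), >25-30, >30 (obesity)."""
    if b <= 20:
        return "<=20"
    elif b <= 25:
        return ">20-25 (normal)"
    elif b <= 30:
        return ">25-30"
    return ">30 (obesity)"

# Median pretransplant BMI in the cohort was 23.0 kg/m^2 (normal class);
# the median first-year gain of 1.6 kg/m^2 keeps the median patient
# within the normal class.
print(bmi_category(23.0))
print(bmi_category(23.0 + 1.6))
```

Note that the boundaries are half-open on the upper side (e.g. a BMI of exactly 25.0 falls in the normal class), matching the "more than 20 to less than or equal to 25" wording of the abstract.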
American Journal of Transplantation | 2004
Carla C. Baan; A.M.A. Peeters; Francine Brambate Carvalhinho Lemos; André G. Uitterlinden; Ilias I.N. Doxiadis; Frans Claas; Jan N. M. IJzermans; Joke I. Roodnat; Willem Weimar
Tissue injury is attenuated by the effects of heme oxygenase (HO)‐1. The induction of HO‐1 expression is modulated by a (GT)n dinucleotide polymorphism in the promoter of the gene; increased promoter activity is associated with short (S) (≤27) repeats. We investigated the influence of this HO‐1 gene polymorphism on renal transplant survival.
The Journal of Urology | 2001
J. Herman van Roijen; Wim J. Kirkels; Robert Zietse; Joke I. Roodnat; Willem Weimar; Jan N. M. IJzermans
PURPOSE We ascertain the incidence, management and long-term outcome of early urological complications requiring surgical exploration in kidney transplantation. MATERIALS AND METHODS Data of 695 consecutive kidney transplantations performed between January 1985 and January 1997 were assessed in regard to urological complications that occurred within 1 year after transplant. A computerized database was used to analyze graft recipient characteristics, the implantation procedure, complications and outcome in select patients and all those who underwent transplant during the same period. In the noncomplication group sufficient data for evaluation was available for 556 patients. We performed the Cox proportional hazards analysis with overall graft failure, graft failure or death as end points of observation. RESULTS Overall, 42 (6.0%) patients required revision of vesicoureteral anastomosis. Complications were identified after a median of 6 days (range 0 to 135). The primary reconstruction technique was extravesical in 64% and transvesical in 33% of patients, including 1 that involved ureteral Bricker anastomosis. Obstruction and/or leakage at vesicoureteral anastomosis accounted for 69% of urological complications. Revision was performed with a number of different reconstruction techniques. A second revision was required in 16.7%. Mean followup after primary transplant was 9.1 years (range 3.2 to 15). Multivariate analysis showed that surgical treatment of urological complication during year 1 after kidney transplantation did not increase the risk of overall graft failure, graft failure or death. CONCLUSIONS Our results indicate that long-term graft survival is not affected by a surgically treated urological complication.
Transplantation | 2003
Joke I. Roodnat; Paul G.H. Mulder; I. C. van Riemsdijk; J. N.M. IJzermans; T. van Gelder; W. Weimar
Background. The results of renal transplantation are dependent on many variables. To simplify the decision process related to a kidney offer, the authors wondered which variables had the most important influence on the graft failure risk. Methods. All transplant patients (n=1,124) between January 1981 and July 2000 were included in the analysis (2.6% had missing values). The variables included were: donor and recipient age and gender, recipient original disease, race, donor origin, current smoking, cardiovascular disease, body weight, peak and current panel reactive antibody (PRA), number of preceding transplants, type and duration of renal replacement therapy, and time since failure of native kidneys. Also, human leukocyte antigen (HLA) identity or not, first and second warm and cold ischemia times, left or right kidney and fossa, donor kidney anatomy, donor serum creatinine and proteinuria, and transplantation year were included. Results. In a multivariate model, cold ischemia time and its time-dependent variable significantly influenced the graft failure risk censored for death (P < 0.0001), independent of any of the other risk factors. The influence primarily affected the risk in the first week after transplantation; thereafter, it gradually disappeared during the first year after transplantation. Donor serum creatinine also significantly influenced the graft failure risk in a time-dependent manner (P < 0.0001). The risk associated with a high donor serum creatinine is already increased in the immediate postoperative phase and increases further thereafter; the curve is closely related to the degree of the elevation. The other variables with a significant influence on the graft failure rate were, in order of decreasing significance, recipient age, donor gender, donor age, HLA identity, transplantation year, preceding transplantations, donor origin, and peak PRA. Conclusions. Donor serum creatinine and cold ischemia time are important time-dependent variables independently influencing the risk of graft failure censored for death. The best strategy for improving the results of cadaveric transplantations is to decrease the cold ischemia time and to allocate kidneys from donors with an elevated serum creatinine to low-risk recipients.
Transplantation | 2003
Joke I. Roodnat; I. C. van Riemsdijk; Paul G.H. Mulder; I. Doxiadis; Frans H.J. Claas; J. N.M. IJzermans; T. van Gelder; W. Weimar
Background. The results of living-donor (LD) renal transplantations are better than those of postmortem-donor (PMD) transplantations. To investigate whether this can be explained by a more favorable patient selection procedure in the LD population, we performed a Cox proportional hazards analysis including variables with a known influence on graft survival. Methods. All patients who underwent transplantations between January 1981 and July 2000 were included in the analysis (n=1,124, 2.6% missing values). There were 243 LD transplantations (including 30 unrelated) and 881 PMD transplantations. The other variables included were the following: donor and recipient age and gender, recipient original disease, race, current smoking habit, cardiovascular disease, body weight, peak and current panel reactive antibody, number of preceding transplants and type and duration of renal replacement therapy, and time since failure of native kidneys. In addition, the number of human leukocyte antigen identical combinations, first and second warm and cold ischemia periods, left or right kidney and fossa, donor kidney anatomy, donor serum creatinine and proteinuria, and transplantation year were included. Results. In a multivariate model, donor origin (PMD vs. LD) significantly influenced the graft failure risk censored for death independently of any of the other risk factors (P = 0.0303, relative risk = 1.75). There was no time interaction. When the variable cold ischemia time was excluded from the same model, the significance of the influence of donor origin on the graft failure risk increased considerably, whereas the magnitude of the influence was comparable (P = 0.0004, relative risk = 1.92). The influence of all other variables on the graft failure risk was unaffected when the cold ischemia period was excluded, and excluding any of the other variables did not produce a comparable effect. Donor origin did not influence the death risk. Conclusion.
The superior results of LD versus PMD transplantations can be partly explained by the dichotomy in the cold ischemia period in these populations (selection). However, after adjustment for cold ischemia periods, the influence of donor origin still remained significant, independent of any of the variables introduced. This superiority is possibly caused by factors inherent to the transplanted organ itself, for example, the absence of brain death and cardiovascular instability of the donor before nephrectomy.