Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Johan W. de Fijter is active.

Publication


Featured research published by Johan W. de Fijter.


European Journal of Immunology | 2000

The effect of calcineurin inhibitors and corticosteroids on the differentiation of human dendritic cells

Johan W. de Fijter; Sylvia W.A. Kamerling; Leendert C. Paul; Mohamed R. Daha; Cees van Kooten

Corticosteroids and the calcineurin inhibitors cyclosporin A (CsA) and FK506 have been studied extensively regarding their effects on T lymphocytes, but their effects on dendritic cells (DC) are relatively unknown. Monocytes are one of the precursors of DC and differentiate into CD14−CD1a+ immature DC upon culture with IL-4 and GM-CSF. The presence of CsA or FK506 during differentiation did not affect DC development. In contrast, the presence of corticosteroids, either dexamethasone (Dex) or prednisolone (Pred), for as little as the first 48 h of culture blocked the generation of immature DC. Dex-DC were unresponsive to signals inducing maturation (CD40 ligand, lipopolysaccharide), as demonstrated by the absence of CD83, CD80/CD86 and HLA-DR up-regulation and their strongly reduced T cell stimulatory capacity. Furthermore, Dex-DC showed decreased CD40 ligand-induced IL-6 and TNF-α production and a complete block in IL-12p40 production, while IL-10 production was unaffected. CsA-DC and FK506-DC showed a partial reduction in the production of TNF-α, whereas all other functional activities appeared to be similar to those of control DC. These data show that, compared with calcineurin inhibitors, corticosteroids have a unique and profound inhibitory effect on the generation and function of DC.


The Lancet | 1999

Effect of simultaneous pancreas-kidney transplantation on mortality of patients with type-1 diabetes mellitus and end-stage renal failure

Y.F.C. Smets; Rudi G. J. Westendorp; Johan W van der Pijl; Frank Th de Charro; Jan Ringers; Johan W. de Fijter; H. H. P. J. Lemkes

BACKGROUND Long-term prognosis of patients with type-1 diabetes mellitus and end-stage renal failure appears to be better after kidney transplantation than with dialysis. Controversy exists about the additional benefit of a simultaneously transplanted pancreatic graft. We studied the effect on mortality of simultaneous pancreas-kidney transplantation compared with kidney transplantation alone, exploiting regional differences in transplantation protocols. METHODS All 415 patients with type-1 diabetes (aged 18-52 years) who started renal-replacement therapy in the Netherlands between 1985 and 1996 were included in the analysis. Patients were allocated to a centre based on their place of residence at onset of renal failure. In the Leiden area, the primary intention was to treat with simultaneous pancreas-kidney transplantation, whereas in the non-Leiden area, kidney transplantation alone was the predominant type of treatment. All patients were followed up to July, 1997. Mortality and graft failure were analysed with a Cox proportional-hazards model adjusted for age and sex. FINDINGS Simultaneous pancreas-kidney transplantation was done in 41 (73%) of 56 transplanted patients in the Leiden area compared with 59 (37%) of 158 transplanted patients in the non-Leiden area (p<0.001). The hazard ratio for mortality after the start of renal-replacement therapy was 0.53 (95% CI 0.36-0.77, p<0.001) in the Leiden area compared with the non-Leiden area. When only the transplanted patients were analysed, the mortality ratio was 0.40 (95% CI 0.20-0.77, p=0.008) and was independent of duration of dialysis and early transplant-related deaths. Survival was equal for patients treated with dialysis only. INTERPRETATION These data support the hypothesis that simultaneous pancreas-kidney transplantation prolongs survival in patients with diabetes and end-stage renal failure.
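The mortality comparison above relies on a Cox proportional-hazards model adjusted for age and sex. As a much simpler illustration of the underlying idea, the sketch below computes a crude (unadjusted) incidence-rate ratio with a 95% confidence interval from event counts and follow-up time. The numbers are invented for illustration, not taken from the trial, and this is not the adjusted Cox analysis the authors used.

```python
import math

def rate_ratio(events_a, person_years_a, events_b, person_years_b):
    """Crude incidence-rate ratio of group A vs. group B with a 95% CI
    (log-normal approximation for two Poisson rates). Illustration only."""
    rr = (events_a / person_years_a) / (events_b / person_years_b)
    # Standard error of log(RR) when both counts are Poisson
    se = math.sqrt(1 / events_a + 1 / events_b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts: 20 deaths over 500 person-years in one region
# versus 40 deaths over 520 person-years in the other
rr, lo, hi = rate_ratio(20, 500, 40, 520)
```

A ratio below 1 with an upper confidence limit below 1 would, in this simplified setting, correspond to the kind of mortality advantage the study reports.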


Transplantation | 2008

Comparing Mycophenolate Mofetil Regimens for de Novo Renal Transplant Recipients: The Fixed-Dose Concentration-Controlled Trial

Teun van Gelder; Helio Tedesco Silva; Johan W. de Fijter; Klemens Budde; Dirk Kuypers; Gunnar Tydén; Aleksander Lõhmus; Claudia Sommerer; Anders Hartmann; Yann Le Meur; Michael Oellerich; David W. Holt; Burkhard Tönshoff; Paul Keown; Scott B. Campbell; Richard D. Mamelok

Background. Fixed-dose mycophenolate mofetil (MMF) reduces the incidence of acute rejection after solid organ transplantation. The Fixed-Dose Concentration Controlled trial assessed the feasibility and potential benefit of therapeutic drug monitoring in patients receiving MMF after de novo renal transplant. Methods. Patients were randomized to a concentration-controlled (n=452; target exposure 45 mg hr/L) or a fixed-dose (n=449) MMF-containing regimen. The primary endpoint was treatment failure (a composite of biopsy-proven acute rejection [BPAR], graft loss, death, or MMF discontinuation) by 12 months posttransplantation. Results. Mycophenolic acid (MPA) exposures for both groups were similar at most time points and were below 30 mg hr/L in 37.3% of patients at day 3. There was no difference in the incidence of treatment failure (25.6% vs. 25.7%, P=0.81) or BPAR (14.9% vs. 15.5%, P>0.05) between the concentration-controlled and the fixed-dose groups, respectively. We did find a significant relationship between the MPA area under the concentration-time curve on day 3 and the incidence of BPAR in the first month (P=0.009) and in the first year posttransplantation (P=0.006). For later time points (day 10, month 1) there was no significant relationship between area under the concentration-time curve and BPAR (P=0.2572 and P=0.5588, respectively). Conclusions. There was no difference in the incidence of treatment failure between the concentration-controlled and the fixed-dose groups. The applied protocol of MMF dose adjustments based on target MPA exposure was not successful, partly because physicians seemed reluctant to implement substantial dose changes. Current initial MMF doses underexpose more than 35% of patients early after transplantation, increasing the risk for BPAR.
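The trial's exposure metric is the MPA area under the concentration-time curve, with a 45 mg hr/L target and early under-exposure defined as below 30 mg hr/L. As a hedged sketch of how such an AUC might be checked against the threshold, the code below applies the linear trapezoidal rule to an invented 0-12 h concentration profile; real MPA monitoring typically relies on sparse sampling and limited-sampling formulas rather than a full curve.

```python
def auc_trapezoid(times, concs):
    """Linear trapezoidal AUC over paired (time in h, concentration in mg/L)
    samples: sum of trapezoid areas between consecutive points."""
    return sum(
        (t2 - t1) * (c1 + c2) / 2
        for (t1, c1), (t2, c2) in zip(zip(times, concs), zip(times[1:], concs[1:]))
    )

# Hypothetical 0-12 h MPA profile (mg/L) -- illustrative values only
times = [0, 0.5, 1, 2, 4, 6, 9, 12]
concs = [1.0, 8.0, 12.0, 6.0, 3.0, 2.5, 2.0, 1.5]

auc = auc_trapezoid(times, concs)      # mg hr/L over the dosing interval
underexposed = auc < 30                # the day-3 threshold discussed above
```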


Stem Cells Translational Medicine | 2013

Autologous Bone Marrow-Derived Mesenchymal Stromal Cells for the Treatment of Allograft Rejection After Renal Transplantation: Results of a Phase I Study

Marlies E.J. Reinders; Johan W. de Fijter; Helene Roelofs; Ingeborg M. Bajema; Dorottya K. de Vries; Alexander F. Schaapherder; Frans H.J. Claas; Paula P.M.C. van Miert; Dave L. Roelen; Cees van Kooten; Willem E. Fibbe; Ton J. Rabelink

Despite excellent short‐term results, long‐term survival of transplanted kidneys has not improved accordingly. Although alloimmune responses and calcineurin inhibitor‐related nephrotoxicity have been identified as main drivers of fibrosis, no effective treatment options have emerged. In this perspective, mesenchymal stromal cells (MSCs) are an interesting candidate because of their immunosuppressive and regenerative properties. Of importance, no other clinical studies have investigated their effects in allograft rejection and fibrosis. We performed a safety and feasibility study in kidney allograft recipients to whom two intravenous infusions (1 million cells per kilogram) of autologous bone marrow (BM) MSCs were given, when a protocol renal biopsy at 4 weeks or 6 months showed signs of rejection and/or an increase in interstitial fibrosis/tubular atrophy (IF/TA). Six patients received MSC infusions. Clinical and immune monitoring was performed up to 24 weeks after MSC infusions. MSCs fulfilled the release criteria, infusions were well‐tolerated, and no treatment‐related serious adverse events were reported. In two recipients with allograft rejection, we had a clinical indication to perform surveillance biopsies and are able to report on the potential effects of MSCs in rejection. Although maintenance immunosuppression remained unaltered, there was a resolution of tubulitis without IF/TA in both patients. Additionally, three patients developed an opportunistic viral infection, and five of the six patients displayed a donor‐specific downregulation of the peripheral blood mononuclear cell proliferation assay, not reported in patients without MSC treatment. Autologous BM MSC treatment in transplant recipients with subclinical rejection and IF/TA is clinically feasible and safe, and the findings are suggestive of systemic immunosuppression.


Transplantation | 2003

Early versus late acute rejection episodes in renal transplantation.

Yvo W.J. Sijpkens; Ilias I.N. Doxiadis; Marko J.K. Mallat; Johan W. de Fijter; Jan A. Bruijn; Frans H.J. Claas; Leendert C. Paul

Background. Acute rejection is a major complication after renal transplantation and the most important risk factor for chronic rejection. We investigated whether the timing of the last treated acute rejection episode (ARE) influences long-term outcome and compared the risk profiles of early versus late ARE. Methods. A cohort of 654 patients who underwent cadaveric renal transplants (1983–1997) that functioned for more than 6 months was studied. In 384 of 654 transplant recipients, one or more treated AREs were documented; the last ARE occurred in 297 of 384 transplant recipients within 3 months and in 87 of 384 after 3 months. Applying multivariate logistic regression analysis, we compared the predictor variables of the two groups with transplants without AREs. Results. Ten-year graft survival rates censored for causes of graft loss other than chronic rejection were 94%, 86%, and 45% for patients without ARE, with early ARE, and with late ARE, respectively. Delayed graft function, odds ratio (OR) 2.37 (1.55–3.62), and major histocompatibility complex (MHC) class II incompatibility, OR 2.28 (1.62–3.20) per human leukocyte antigen (HLA)-DR mismatch, were independent risk factors for early ARE. In contrast, recipient age, OR 0.75 (0.61–0.93) per 10-year increase, donor age, OR 1.28 (1.07–1.53) per 10-year increase, female donor gender, OR 1.74 (1.03–2.94), and MHC class I incompatibility, OR 1.35 (1.07–1.72) per mismatch of cross reactive groups, were associated with late ARE. Conclusions. Late ARE has a detrimental impact on long-term graft survival and is associated with MHC class I incompatibility, whereas early ARE is correlated with HLA-DR mismatches and has a better prognosis. These data are consistent with the role of direct and indirect allorecognition in the pathophysiology of early and late ARE, respectively.
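The odds ratios above come from multivariate logistic regression. For intuition about the measure itself, an unadjusted odds ratio with a 95% confidence interval can be computed directly from a 2x2 table using Woolf's log method; the counts below are hypothetical and not taken from the study, and a crude OR like this lacks the adjustment for covariates that the authors applied.

```python
import math

def odds_ratio_woolf(a, b, c, d):
    """Unadjusted odds ratio for a 2x2 table:
    a = exposed cases,   b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    Returns (OR, 95% CI lower bound, upper bound) via Woolf's method."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR): sqrt of summed reciprocal cell counts
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical: 30/100 rejections with delayed graft function
# versus 40/300 rejections without it
or_, lo, hi = odds_ratio_woolf(30, 70, 40, 260)
```

A lower confidence limit above 1 would mark the exposure as a statistically significant risk factor in this simplified, unadjusted setting.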


American Journal of Transplantation | 2005

Association Between Mannose‐Binding Lectin Levels and Graft Survival in Kidney Transplantation

Stefan P. Berger; Anja Roos; Marko J.K. Mallat; Teizo Fujita; Johan W. de Fijter; Mohamed R. Daha

The mannose‐binding lectin (MBL) pathway of complement is activated by pattern recognition. Genetic MBL variants are frequent and associated with low MBL serum levels. Higher MBL levels may be associated with more complement‐mediated damage resulting in inferior graft survival.


Therapeutic Drug Monitoring | 2009

Explaining Variability in Tacrolimus Pharmacokinetics to Optimize Early Exposure in Adult Kidney Transplant Recipients

Rogier R. Press; Bart A. Ploeger; Jan den Hartigh; Tahar van der Straaten; Johannes van Pelt; Meindert Danhof; Johan W. de Fijter; Henk-Jan Guchelaar

To prevent acute rejection episodes, it is important to reach adequate tacrolimus (TRL) exposure early after kidney transplantation. With a better understanding of the high variability in the pharmacokinetics of TRL, the starting dose can be individualized, resulting in a reduction in dose adjustments to obtain the target exposure. A population pharmacokinetic analysis was performed to estimate the effects of demographic factors, hematocrit, serum albumin concentration, prednisolone dose, TRL dose interval, polymorphisms in genes coding for ABCB1, CYP3A5, CYP3A4, and the pregnane X receptor on TRL pharmacokinetics. Pharmacokinetic data were prospectively obtained in 31 de novo kidney transplant patients randomized to receive TRL once or twice daily, and subsequently, the data were analyzed by means of nonlinear mixed-effects modeling. TRL clearance was 1.5-fold higher for patients with the CYP3A5*1/*3 genotype compared with the CYP3A5*3/*3 genotype (5.5 ± 0.5 L/h versus 3.7 ± 0.3 L/h, respectively). This factor explained 30% of the interindividual variability in apparent clearance (exposure). Also, a relationship between the pregnane X receptor A+7635G genotype and TRL clearance was identified with a clearance of 3.9 ± 0.3 L/h in the A allele carriers versus 5.4 ± 0.6 L/h in the GG genotype. Finally, a concomitant prednisolone dose of more than 10 mg/d increased the TRL apparent clearance by 15%. In contrast, body weight was not related to TRL clearance in this population. Because patients are typically dosed per kilogram body weight, this might result in underexposure and overexposure in patients, with a low and high body weight, respectively. This integrated analysis shows that adult renal transplant recipients with the CYP3A5*1/*3 genotype require a 1.5 times higher, fixed, starting dose compared with CYP3A5*3/*3 to reach the predefined target exposure early after transplantation.
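The practical upshot of the model above is that the starting dose can be individualized by genotype: at steady state the exposure over a dosing interval scales as AUC = dose / clearance, so a CYP3A5 expresser (*1/*3) with roughly 1.5-fold higher clearance needs roughly 1.5-fold the dose to reach the same target. The sketch below encodes that proportionality using the clearance estimates reported in the abstract; the target AUC value is a placeholder, and this is an illustration of the scaling argument, not a dosing recommendation.

```python
# Typical tacrolimus clearance estimates (L/h) from the abstract above
CLEARANCE = {"CYP3A5*1/*3": 5.5, "CYP3A5*3/*3": 3.7}

def starting_dose_mg(genotype, target_auc):
    """Dose (mg per dosing interval) needed so that the steady-state AUC
    over the interval hits the target: AUC = dose / CL  =>  dose = AUC * CL.
    Simplified (assumes complete bioavailability); illustrative only."""
    return target_auc * CLEARANCE[genotype]

target = 40.0  # placeholder target AUC (mg hr/L), not from the study
d_expresser = starting_dose_mg("CYP3A5*1/*3", target)
d_nonexpresser = starting_dose_mg("CYP3A5*3/*3", target)

ratio = d_expresser / d_nonexpresser  # reproduces the ~1.5x factor reported
```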


Transplantation | 2000

The prevalence of human papillomavirus DNA in benign keratotic skin lesions of renal transplant recipients with and without a history of skin cancer is equally high: a clinical study to assess risk factors for keratotic skin lesions and skin cancer.

Linda M. de Jong-Tieben; Ron J. M. Berkhout; Jan ter Schegget; Bert Jan Vermeer; Johan W. de Fijter; Jan A. Bruijn; Rudi G. J. Westendorp; Jan Nico Bouwes Bavinck

DNA of the epidermodysplasia-verruciformis-associated subgroup of HPV (EV-HPV) is frequently detected in biopsies of premalignant lesions and nonmelanoma skin cancers of renal transplant recipients. The prevalence of EV-HPVs, however, has never been systematically studied in benign keratotic skin lesions of patients with or without a history of skin cancer. This study included 42 renal transplant recipients with and 36 without a history of skin cancer. A total of 176 skin biopsies were tested for the presence of EV-HPV DNA using a nested polymerase chain reaction (PCR). Method. EV-HPV typing was done by comparing the sequences of the amplified PCR products with the sequences of all known EV-HPVs. The natural history of the development of keratotic skin lesions was studied. The number of keratotic skin lesions increased rapidly after transplantation. This increase was most pronounced in patients who developed skin cancer. The prevalence of EV-HPV DNA in benign keratotic skin lesions was equally high in patients with and without a history of skin cancer, i.e., 55% and 53% in the two groups, respectively. A large variety of EV-HPV types was found, but none was predominant in either patient group. A higher prevalence of EV-HPV DNA was found in benign skin lesions from sun-exposed sites, but only in patients with a history of skin cancer. The association between the number of keratotic skin lesions and the development of skin cancer strongly supports the hypothesis that EV-HPVs play a role in cutaneous oncogenesis. The equally high prevalence of EV-HPV infection in patients with and without a history of skin cancer, however, may indicate that besides EV-HPV infection, other factors, such as sun exposure, may also be important.

Renal transplant recipients are at an increased risk for warts and nonmelanoma skin cancer, of which squamous cell carcinomas are the most prevalent (1-6). The prevalence and number of warts increase steadily after transplantation (4, 7-10), and these lesions precede the development of skin cancer by some years (11). The number of keratotic skin lesions is strongly associated with the development of skin cancer (3, 11). DNA of human papillomaviruses (HPV), mainly belonging to the epidermodysplasia-verruciformis (EV)-related subgroup, has frequently been detected in skin cancers and premalignant skin lesions of renal transplant recipients (12-23). The prevalence of EV-HPVs in benign keratotic skin lesions of renal transplant recipients with and without a history of skin cancer has not been studied before. To address this question, we took biopsies of 176 clinically benign keratotic skin lesions and normal skin in 42 patients with and 36 without a history of skin cancer. Keratotic skin lesions were counted in a group of 66 patients in whom keratotic skin lesions had been counted 7 years previously (11). The relationship between clinical and histological diagnoses was examined in a large number of skin lesions. In addition, the associations of age, gender, time after transplantation, sun exposure, and skin type with the number of keratotic skin lesions, the presence of HPV DNA in these lesions, and skin cancer were studied.


Transplantation | 2005

Non-heart-beating donor kidneys in the Netherlands: allocation and outcome of transplantation.

Karin M. Keizer; Johan W. de Fijter; Bernadette J. J. M. Haase-Kromwijk; Willem Weimar

Background. Since February 1, 2001, kidneys from both heart-beating (HB) and non–heart-beating (NHB) donors in The Netherlands have been indiscriminately allocated through the standard renal-allocation system. Methods. Renal function and allograft-survival rate for kidneys from NHB and HB donors were compared at 3 and 12 months. Results. The outcomes of 276 renal transplants, 176 from HB donors and 100 from NHB III donors, allocated through the standard renal allocation system, Eurotransplant Kidney Allocation System, and performed between February 1, 2001 and March 1, 2002 were compared. Three months after transplantation, graft survival was 93.7% for HB kidneys and 85.0% for NHB kidneys (P<0.05). At 12 months, graft survival was 92.0% and 83.0%, respectively (P<0.03). Serum creatinine levels in the two groups were comparable at both 3 and 12 months. Multivariate analysis identified previous kidney transplantation (relative risk [RR] 3.33; P<0.005), donor creatinine (RR 1.01; P<0.005), and NHB (RR 2.38; P<0.05) as independent risk factors for transplant failure within 12 months. In multivariate analysis of NHB data, a warm ischemia time (WIT) of 30 minutes or longer (P<0.005; RR 6.16, 95% confidence interval 2.11–18.00) was associated with early graft failure. No difference in 12-month graft survival was seen between HB and NHB kidneys after excluding the kidneys that failed in the first 3 months. Conclusion. Early graft failure was significantly more likely in recipients of kidneys from NHB donors. A prolonged WIT was strongly associated with this failure. Standard allocation procedures do not have a negative effect on outcome, and there is no reason to allocate NHB kidneys differently from HB kidneys.


Journal of Clinical Oncology | 2013

Two-Year Randomized Controlled Prospective Trial Converting Treatment of Stable Renal Transplant Recipients With Cutaneous Invasive Squamous Cell Carcinomas to Sirolimus

Judith M. Hoogendijk-van den Akker; Paul N. Harden; Andries J. Hoitsma; Charlotte M. Proby; Ron Wolterbeek; Jan Nico Bouwes Bavinck; Johan W. de Fijter

PURPOSE In light of the significant morbidity and mortality of cutaneous invasive squamous cell carcinomas (SCCs) in renal transplant recipients, we investigated whether conversion to sirolimus-based immunosuppression from standard immunosuppression could diminish the recurrence rate of these skin cancers. PATIENTS AND METHODS In a 2-year randomized controlled trial, 155 renal transplant recipients with at least one biopsy-confirmed SCC were stratified according to age (< 55 v ≥ 55 years) and number of previous SCCs (one to nine v ≥ 10) and randomly assigned to conversion to sirolimus (n = 74) or continuation of their original immunosuppression (n = 81). Development of a new SCC within 2 years after random assignment was the primary end point. RESULTS After 2 years of follow-up, the risk reduction of new SCCs in the multivariable analysis was not significant, with a hazard ratio (HR) of 0.76 (95% CI, 0.48 to 1.2; P = .255), compared with a non-sirolimus-based regimen. After the first year, there was a significant 50% risk reduction, with an HR of 0.50 (95% CI, 0.28 to 0.90; P = .021) for all patients together and an HR of 0.11 (95% CI, 0.01 to 0.94; P = .044) for patients with only one previous SCC. The tumor burden of SCC was reduced during the 2-year follow-up period in those receiving sirolimus (0.82 v 1.38 per year; HR, 0.51; 95% CI, 0.32 to 0.82; P = .006) if adjusted for the number of previous SCCs and age. Twenty-nine patients stopped taking sirolimus because of various adverse events. CONCLUSION Conversion to sirolimus-based immunosuppression failed to show a benefit in terms of SCC-free survival at 2 years.

Collaboration


Dive into Johan W. de Fijter's collaborations.

Top Co-Authors

Cees van Kooten, Leiden University Medical Center
Marko J.K. Mallat, Leiden University Medical Center
Frans H.J. Claas, Leiden University Medical Center
Leendert C. Paul, Leiden University Medical Center
Jan Ringers, Leiden University Medical Center
Ton J. Rabelink, Leiden University Medical Center
Paul J.M. van der Boog, Leiden University Medical Center
Ingeborg M. Bajema, Leiden University Medical Center