Tom Darius
Université catholique de Louvain
Publications
Featured research published by Tom Darius.
American Journal of Transplantation | 2012
Manuel Rodríguez-Perálvarez; G. Germani; Tom Darius; Jan Lerut; Emmanuel Tsochatzis; Andrew K. Burroughs
We hypothesized that current trough concentrations of tacrolimus after liver transplantation are set too high, considering that the clinical consequences of rejection are not severe while side effects are increased. We systematically reviewed 64 studies (32 randomized controlled trials and 32 observational studies) to determine how tacrolimus trough concentrations lower than currently recommended affect acute rejection rates and renal impairment. Among randomized trials, the mean tacrolimus trough concentration during the first month was positively correlated with renal impairment within 1 year (r = 0.73; p = 0.003), but not with acute rejection, whether defined using protocol biopsies (r = −0.37; p = 0.32) or not (r = 0.11; p = 0.49). A meta-analysis of randomized trials directly comparing tacrolimus trough concentrations (five trials for acute rejection [n = 957] and two trials for renal impairment [n = 712]) showed that "reduced tacrolimus" trough concentrations (<10 ng/mL) within the first month after liver transplantation were associated with less renal impairment at 1 year (RR = 0.51 [0.38–0.69]), with no significant influence on acute rejection (RR = 0.92 [0.65–1.31]) compared to "conventional tacrolimus" trough levels (>10 ng/mL). Lower trough concentrations of tacrolimus (6–10 ng/mL during the first month) would be more appropriate after liver transplantation. Regulatory authorities and the pharmaceutical industry should allow changes to the regulatory drug information.
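For illustration, a pooled risk ratio of the kind reported above (RR = 0.51 [0.38–0.69]) can be obtained by inverse-variance pooling of per-trial log risk ratios. The sketch below, in Python, uses hypothetical event counts, not the trial data summarized in the abstract:

```python
# Minimal inverse-variance (fixed-effect) pooling of risk ratios.
# Event counts below are hypothetical placeholders, NOT the trial data
# summarized in the abstract.
import numpy as np
from scipy import stats

# (events_reduced, n_reduced, events_conventional, n_conventional) per trial
trials = [
    (12, 150, 14, 148),
    (20, 210, 22, 205),
    (9,  120, 10, 122),
]

log_rr, weights = [], []
for a, n1, c, n2 in trials:
    rr = (a / n1) / (c / n2)
    se = np.sqrt(1/a - 1/n1 + 1/c - 1/n2)   # standard error of log(RR)
    log_rr.append(np.log(rr))
    weights.append(1 / se**2)               # inverse-variance weight

log_rr, weights = np.array(log_rr), np.array(weights)
pooled = np.sum(weights * log_rr) / np.sum(weights)
se_pooled = np.sqrt(1 / np.sum(weights))

ci = pooled + np.array([-1.96, 1.96]) * se_pooled
z = pooled / se_pooled
p = 2 * (1 - stats.norm.cdf(abs(z)))

print(f"Pooled RR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(ci[0]):.2f}-{np.exp(ci[1]):.2f}), p = {p:.3f}")
```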
19th Annual Belgian Transplantation Society | 2012
Tom Darius; Diethard Monbaliu; Ina Jochmans; Nicolas Meurisse; B Desschans; Willy Coosemans; Mina Komuta; Tania Roskams; David Cassiman; Schalk Van der Merwe; Werner Van Steenbergen; Chris Verslype; Wim Laleman; Raymond Aerts; Frederik Nevens; Jacques Pirenne
BACKGROUND Wider utilization of liver grafts from donors ≥70 years old could substantially expand the organ pool, but their use remains limited by fear of poorer outcomes. We examined our center's results of liver transplantation (OLT) using livers from donors ≥70 years old. METHODS From February 2003 to August 2010, we performed 450 OLTs, including 58 (13%) using donors ≥70 years old, whose outcomes were compared with those using donors <70 years old. RESULTS Cerebrovascular causes of death predominated among donors ≥70 (85% vs 47% in donors <70; P < .001). In contrast, traumatic causes of death predominated among donors <70 (36% vs 14% in donors ≥70; P = .002). Unlike grafts from donors <70 years old, grafts from older individuals had no additional risk factors (steatosis, high sodium, or hemodynamic instability). Both groups were comparable for cold and warm ischemia times. No difference was noted between groups in posttransplant peak transaminases or in the incidence of primary nonfunction, hepatic artery thrombosis, biliary strictures, or retransplantation. The 1- and 5-year patient survivals were 88% and 82% in recipients of livers from donors <70 years old versus 90% and 84% in recipients of livers from donors ≥70 years old (P = .705). Recipients of older grafts, who were 6 years older than recipients of younger grafts (P < .001), tended to have a lower laboratory Model for End-Stage Liver Disease score (P = .074). CONCLUSIONS Short- and mid-term survival following OLT using donors ≥70 years old can be excellent provided that there is adequate donor and recipient selection. Septuagenarians and octogenarians with cerebrovascular ischemic and bleeding accidents represent a large pool of potential donors whose wider use could substantially reduce mortality on the OLT waiting list.
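The P values comparing donor characteristics between groups (e.g., 85% vs 47% cerebrovascular cause of death) are typically derived from a chi-square test on a contingency table. A minimal sketch, assuming hypothetical counts roughly consistent with the reported group sizes and percentages, not the study's raw data:

```python
# Chi-square test on a 2x2 contingency table, the usual way proportions such as
# "85% vs 47% cerebrovascular cause of death" are compared between donor groups.
# The counts below are hypothetical numbers consistent with 58 donors >=70 and
# 392 donors <70; they are not the study's raw data.
from scipy.stats import chi2_contingency

#                 cerebrovascular   other cause
table = [
    [49, 9],      # donors >= 70 years (n = 58)
    [184, 208],   # donors <  70 years (n = 392)
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.4f}")
```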
Liver Transplantation | 2014
Tom Darius; Jairo Rivera; Fabio Fusaro; Quirino Lai; Catherine De Magnee; Christophe Bourdeaux; Magdalena Janssen; Philippe Clapuyt; Raymond Reding
Biliary complications (BCs) remain the Achilles heel of liver transplantation (LT), with an overall incidence of 10% to 35% in pediatric series. We hypothesized that (1) the use of alternative techniques (reduced size, split, and living donor grafts) in pediatric LT may contribute to an increased incidence of BCs, and (2) surgery as a first treatment option for anastomotic BCs could allow a definitive cure for the majority of these patients. Four hundred twenty-nine primary pediatric LT procedures performed between July 1993 and November 2010, comprising 88 whole, 91 reduced size, 47 split, and 203 living donor grafts, were retrospectively reviewed. Demographic and surgical variables were analyzed, and their respective impact on BCs was studied with univariate and multivariate analyses. The modalities of BC management were also reviewed. The 1- and 5-year patient survival rates were 94% and 90%, 89% and 85%, 94% and 89%, and 98% and 94% for whole, reduced size, split, and living donor liver grafts, respectively. The overall incidence of BCs was 23% (n = 98). Sixty were anastomotic complications [47 strictures (78%) and 13 fistulas (22%)]. The graft type was not found to be an independent risk factor for the development of BCs. According to a multivariate analysis, only hepatic artery thrombosis and acute rejection increased the risk of anastomotic BCs (P < 0.001 and P = 0.003, respectively). Anastomotic BCs were managed primarily with surgical repair in 59 of 60 cases, with a primary patency rate of 80% (n = 47). These results suggest that (1) most of the BCs were anastomotic complications not influenced by the type of graft, and (2) the surgical management of anastomotic BCs may constitute the first and best therapeutic option. Liver Transpl 20:893-903, 2014.
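The multivariate analysis identifying hepatic artery thrombosis and acute rejection as independent risk factors is the kind of question usually addressed with multivariable logistic regression. A minimal sketch with statsmodels, where the DataFrame, column names, and simulated values are assumptions for illustration only:

```python
# Sketch of a multivariable logistic regression for anastomotic biliary
# complications. The DataFrame, column names, and simulated data are
# assumptions for illustration; they do not reproduce the study dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 429
df = pd.DataFrame({
    "anastomotic_bc": rng.integers(0, 2, n),          # outcome: 1 = complication
    "hat": rng.integers(0, 2, n),                     # hepatic artery thrombosis
    "acute_rejection": rng.integers(0, 2, n),
    "graft_type": rng.choice(["whole", "reduced", "split", "living"], n),
})

model = smf.logit(
    "anastomotic_bc ~ hat + acute_rejection + C(graft_type)", data=df
).fit(disp=False)

# Exponentiate coefficients to report adjusted odds ratios with 95% CIs.
odds_ratios = pd.DataFrame({
    "OR": np.exp(model.params),
    "2.5%": np.exp(model.conf_int()[0]),
    "97.5%": np.exp(model.conf_int()[1]),
    "p": model.pvalues,
})
print(odds_ratios.round(3))
```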
Clinical Biochemistry | 2014
Antoine Buemi; Flora Musuamba Tshinanu; Stephan Frederic; Anne Douhet; Martine De Meyer; Luc De Pauw; Tom Darius; Nada Kanaan; Pierre Wallemacq; Michel Mourad
OBJECTIVES Delayed graft function (DGF) is still a major issue in kidney transplantation. Plasma and urine neutrophil gelatinase-associated lipocalin (NGAL) were evaluated in a population of kidney donors and recipients to investigate their performance in predicting early renal function. DESIGN AND METHODS Plasma (pNGAL) and urine (uNGAL) samples were obtained from donors before organ procurement, and from recipients before transplantation and then 6, 24 and 48 h after the procedure. Kidney transplantations were performed from both living donors (LDs, n = 17) and deceased donors (DDs, n = 80). Recovery of renal function was evaluated as the time to reach a serum creatinine <2 mg/dL or a glomerular filtration rate (GFR) >40 mL/min. Logistic regression was used to assess the ability of different variables to predict the occurrence of DGF. RESULTS Plasma NGAL levels were significantly lower in LDs than in DDs. No episodes of DGF were recorded among LD kidney recipients, but DGF was observed in 25% of patients in the DD group. There was no correlation between donor pNGAL and uNGAL values and the occurrence of post-transplant DGF. Recipient pNGAL performed better than uNGAL in predicting DGF occurrence. Donor pNGAL and uNGAL values did not influence the time needed to reach serum creatinine levels of <2 mg/dL after transplantation. When the time to reach an eGFR of >40 mL/min is considered, only donor uNGAL seems to be a predictor of graft function recovery. However, recipient pNGAL values obtained 24 and 48 h after transplantation, but not uNGAL values, were found to be a significant predictor of graft function recovery. CONCLUSIONS Plasma NGAL determination in recipients, but not in donors, proved to be a reliable predictor of DGF occurrence and renal function restoration, but the interval required to obtain this information is too long for it to compete with biomarkers currently used in clinical practice.
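The predictive performance of a biomarker such as recipient pNGAL for DGF is commonly quantified with logistic regression and the area under the ROC curve. A minimal sketch with scikit-learn, using simulated values and assumed variable names (pngal_24h, uNGAL analogue), not the study data:

```python
# Sketch comparing two candidate biomarkers as predictors of delayed graft
# function (DGF) with logistic regression and ROC AUC. The simulated values
# are placeholders; the variable names (pngal_24h, ungal_24h) are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 80
dgf = rng.binomial(1, 0.25, n)                      # ~25% DGF, as in the DD group
pngal_24h = rng.normal(150 + 100 * dgf, 60, n)      # plasma NGAL, ng/mL (simulated)
ungal_24h = rng.normal(120 + 30 * dgf, 80, n)       # urine NGAL, ng/mL (simulated)

for name, marker in [("plasma NGAL (24 h)", pngal_24h),
                     ("urine NGAL (24 h)", ungal_24h)]:
    X = marker.reshape(-1, 1)
    clf = LogisticRegression().fit(X, dgf)
    auc = roc_auc_score(dgf, clf.predict_proba(X)[:, 1])
    print(f"{name}: AUC = {auc:.2f}")
```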
Transplant International | 2012
Ina Jochmans; Tom Darius; Dirk Kuypers; Diethard Monbaliu; Eric Goffin; Michel Mourad; Hieu Ledinh; Laurent Weekers; Patrick Peeters; Caren Randon; Jean-Louis Bosmans; Geert Roeyen; Daniel Abramowicz; Anh Dung Hoang; Luc De Pauw; Axel Rahmel; Jean-Paul Squifflet; Jacques Pirenne
The worldwide shortage of standard brain dead donors (DBDs) has revived the use of kidneys donated after circulatory death (DCD). We reviewed the Belgian DCD kidney transplant (KT) experience since its reintroduction in 2000. Risk factors for delayed graft function (DGF) were identified using multivariate analysis. Five-year patient/graft survival was assessed using Kaplan–Meier curves. The evolution of the kidney donor type and the impact of DCDs on the total KT activity in Belgium were compared with those in the Netherlands. Between 2000 and 2009, 287 DCD KTs were performed. Primary nonfunction occurred in 1% and DGF in 31%. Five-year patient and death-censored graft survival were 93% and 95%, respectively. In multivariate analysis, cold storage (versus machine perfusion), cold ischemic time, and histidine-tryptophan-ketoglutarate solution were independent risk factors for the development of DGF. Despite an increased number of DCD donations and transplantations, the total number of deceased donor KTs did not increase significantly. This could suggest a shift from DBDs to DCDs. To increase KT activity, Belgium should further expand controlled DCD programs while simultaneously improving the identification of all potential DBDs and avoiding their referral for donation as DCDs before brain death occurs. Furthermore, living donation remains underused.
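The five-year patient and graft survival figures quoted above come from Kaplan–Meier estimation. A minimal sketch using the lifelines package (assumed available) on simulated follow-up data, not the Belgian registry data:

```python
# Kaplan-Meier estimate of graft survival, the method behind the 5-year figures
# in the abstract. Follow-up times and event indicators below are simulated;
# this illustrates the technique, not the registry data.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(2)
n = 287
followup_years = rng.uniform(0.1, 10.0, n)          # time to event or censoring
graft_loss = rng.binomial(1, 0.05, n)               # 1 = graft loss, 0 = censored

kmf = KaplanMeierFitter(label="DCD kidney grafts")
kmf.fit(durations=followup_years, event_observed=graft_loss)

print(kmf.survival_function_.head())
print("Estimated 5-year graft survival:", round(float(kmf.predict(5.0)), 3))
```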
Transplantation | 2010
Kathleen Claes; Bert Bammens; Pieter Evenepoel; Dirk Kuypers; Willy Coosemans; Tom Darius; Diethard Monbaliu; Jacques Pirenne; Yves Vanrenterghem
Background. Patients on the renal transplant waiting list and renal transplant recipients have an increased risk of premature cardiovascular (CV) disease and death. Methods. We performed a prospective observational study in 331 kidney or kidney-pancreas transplant recipients to test whether Troponin I (TnI), determined at the time of engraftment, can help to identify patients at risk for a major adverse cardiac event (MACE) in the immediate postoperative period. Logistic regression analysis was used to test whether pretransplant TnI is a predictor of MACE within 3 months after transplantation. Results. Eleven patients (3.3%) developed a MACE during the first 2 weeks after transplantation. In patients with a CV history (23.6%), the incidence of MACE increased to 13.4%. In univariate analysis, age (odds ratio [OR] 1.062, P=0.04), TnI (OR 1.12, P=0.0042), HbA1c (OR 1.879, P=0.0076), and CV history (absent vs. present OR 0.027, P=0.0006) were associated with MACE. TnI remained an independent predictor after adjusting for every other significant variable. When we restricted the analysis to patients with a CV history, TnI was the only statistically significant variable associated with MACE. Conclusion. Elevated TnI immediately pretransplant is an independent predictor of MACE in the immediate posttransplant period, particularly in patients with a CV history.
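The odds ratios reported here (e.g., OR 1.12 for TnI) are logistic regression coefficients transformed with OR = exp(β) and 95% CI = exp(β ± 1.96·SE). A worked conversion in Python; β is chosen only so that exp(β) matches the reported OR of 1.12, and the standard error is an arbitrary placeholder:

```python
# Converting a logistic regression coefficient into an odds ratio and 95% CI,
# as reported in studies like this one. The standard error below is a
# hypothetical placeholder, not taken from the paper.
import math

beta = 0.113     # log-odds increase per 1-unit increase in TnI; exp(0.113) ~ 1.12
se = 0.040       # assumed standard error of beta (placeholder)

odds_ratio = math.exp(beta)
ci_low = math.exp(beta - 1.96 * se)
ci_high = math.exp(beta + 1.96 * se)

print(f"OR = {odds_ratio:.2f} per unit TnI "
      f"(95% CI {ci_low:.2f}-{ci_high:.2f})")
```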
Transplantation Proceedings | 2014
F. Dupriez; L. De Pauw; Tom Darius; Michel Mourad; A. Penaloza; D. Van Deynse; C. Baltus; Franck Verschuren
BACKGROUND Since 1999, a protocol for uncontrolled donation after cardio-circulatory death (DCD) has been carried out in our institution. We aimed to evaluate these 14 years of local experience. METHODS We reviewed the charts of uncontrolled donors from 1999 to 2013. Potential donors with a no-flow period of less than 30 minutes were considered. Kidneys were perfused with a double-balloon triple-lumen catheter after a no-touch period of at least 2 minutes. We analyzed graft outcomes and warm and cold ischemia times. RESULTS Thirty-nine procedures were initiated: 19 were aborted because of family refusal (n = 7), medical reasons (n = 7), or cannulation failures (n = 5), and 20 harvesting procedures were completed. Transplantation was considered for 35 kidneys (cold storage [n = 5] and hypothermic preservation system [n = 30]). The causes of withdrawal from transplantation were mostly macroscopic lesions (poor perfusion, macroscopic parenchymal or vascular lesions, or infectious risk). We transplanted 22 kidneys locally and 3 were shipped to another Eurotransplant center. Mean donor age was 40 ± 13 years. Among the 20 donors, 13 came from the emergency unit and 7 from the intensive care unit. Mean no-flow time for out-of-hospital management was 8.7 ± 3.6 minutes. Mean time of cardiopulmonary resuscitation was 71 ± 46 minutes. Mean cold ischemia time was 19 ± 5 hours. Primary nonfunction and delayed graft function occurred in 1 and 12 cases (4.5% and 54%), respectively. Graft survival was 86% at 1 year. Causes of graft loss during the entire follow-up were graft rejection (n = 3), ischemically damaged kidney (n = 2), and recurrence of focal segmental glomerulosclerosis (n = 1). CONCLUSION In our experience, uncontrolled donors represent a valuable source of kidney grafts, with a prognosis of graft function and survival similar to that reported in the literature. To increase the number of available DCD organs, new techniques, such as normothermic extracorporeal membrane oxygenation (NECMO), as well as improved recruitment of out-of-hospital potential donors, should be considered.
Transplantation Proceedings | 2010
Daan Dierickx; Diethard Monbaliu; A. De Rycke; E. Wisanto; Evelyne Lerut; Timothy Devos; S. Meers; Tom Darius; Patrick Ferdinande; Jacques Pirenne
BACKGROUND Transplant-related thrombotic microangiopathy (TMA) is a well-recognized complication of all types of transplantation. Despite its known relationship with immunosuppressive therapy, only a few cases have been reported following intestinal transplantation. METHODS We retrospectively reviewed the medical files of nine consecutive intestinal transplant patients between 2000 and 2008. RESULTS The diagnosis of TMA was established in 3 patients (33%). At diagnosis, the immunosuppressive therapy consisted of tacrolimus (n = 3), combined with azathioprine (n = 1) or sirolimus (n = 2) and steroids (n = 2). The median time between transplantation and TMA was 104 days (range, 55-167 days). Levels of ADAMTS13, the von Willebrand factor-cleaving protease, were within the normal range in all 3 patients. Treatment consisted of stopping or tapering tacrolimus, together with initiation of plasma therapy, leading to complete remission in all 3 patients. During further follow-up, all 3 patients showed severe graft rejection necessitating more profound immunosuppressive therapy, leading to graft loss in 1 patient and infection-related death in the 2 others. At a median follow-up of 52 months (range, 9-100 months), all remaining TMA-free patients (n = 6) were alive with functioning grafts under minimal immunosuppression. CONCLUSION Herein we have described 3 intestinal transplant patients who were diagnosed with transplantation-related TMA. Despite excellent disease control, the final outcomes were dismal, which clearly contrasts with the outcome among TMA-free patients, who were all well with functioning grafts at last follow-up.
American Journal of Transplantation | 2013
Manuel Rodríguez-Perálvarez; G. Germani; Tom Darius; Jan Lerut; Emmanuel Tsochatzis; M. de la Mata; Andrew K. Burroughs
We read with great interest the randomized controlled trial (RCT) by De Simone et al. (1), which evaluated the combination of tacrolimus and everolimus after liver transplantation (LT). They demonstrated that glomerular filtration rate decreased less at 12 months with everolimus and reduced tacrolimus, compared to a conventional tacrolimus-based regimen, while rejection rates were similar. Although these results are promising, we are very concerned about what is meant by "conventional exposure to tacrolimus" in RCTs, and its potential implications for clinical practice and patient safety.
American Journal of Transplantation | 2013
Manuel Rodríguez-Perálvarez; G. Germani; Tom Darius; Jan Lerut; Emmanuel Tsochatzis; Andrew K. Burroughs
The recently published PROTECT trial (1) evaluated the renal-sparing effect of using everolimus and tapering calcineurin inhibitors (CNIs) in liver transplantation (LT). After 1 month, 54% of patients who met the randomization criteria (which included being rejection free for at least 2 weeks and having a glomerular filtration rate >50 mL/min) were randomized as follows: (a) an experimental arm in which everolimus was started and the CNI dose was progressively reduced over 8 weeks and then stopped; (b) a control arm in which the CNI dosage was maintained. The primary endpoint was renal function at 12 months after LT. However, the benefit of using everolimus and discontinuing CNIs was very limited in terms of glomerular filtration rate. The authors concluded that renal-sparing strategies using everolimus deserve further investigation in LT patients.