Marc Clancy
Western Infirmary
Publications
Featured research published by Marc Clancy.
Experimental Physiology | 2007
Damian Marshall; Mark Dilworth; Marc Clancy; Christopher A. Bravery; Nick Ashton
Renal failure and end-stage renal disease are prevalent diseases associated with high levels of morbidity and mortality, the preferred treatment for which is kidney transplantation. However, the gulf between supply and demand for kidneys remains wide and grows every year. A potential alternative to the transplantation of mature adult kidneys is the transplantation of the developing renal primordium, the metanephros. It has been shown previously, in rodent models, that transplantation of a metanephros can provide renal function capable of prolonging survival in anephric animals. The aim of the present study was to determine whether increasing the mass of transplanted tissue can prolong survival further. Embryonic day 15 rat metanephroi were transplanted into the peritoneum of anaesthetized adult rat recipients. Twenty-one days later, the transplanted metanephroi were anastomosed to the recipients' urinary system, and 35 days following anastomosis the animals' native renal mass was removed. Survival times and composition of the excreted fluid were determined. Rats with single metanephros transplants survived 29 h longer than anephric controls (P < 0.001); animals with two metanephroi survived 44 h longer (P < 0.001). A dilute urine was formed, with low concentrations of sodium, potassium and urea; potassium and urea concentrations were elevated in terminal serum samples, but sodium concentration and osmolality were comparable to control values. These data show that survival time is proportional to the mass of functional renal tissue. While transplanted metanephroi cannot currently provide life-sustaining renal function, this approach may have therapeutic benefit in the future.
PLOS ONE | 2013
Marc Gingell-Littlejohn; Dagmara McGuinness; Liane McGlynn; David Kingsmore; Karen Stevenson; Christian Koppelstaetter; Marc Clancy; Paul G. Shiels
CDKN2A is a proven and validated biomarker of ageing, which acts as an off switch for cell proliferation. We have demonstrated previously that CDKN2A is the most robust and the strongest pre-transplant predictor of post-transplant serum creatinine when compared to “Gold Standard” clinical factors, such as cold ischaemic time and donor chronological age. This report shows that CDKN2A is better than telomere length, the most celebrated biomarker of ageing, as a predictor of post-transplant renal function. It also shows that CDKN2A is as strong a determinant of post-transplant organ function as extended criteria donor (ECD) status. A multivariate analysis model was able to predict up to 27.1% of the variation in eGFR at one year post-transplant (p = 0.008). Significantly, CDKN2A was also able to strongly predict delayed graft function. A pre-transplant donor risk classification system based on CDKN2A and ECD criteria is shown to be feasible and commendable for implementation in the near future.
The Lancet | 2016
Emma Aitken; Andrew M. Jackson; Rachel Kearns; Mark Steven; John Kinsella; Marc Clancy; Alan J. R. Macfarlane
BACKGROUND Arteriovenous fistulae are the optimum form of vascular access in end-stage renal failure. However, they have a high early failure rate. Regional compared with local anaesthesia results in greater vasodilatation and increases short-term blood flow. This study investigated whether regional compared with local anaesthesia improved medium-term arteriovenous fistula patency. METHODS This observer-blinded, randomised controlled trial was done at three university hospitals in Glasgow, UK. Adults undergoing primary radiocephalic or brachiocephalic arteriovenous fistula creation were randomly assigned (1:1; in blocks of eight) using a computer-generated allocation system to receive either local anaesthesia (0·5% L-bupivacaine and 1% lidocaine injected subcutaneously) or regional (brachial plexus block [BPB]) anaesthesia (0·5% L-bupivacaine and 1·5% lidocaine with epinephrine). Patients were excluded if they were coagulopathic, had no suitable vessels, or had a previous failed ipsilateral fistula. The primary endpoint was arteriovenous fistula patency at 3 months. We analysed the data on an intention-to-treat basis. This study was registered with ClinicalTrials.gov (NCT01706354) and is complete. FINDINGS Between Feb 6, 2013, and Dec 4, 2015, 163 patients were assessed for eligibility and 126 patients were randomly assigned to local anaesthesia (n=63) or BPB (n=63). All patients completed follow-up on an intention-to-treat basis. Primary patency at 3 months was higher in the BPB group than the local anaesthesia group (53 [84%] of 63 patients vs 39 [62%] of 63; odds ratio [OR] 3·3 [95% CI 1·4-7·6], p=0·005) and was greater in radiocephalic fistulae (20 [77%] of 26 patients vs 12 [48%] of 25; OR 3·6 [1·4-3·6], p=0·03). There were no significant adverse events related to the procedure. INTERPRETATION Compared with local anaesthesia, BPB significantly improved 3-month primary patency rates for arteriovenous fistulae.
FUNDING Regional Anaesthesia UK, Darlinda's Charity for Renal Research.
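The reported odds ratio for the primary endpoint can be reproduced from the quoted patency counts; a minimal sketch using the standard Wald approximation (illustrative only, not the trial's actual statistical software):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with Wald 95% CI from a 2x2 table:
    a/b = events/non-events in group 1, c/d = events/non-events in group 2."""
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# BPB: 53 of 63 fistulae patent (10 not); local anaesthesia: 39 of 63 (24 not)
or_, lo, hi = odds_ratio_ci(53, 10, 39, 24)
print(f"OR {or_:.1f} (95% CI {lo:.1f}-{hi:.1f})")  # OR 3.3 (95% CI 1.4-7.6)
```

The result matches the trial's reported 3·3 (1·4-7·6) to one decimal place.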
Transplantation | 2017
H. Whalen; J. Glen; Victoria Harkins; Katherine K. Stevens; Alan G. Jardine; Colin C. Geddes; Marc Clancy
Background High intrapatient tacrolimus variability has been associated with worse clinical outcomes after renal transplantation. Theoretically, tacrolimus levels consistently outside the target therapeutic window may result in allograft dysfunction: subtherapeutic tacrolimus levels predispose to episodes of acute rejection, whereas supratherapeutic levels may cause nephrotoxicity. Methods We investigated the effect of tacrolimus variability in a “Symphony”-style low-dose tacrolimus-based regimen by collecting data from 432 patients over a 4-year period. Three hundred seventy-six patients were included, with a mean follow-up of 1495 days. Tacrolimus variability 6 to 12 months after renal transplantation was calculated, and outcomes were compared in low (n = 186) and high variability (n = 190) groups. Results High variability patients were found to be at increased risk of rejection during the first posttransplant year (P = 0.0054) and to have reduced rejection-free survival (hazard ratio, 1.953; 95% confidence interval, 1.234-3.093; P = 0.0054). High variability patients had significantly worse (P < 0.0001) glomerular filtration rates at 1, 2, 3, and 4 years posttransplant. High variability patients were at increased risk of allograft loss (hazard ratio, 4.928; 95% confidence interval, 2.050-11.85; P = 0.0004). Conclusions This suggests that highly variable tacrolimus levels predict worse outcomes after renal transplantation, although the causal nature of this relationship remains unclear. High tacrolimus variability may identify a subset of patients who warrant increased surveillance and patient education regarding dietary and medication compliance.
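Intrapatient variability is commonly quantified as the coefficient of variation of trough levels; the abstract does not state the study's exact formula, so the sketch below, with invented trough values, is illustrative only:

```python
import statistics

def tacrolimus_cv(trough_levels):
    """Intrapatient variability as coefficient of variation (%):
    100 * SD / mean of tacrolimus trough levels. A commonly used
    metric; not necessarily the study's exact definition."""
    return 100 * statistics.stdev(trough_levels) / statistics.mean(trough_levels)

# Hypothetical trough levels (ng/mL), months 6-12 post-transplant
stable   = [6.1, 5.8, 6.4, 6.0, 5.9, 6.2]   # tight around target
variable = [3.2, 9.8, 5.1, 12.0, 4.4, 8.6]  # swings above and below target
print(f"stable patient:   CV {tacrolimus_cv(stable):.1f}%")
print(f"variable patient: CV {tacrolimus_cv(variable):.1f}%")
```

A cohort can then be split at the median CV into the low and high variability groups compared in the study.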
Transplantation Reviews | 2013
Richard Haynes; Colin Baigent; Paul Harden; Martin J. Landray; Murat Akyol; Argiris Asderakis; Alex Baxter; Sunil Bhandari; Paramit Chowdhury; Marc Clancy; Jonathan Emberson; Paul Gibbs; Abdul Hammad; William G. Herrington; Kathy Jayne; Gareth Jones; N. Krishnan; Michael Lay; David Lewis; Iain C. Macdougall; Chidambaram Nathan; James Neuberger; C. Newstead; R. Pararajasingam; Carmelo Puliatti; Keith Rigg; Peter Rowe; Adnan Sharif; Neil S. Sheerin; Sanjay Sinha
Background Kidney transplantation is the best treatment for patients with end-stage renal failure, but uncertainty remains about the best immunosuppression strategy. Long-term graft survival has not improved substantially, and one possible explanation is calcineurin inhibitor (CNI) nephrotoxicity. CNI exposure could be minimized by using more potent induction therapy or alternative maintenance therapy to remove CNIs completely. However, the safety and efficacy of such strategies are unknown. Methods/Design The Campath, Calcineurin inhibitor reduction and Chronic allograft nephropathy (3C) Study is a multicentre, open-label, randomized controlled trial with 852 participants which is addressing two important questions in kidney transplantation. The first question is whether a Campath (alemtuzumab)-based induction therapy strategy is superior to basiliximab-based therapy, and the second is whether, from 6 months after transplantation, a sirolimus-based maintenance therapy strategy is superior to tacrolimus-based therapy. Recruitment is complete, and follow-up will continue for around 5 years post-transplant. The primary endpoint for the induction therapy comparison is biopsy-proven acute rejection by 6 months, and the primary endpoint for the maintenance therapy comparison is change in estimated glomerular filtration rate from baseline to 2 years after transplantation. The study is sponsored by the University of Oxford and endorsed by the British Transplantation Society, and 18 adult kidney transplant centres are participating. Discussion Late graft failure is a major issue for kidney transplant recipients. If our hypothesis is correct, that minimizing CNI exposure with Campath-based induction therapy and/or an elective conversion to sirolimus-based maintenance therapy can improve long-term graft function and survival, then patients should experience better graft function for longer. A positive outcome could change clinical practice in kidney transplantation. Trial registration ClinicalTrials.gov NCT01120028 and ISRCTN88894088.
Nephrology Dialysis Transplantation | 2011
Kathryn K. Stevens; Y. Mun Woo; Marc Clancy; John McClure; Jonathan G. Fox; Colin C. Geddes
BACKGROUND Increasing numbers of older patients are developing established renal failure and considering kidney transplant as a renal replacement therapy (RRT) option. The probability of older patients actually receiving a deceased donor kidney transplant is unclear, preventing informed choice about pursuing the option of transplantation. We sought to analyse our RRT population to determine the probability of receiving a deceased donor kidney transplant in patients commencing RRT, categorized by age, for whom there was no suitable living kidney donor. METHODS Patients commencing dialysis in our centre between 1992 and 2009 were identified. Time to listing on the deceased donor transplant waiting list and time to first deceased donor transplant were determined by Kaplan-Meier analysis for patients, categorized by age, with censoring at the date of first living donor kidney transplant, death or last dialysis. RESULTS One thousand five hundred and thirteen patients were categorized into groups by age in years [1: <35 (n = 134), 2: 35-49.9 (n = 207), 3: 50-64.9 (n = 415), 4: 65-74.9 (n = 438) and 5: ≥75 (n = 319)]. The probability of being listed for deceased donor transplant within 1 year of commencing RRT was 75, 54, 27, 4 and 0.8% in Groups 1-5, respectively. If listed, the probability of receiving a deceased donor transplant within 5 years of starting RRT was 81, 48, 26, 8 and 0% in Groups 1-5, respectively. In Groups 1-4, 93% (n = 63), 87% (n = 65), 76% (n = 45) and 100% (n = 7) of the patients, respectively, who received a deceased donor transplant were alive and off dialysis 1 year after transplant. The reason patients who were listed did not receive a transplant was usually death on the waiting list. CONCLUSIONS The likelihood of being listed for transplant falls with increasing age at the time of starting RRT.
Even for patients listed for transplant, the probability of older patients actually receiving a transplant is much lower than for younger patients, with only 8% of listed patients aged 65-74.9 years being transplanted within 5 years. This is partly the result of death on the waiting list but may also be related to organ allocation policies. Assessment for possible deceased donor transplantation involves a considerable investment of time and effort for the patient, as well as in health care resources, and a patient's decision whether to proceed with assessment should be informed by the kind of information we have produced. As there may be regional and national variations in practice, each centre should generate such data for use locally.
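The listing and transplant probabilities above come from Kaplan-Meier analysis with censoring; a minimal sketch of the estimator itself, on toy data rather than the study's dataset:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate. times: follow-up (e.g. days on RRT);
    events: True if the endpoint occurred (e.g. deceased donor transplant),
    False if censored (death, last dialysis, living donor transplant)."""
    data = sorted(zip(times, events))
    n = len(data)
    s = 1.0
    curve = []  # (time, survival probability after events at that time)
    idx = 0
    while idx < n:
        t = data[idx][0]
        same_time = [e for tt, e in data[idx:] if tt == t]
        d = sum(same_time)   # events at time t
        at_risk = n - idx    # subjects still under observation at t
        if d:
            s *= 1 - d / at_risk
            curve.append((t, s))
        idx += len(same_time)
    return curve

# Toy cohort of 5 patients: True = endpoint reached, False = censored
curve = kaplan_meier([2, 3, 3, 5, 8], [True, True, False, True, False])
print(curve)  # survival drops at times 2, 3 and 5
```

Censoring is what distinguishes this from a simple proportion: a patient lost to follow-up reduces the at-risk count without being counted as an event.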
PLOS ONE | 2016
Dagmara McGuinness; Johannes Leierer; Olivier Shapter; Suhaib Mohammed; Marc Gingell-Littlejohn; David Kingsmore; Ann-Margaret Little; Julia Kerschbaum; Stefan Schneeberger; Manuel Maglione; Silvio Nadalin; Sylvia Wagner; Alfred Königsrainer; Emma Aitken; H. Whalen; Marc Clancy; Alex McConnachie; Christian Koppelstaetter; Karen Stevenson; Paul G. Shiels
Introduction Delayed graft function (DGF) is a prevalent clinical problem in renal transplantation for which there is no objective system to predict occurrence in advance. It can result in a significant increase in the necessity for hospitalisation post-transplant and is a significant risk factor for other post-transplant complications. Methodology MicroRNAs (miRNAs), a specific subclass of small RNAs, have been clearly demonstrated to influence many pathways in health and disease. To investigate the influence of miRNAs on renal allograft performance post-transplant, the expression of a panel of miRNAs in pre-transplant renal biopsies was measured using qPCR. Expression was then related to clinical parameters and outcomes in two independent renal transplant cohorts. Results Here we demonstrate, in two independent cohorts of pre-implantation human renal allograft biopsies, that a novel pre-transplant renal performance scoring system (GRPSS) can determine the occurrence of DGF with high sensitivity (>90%) and specificity (>60%) for donor allografts pre-transplant, using just three senescence-associated microRNAs combined with donor age and type of organ donation. Conclusion These results demonstrate a relationship between pre-transplant microRNA expression levels, cellular biological ageing pathways and clinical outcomes for renal transplantation. They provide a simple, rapid, quantitative molecular pre-transplant assay to determine post-transplant allograft function, and scope for future intervention. Furthermore, these results demonstrate the involvement of senescence pathways in ischaemic injury during the organ transplantation process and indicate accelerated bio-ageing as a consequence of both warm and cold ischaemia.
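The GRPSS itself is not specified in the abstract, but the sensitivity and specificity figures it reports are defined in the usual way; an illustrative sketch with invented predictions and outcomes:

```python
def sensitivity_specificity(predicted, observed):
    """Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP).
    predicted/observed: booleans, True = delayed graft function."""
    tp = sum(p and o for p, o in zip(predicted, observed))
    fn = sum((not p) and o for p, o in zip(predicted, observed))
    tn = sum((not p) and (not o) for p, o in zip(predicted, observed))
    fp = sum(p and (not o) for p, o in zip(predicted, observed))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical: a score flags 5 of 10 grafts; 4 truly develop DGF
predicted = [True, True, True, False, True, False, False, False, True, False]
observed  = [True, True, True, False, False, False, True, False, False, False]
sens, spec = sensitivity_specificity(predicted, observed)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")
```

High sensitivity here means few DGF cases are missed, which matters most for a screening-style pre-transplant assay; specificity measures how many unaffected grafts are wrongly flagged.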
Journal of Vascular Access | 2014
Emma Aitken; Karen Stevenson; Marc Gingell-Littlejohn; Margaret Aitken; Marc Clancy; David Kingsmore
Purpose To evaluate reasons for tunneled central venous catheter (TCVC) usage in our prevalent hemodialysis population and assess the impact of a surgically aggressive approach to definitive access creation. Methods Clinical review of all patients in the West of Scotland dialyzing via a TCVC in November 2010 was performed. Reasons for TCVC usage and TCVC complications were evaluated. Over the subsequent year, aggressive intervention was undertaken to achieve definitive access in all suitable patients, and outcomes were re-evaluated a year later (November 2011). Results There was no significant difference in the proportion of patients dialyzing via a TCVC in 2010 compared to 2011 (30.3% (n=193) vs. 31.7% (n=201), respectively; p=0.56). All patients now have a “vascular access plan.” Of patients dialyzing via a TCVC in 2010, 37% had died by 2011, 22% remained on a long-term line, 20% had successful arteriovenous fistula (AVF) creation, 1% had an arteriovenous graft and 2% were transplanted; 10.4% developed complications of vascular access and required ligation of a functioning AVF. A further 6.5% died within 28 days of surgery. The incidence of culture-positive Staphylococcus aureus bacteremia was 1.6 per 1,000 catheter days. Conclusions Aggressive strategies of AVF creation resulted in one-fifth of patients on a long-term TCVC having successful creation of an AVF. This was offset by high failure and complication rates from AVF creation in this population. One-third of patients dialyzing via a TCVC died in the subsequent year. Correct patient selection for AVF creation is essential, and predialysis care must be optimized to avoid the need for TCVCs entirely.
Clinical Nephrology | 2013
Emma Aitken; Angus McLellan; J. Glen; Mick Serpell; Robert Mactier; Marc Clancy
INTRODUCTION The burden of pain from cannulation of arteriovenous fistulae (AVF) and its impact on quality of life are poorly described in the literature. METHODOLOGY A pain score questionnaire was administered to all patients in the West of Scotland dialyzing via AVF (n = 461). Pain was assessed using a visual analogue score (VAS) and the McGill pain score. Patients with severe pain (VAS > 5) were compared to those with minimal pain. RESULTS The questionnaire was completed by 97.5% of the patients. Median VAS on cannulation was 3 (IQR 0.5 - 4.5). Of those who completed the questionnaire, 24.4% had severe pain on cannulation and 3.2% experienced severe chronic pain. 53 patients (11.3%) cut a dialysis session short due to pain. Of the patients with severe chronic pain, 46.7% had a physical complication affecting their AVF (e.g., venous stenosis, pseudoaneurysm). Following treatment of the problem, pain improved in 71.4% and resolved completely in 14.3%. Brachiobasilic AVF was associated with a higher incidence of severe pain than either brachiocephalic or radiocephalic AVF (50%, 23.3% and 24.4%, respectively; p = 0.03). There was a trend towards more severe pain with rope-ladder cannulation (27.7%) compared to button-hole cannulation (18.2%); however, this difference did not reach statistical significance (p = 0.09). CONCLUSIONS Pain from AVF is a poorly recognized and under-reported problem. While severe pain resulting in the avoidance of dialysis is rare, it can lead to significant difficulties and ultimately to abandonment of the AVF. Pain is often suggestive of an underlying anatomical problem.
International Journal of Surgery | 2011
F. Hanif; A.N. Macrae; M.G. Littlejohn; Marc Clancy; E. Murio
AIMS This paper presents an e-survey of current clinical practice in the use of intra-operative diuretics during renal transplantation in the United Kingdom, together with a study comparing the outcome of renal transplants carried out with or without intra-operative diuretics in our centre. METHODS An e-mail questionnaire was sent to renal transplant surgeons exploring their practice of renal transplantation with or without intra-operative diuretics, the type of diuretic/s used and the relevant doses. An observational study is presented comparing the outcomes of renal transplant recipients who did not receive diuretics (group no-diuretics, GND, n = 80), transplanted from 2004 to 2008, with recipients who received intra-operative diuretics over a one-year period (group diuretics, GD, n = 69). Outcome measures were the incidence of delayed graft function and a comparison of graft survival in both groups. RESULTS Forty surgeons from 18 transplant centres answered, a response rate of 67%. Thirteen surgeons do not use diuretics; mannitol is used by 10/40, furosemide by 6/40, and 11 surgeons use a combination of both. In the comparative study there was no significant difference in one-year graft survival between GD and GND (65/69, 94% and 75/80, 94%, respectively; p = 0.08), and the incidence of delayed graft function was also comparable (16/69, 23% and 21/80, 26%, respectively; p = 0.07). The donor characteristics in both groups were comparable. CONCLUSION The study showed variation in clinical practice on the use of intra-operative diuretics in renal transplantation and did not demonstrate that the use of diuretics improves renal graft survival.