Thomas D. Johnston
University of Kentucky
Publications
Featured research published by Thomas D. Johnston.
American Journal of Transplantation | 2010
H. Tedesco Silva; Diane M. Cibrik; Thomas D. Johnston; E. Lackova; K. Mange; C. Panis; Rowan G. Walker; Z. Wang; Gazi B. Zibari; Yu Seun Kim
Everolimus allows calcineurin-inhibitor reduction without loss of efficacy and may improve renal-transplant outcomes. In a 24-month, open-label study, 833 de novo renal-transplant recipients were randomized to everolimus 1.5 or 3.0 mg/day (target troughs 3–8 and 6–12 ng/mL, respectively) with reduced-exposure cyclosporine A (CsA), or mycophenolic acid (MPA) 1.44 g/day plus standard-exposure CsA. Patients received basiliximab ± corticosteroids. The primary endpoint was composite efficacy failure (treated biopsy-proven acute rejection, graft loss, death or loss to follow-up) and the main safety endpoint was renal function (estimated glomerular filtration rate [eGFR], by Modification of Diet in Renal Disease [MDRD]) at Month 12 (last-observation-carried-forward analyses). Month 12 efficacy failure rates were noninferior in the everolimus 1.5 mg (25.3%) and 3.0 mg (21.9%) versus MPA (24.2%) groups. Mean eGFR at Month 12 was noninferior in the everolimus groups versus the MPA group (54.6 and 51.3 vs 52.2 mL/min/1.73 m² in the everolimus 1.5 mg, 3.0 mg and MPA groups, respectively; 95% confidence intervals for everolimus 1.5 mg and 3.0 mg vs MPA: −1.7, 6.4 and −5.0, 3.2, respectively). The overall incidence of adverse events was comparable between groups. The use of everolimus with progressive reduction in CsA exposure, up to 60% at 1 year, resulted in similar efficacy and renal function compared with standard-exposure CsA plus MPA.
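The main safety endpoint above is eGFR calculated with the MDRD equation, which the abstract cites but does not spell out. For reference, the commonly used 4-variable MDRD Study equation (assuming IDMS-traceable serum creatinine in mg/dL; the trial protocol may have specified a different creatinine calibration, for which the leading coefficient is 186 rather than 175) is:

eGFR (mL/min/1.73 m²) = 175 × (serum creatinine)^(−1.154) × (age)^(−0.203) × 0.742 (if female) × 1.212 (if Black)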
International Journal of Pharmaceutics | 2009
Changguo Chen; Thomas D. Johnston; Hoonbae Jeon; Roberto Gedaly; Patrick P. McHugh; Thomas G. Burke; Dinesh Ranjan
Curcumin is a multi-functional and pharmacologically safe natural agent. Used as a food additive for centuries, it also has anti-inflammatory, antiviral and anti-tumor properties. We previously found that it is a potent inhibitor of the cyclosporin A (CsA)-resistant T-cell co-stimulation pathway: it inhibits mitogen-stimulated lymphocyte proliferation, NFkappaB activation and IL-2 signaling. In spite of its safety and efficacy, the in vivo bioavailability of curcumin is poor, which may be a major obstacle to its utility as a therapeutic agent. Liposomes are known to be excellent carriers for drug delivery. In this in vitro study, we report the effects of different liposome formulations on curcumin stability in phosphate buffered saline (PBS), human blood, plasma and culture medium RPMI-1640 + 10% FBS (pH 7.4, 37 °C). Liposomal curcumin had higher stability than free curcumin in PBS; liposomal and free curcumin had similar stability in human blood, plasma and RPMI-1640 + 10% FBS. We also assessed the toxicity of drug-free liposomes, measured as ³H-thymidine incorporation by concanavalin A (Con A)-stimulated human lymphocytes, splenocytes and an Epstein-Barr virus (EBV)-transformed human B-cell lymphoblastoid cell line (LCL). We found that dimyristoylphosphatidylcholine (DMPC) and dimyristoylphosphatidylglycerol (DMPG) were toxic to the tested cells; however, addition of cholesterol to the lipids at DMPC:DMPG:cholesterol = 7:1:8 (molar ratio) almost completely eliminated the lipid toxicity to these cells. Liposomal curcumin had similar or even stronger inhibitory effects than free curcumin on Con A-stimulated human lymphocyte, splenocyte and LCL proliferation. We conclude that liposomal curcumin may be useful for intravenous administration to improve curcumin's bioavailability and efficacy, facilitating in vivo studies that could ultimately lead to clinical application.
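As a rough illustration of the DMPC:DMPG:cholesterol = 7:1:8 molar ratio described above, the short sketch below converts that ratio into weighed amounts for a hypothetical total lipid quantity; the molecular weights are approximate catalogue values and the 16 µmol batch size is an assumption for illustration, not a figure from the study.

# Hypothetical batch calculation for the DMPC:DMPG:cholesterol = 7:1:8 (molar) mixture.
# Molecular weights are approximate; the 16 µmol total batch size is illustrative only.
MW = {"DMPC": 677.9, "DMPG (Na salt)": 688.8, "cholesterol": 386.7}  # g/mol, approximate
ratio = {"DMPC": 7, "DMPG (Na salt)": 1, "cholesterol": 8}           # molar parts
total_umol = 16.0                                                    # assumed total lipid, µmol

parts = sum(ratio.values())
for lipid, r in ratio.items():
    umol = total_umol * r / parts        # µmol of this lipid
    mg = umol * MW[lipid] / 1000.0       # µmol x (g/mol) / 1000 = mg
    print(f"{lipid}: {umol:.2f} µmol ≈ {mg:.2f} mg")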
Clinical Transplantation | 2000
K. Sudhakar Reddy; Thomas D. Johnston; Lee Ann Putnam; Michael Isley; Dinesh Ranjan
Background. The piggyback technique (PT), with preservation of the cava, is being used more frequently in adult orthotopic liver transplantation (OLT). The advantages of PT include hemodynamic stability during the anhepatic phase without large-volume fluid infusion, obviating the need for veno-venous bypass (VVB). At our center, we changed our practice in July 1997 from the standard technique (ST) of OLT with routine use of VVB to PT with selective use of VVB. The purpose of the present study was to analyze the results with the two different practices, ST-routine VVB versus PT-selective VVB. Methods. Forty OLTs were performed during July 1995–July 1997 using ST-routine VVB (group I) and 36 during August 1997–December 1998 using PT-selective VVB (group II). The etiology of liver disease was similar in the two groups, with hepatitis C and alcoholic liver disease accounting for half of the patients in each group. The UNOS status, age, sex, and percentage of patients with previous upper abdominal surgery were also similar between the two groups. Results. In the PT-selective VVB era (group II), 34/36 patients (94%) underwent OLT with PT, and VVB was used for 8 (22%) patients. The decision to use VVB was elective for 3 patients (fulminant hepatic failure, 2; severe portal hypertension, 1) and urgent for 5 patients (hemodynamic instability during hepatectomy). The intraoperative use of packed red blood cells (PRBC) (mean±SD) was 15±12 units for group I and 9±8 units for group II (p=0.023). Anastomosis time and total operating time (mean±SD) were 91±30 min and 9.5±3.2 h, respectively, for group I patients compared with 52±28 min and 7.6±1.6 h, respectively, for group II patients (p<0.0001 and 0.002, respectively). Median post-operative stays in the intensive care unit (ICU) and in the hospital were 5 and 17 d, respectively, for group I and 4 and 11 d, respectively, for group II (p=NS). Mean serum creatinine on day 3 was similar in the two groups. Median hospital charges for group I patients were $105 439, compared with $91 779 for group II patients (p=NS). The 1-year actuarial graft and patient survival rates were 78% and 82%, respectively, for group I, and 92% and 95%, respectively, for group II. Conclusions. PT is safe and can be performed in the majority of adult patients (>90%) undergoing OLT. With the routine application of the piggyback procedure, the use of VVB has been reduced to 20% of OLTs at our center. The practice of the piggyback technique with selective use of VVB is associated with a shorter anhepatic phase and total operating time, lower blood product use, a trend towards shorter hospital length of stay, and reduced hospital charges compared with the standard technique of OLT with routine use of VVB.
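The between-group comparisons above are reported only as mean ± SD; when raw data are not available, a two-sample comparison can be approximated from those summary statistics. The sketch below does this for the reported PRBC use (15 ± 12 units, n = 40, vs 9 ± 8 units, n = 36). The abstract does not state which test the authors used, so the p-value from this sketch is illustrative and need not reproduce the published p = 0.023.

# Approximate two-sample comparison of intraoperative PRBC use from the summary
# statistics quoted above (the study's actual statistical test is not specified).
from scipy import stats

t_stat, p_value = stats.ttest_ind_from_stats(
    mean1=15, std1=12, nobs1=40,   # group I: ST with routine VVB
    mean2=9,  std2=8,  nobs2=36,   # group II: PT with selective VVB
    equal_var=False,               # Welch's t-test (no equal-variance assumption)
)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")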
Transplantation | 2008
Roberto Gedaly; Patrick P. McHugh; Thomas D. Johnston; Hoonbae Jeon; Alvaro Koch; Timothy M. Clifford; Dinesh Ranjan
Background. Alcoholic liver disease (ALD) is a common indication for transplantation worldwide. This study identifies factors predicting posttransplant recidivism. Methods. Clinical and laboratory data were reviewed. Uni- and multivariate analyses for survival and for relapse to alcohol and illicit drugs were performed. Results. Between July 1995 and November 2007, 387 patients underwent liver transplantation at our institution. Of these, 147 patients (38%) were found to have ALD. Five patients (3.4%) were excluded because of perioperative mortality. Overall survival was 96.2%, 89.6%, and 84.4% at 1, 3, and 5 years, respectively, with a median follow-up of 41.2 months. Twenty-seven patients (19%) returned to alcohol after transplantation. By univariate analysis, depression was the only significant factor affecting survival (P=0.01), whereas posttransplant relapse to alcohol trended toward significance (P=0.059). Multivariate analysis showed both factors to be independently associated with poor survival (P=0.008 and 0.017, respectively). Factors associated with relapse included less than 12 months of abstinence before transplant (P=0.019) and participation in rehabilitation (P=0.026). Multivariate analysis showed pretransplant abstinence of less than 12 months as the only independent factor (P=0.037) associated with alcohol relapse after transplantation. Twenty-five patients (17.2%) had documented drug use after transplantation. Drug abuse before transplantation was the only independent predictor of drug abuse after transplantation (P=0.017). Conclusions. Excellent results can be obtained in patients undergoing liver transplantation for ALD, though depression and recidivism adversely affect survival. In our series, abstinence of less than 12 months was associated with relapse to alcohol. Similarly, patients with prior drug abuse are more likely to continue drug use after transplantation.
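The 1-, 3- and 5-year figures above are actuarial (Kaplan–Meier) survival estimates. As a reminder of how such estimates are formed, the sketch below computes a product-limit survival curve from hypothetical follow-up times and death indicators; it is illustrative only and is not the study's analysis code.

import numpy as np

def kaplan_meier(months, died):
    """Product-limit (Kaplan-Meier) survival estimate.
    `died` is 1 for a death, 0 for a censored (alive at last follow-up) patient."""
    months, died = np.asarray(months, float), np.asarray(died, int)
    order = np.argsort(months)
    t, e = months[order], died[order]
    at_risk, surv, curve = len(t), 1.0, []
    for i in range(len(t)):
        if e[i] == 1:                      # the curve steps down only at death times
            surv *= 1.0 - 1.0 / at_risk
        curve.append((t[i], surv))
        at_risk -= 1                       # both deaths and censorings leave the risk set
    return curve

# Hypothetical follow-up data (months since transplant, death indicator).
months = [3, 8, 12, 20, 24, 30, 41, 47, 55, 60]
died   = [0, 1,  0,  0,  1,  0,  0,  1,  0,  0]
for t, s in kaplan_meier(months, died):
    print(f"{t:5.0f} mo   S(t) = {s:.3f}")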
Clinical Transplantation | 2000
Thomas D. Johnston; Robert Gates; K. Sudhakar Reddy; Nicholas Nickl; Dinesh Ranjan
The biliary anastomosis has been called ‘the Achilles heel’ of liver transplantation (Rabkin JM, Orloff SL, Reed MH. Transplantation 1998: 65 [2]: 193; Davidson BR, Rai R, Kurzawinski TR. Br J Surg 1999: 86 [4]: 447). Biliary complications after liver transplantation reportedly occur at an incidence of 20–30%, with 10–15% presenting as bile leaks. The management of bile leaks, especially early bile leaks, is controversial. In the present study, we report our experience with the management of bile leaks after liver transplantation. In this retrospective study, we reviewed 85 liver transplants over a 3-yr period. In 79, the biliary anastomosis was choledochocholedochostomy (CDCD) over a small-caliber T-tube, while choledochojejunostomy (CDJ) was used in 7. Over a mean follow-up period of 13.5 months (median 10 months), 10 patients (12%) experienced a clinically significant bile leak within the first 3 months after liver transplantation. The early leaks, occurring within 1 month of transplant, were successfully managed by observation (Davidson BR, Rai R, Kurzawinski TR. Br J Surg 1999: 86 [4]: 447) or by endoscopic retrograde cholangiopancreatography (ERCP) and the placement of a biliary stent for 6–12 wk (Randall HB, Wachs ME, Somberg KA. Transplantation 1996: 61 [2]: 258). One of these resulted from accidental dislodgement of the T-tube on postoperative day 1; one resulted from necrosis at the CDCD anastomosis and required CDJ; the remaining four resulted from leaks along the T-tube track. One of the late leaks occurred following the planned removal of the T-tube at 3 months after liver transplantation; the other two were leaks along the T-tube track. All were successfully treated by ERCP and stent placement, though in one case ERCP was initially unsuccessful because of the inability to advance a guidewire, necessitating fluoroscopically aided guidewire placement during a mini-laparotomy; ERCP was then successfully performed with placement of a stent. Conclusions: Our experience indicates that most bile leaks after liver transplantation, including early leaks, can be successfully managed nonoperatively. Most will require intervention, but ERCP and stent placement are usually sufficient.
Annals of Surgery | 2009
Roberto Gedaly; Patrick P. McHugh; Thomas D. Johnston; Hoonbae Jeon; Dinesh Ranjan; Daniel L. Davenport
Objective: To investigate independent contributions of obesity, diabetes, and smoking to resource utilization in patients following liver resection. Summary Background Data: Despite being highly resource-intensive, liver resections are performed with increasing frequency. This study evaluates how potentially modifiable factors affect measures of resource utilization after hepatectomy. Methods: The American College of Surgeons’ National Surgical Quality Improvement Program (ACS NSQIP) public-use database was queried for patients undergoing liver resection. Resource variables were operative time (OT), intraoperative transfusion, length of stay (LOS), ventilator support at 48 hours, and reoperation. Bivariable and multivariable linear and logistic regressions were performed. Results: There were 1029 patients identified. Most resections involved less than a hemiliver (599 patients, 58.2%). Mean BMI was 28.0 ± 6.0 kg/m². Mean OT was 253 ± 122 minutes (range, 27 to 794) but varied by procedure (P < 0.001). Mean LOS was 8.7 ± 10.7 days (range, 0 to 202). Morbid obesity added 48 minutes to OT (P = 0.018), 1.1 units to transfusions (P = 0.049), and 2.2 days to LOS (P < 0.001), and was associated with delayed ventilator weaning (odds ratio, 4.5; P = 0.022). Underweight patients had shorter OT but stayed 3.3 days longer than normal-weight patients (P < 0.001). Insulin-treated patients with diabetes had longer OT (P < 0.001), increased transfusions (P < 0.001), and delayed ventilator weaning (odds ratio, 6.7; P < 0.001), while orally treated patients with diabetes showed opposite trends. Smokers stayed 1.9 days longer (P < 0.001), with increased risk of prolonged ventilation (odds ratio, 3.3; P = 0.002) and reoperation (odds ratio, 2.3; P = 0.015). Conclusion: Obesity, diabetes, and smoking are each associated with important components of healthcare expenditure. Education and prevention programs are needed to limit their impact on overall resource utilization.
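The multivariable models above relate patient factors to binary resource outcomes (for example, ventilator support at 48 hours) and report odds ratios. A minimal sketch of that style of analysis is given below using statsmodels on a synthetic data frame; the column names (vent_48h, bmi_class, insulin_dm, smoker) are made up for illustration and do not correspond to the actual NSQIP variable names or the authors' code.

# Sketch of a multivariable logistic regression for a binary resource-utilization
# outcome, in the spirit of the analysis described above. Data and column names
# are synthetic/hypothetical; coefficients are exponentiated to give odds ratios.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1029
df = pd.DataFrame({
    "vent_48h":   rng.integers(0, 2, n),                         # ventilator support at 48 h (0/1)
    "bmi_class":  rng.choice(["normal", "obese", "morbid"], n),  # simplified BMI categories
    "insulin_dm": rng.integers(0, 2, n),                         # insulin-treated diabetes (0/1)
    "smoker":     rng.integers(0, 2, n),                         # current smoker (0/1)
})

fit = smf.logit("vent_48h ~ C(bmi_class) + insulin_dm + smoker", data=df).fit(disp=False)
print(np.exp(fit.params))   # odds ratios for each predictor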
Clinical Transplantation | 2004
Thomas D. Johnston; Leroy R. Thacker; Hoonbae Jeon; Bruce A. Lucas; Dinesh Ranjan
The United Network for Organ Sharing (UNOS), working in conjunction with organ procurement organizations and transplant programmes, has recently defined a class of cadaver kidney grafts for special allocation procedures to enhance utilization of those organs. The criteria defining these expanded-criteria donor (ECD) kidneys are donor age ≥ 60 yr, or donor age between 50 and 59 yr plus two of the following characteristics: donor history of cerebrovascular accident (CVA), donor history of hypertension, and elevated creatinine (>1.5 mg/dL) at any time during donor management. Kidney grafts from ECD donors carry an increased relative risk of non-function compared to other cadaver kidney grafts. The goal of the special allocation procedure is to reduce the time associated with placement by matching ECD grafts with patients previously designated as being willing to accept them. In assessing the potential impact of these allocation procedures, the sensitivity of ECD grafts to cold ischaemia time (CIT) is of particular significance. Specifically, we questioned whether minimization of CIT might reduce the relative risk of poor graft function, justifying reduction of the geographical range of placement and thereby reducing the time the grafts spend in transit.
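The allocation criteria quoted above amount to a simple classification rule. The sketch below encodes that rule as stated (taking the >1.5 creatinine threshold to be in mg/dL); the function and argument names are ours for illustration, not UNOS terminology.

def is_expanded_criteria_donor(age, history_of_cva=False, history_of_htn=False,
                               peak_creatinine_mg_dl=1.0):
    """ECD rule as described above: donor age >= 60 yr, or age 50-59 yr with at
    least two of: history of CVA, history of hypertension, creatinine > 1.5 mg/dL."""
    if age >= 60:
        return True
    if 50 <= age <= 59:
        factors = sum([history_of_cva, history_of_htn, peak_creatinine_mg_dl > 1.5])
        return factors >= 2
    return False

# Examples
print(is_expanded_criteria_donor(63))                                                   # True
print(is_expanded_criteria_donor(55, history_of_htn=True, peak_creatinine_mg_dl=1.8))   # True
print(is_expanded_criteria_donor(55, history_of_htn=True))                              # False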
American Journal of Transplantation | 2003
K. Sudhakar Reddy; Darcy Davies; Debra Ormond; Sony Tuteja; Bruce A. Lucas; Thomas D. Johnston; Thomas Waid; John McKeown; Dinesh Ranjan
Although it is well established that acute rejection is one of the major risk factors for chronic graft loss following kidney transplantation, its effect on long-term graft survival following simultaneous kidney-pancreas transplants (SKPTs) is less well known. We analyzed a large cohort of SKPTs and cadaver kidney transplants reported to the United Network for Organ Sharing database during 1988–97 to determine the impact of acute rejection episodes on long-term kidney and pancreas graft survival. Only patients whose kidney and pancreas grafts had survived for at least 1 year were included; other potential risk factors influencing long-term graft survival were included in the analysis. Of the 4251 SKPTs, 45% had no acute rejection, 36% had kidney-only rejection, 3% had pancreas-only rejection, and 16% had both kidney and pancreas rejection within the first year post-transplant. The 5-year kidney and pancreas graft survival rates, adjusted for other risk factors, were 91% and 85%, respectively, for those with no acute rejection episodes; 88% and 84%, respectively, for those with kidney-only rejection; 94% and 83%, respectively, for those with pancreas-only rejection; and 86% and 78%, respectively, for those with both kidney and pancreas rejection. The relative risk (RR) of kidney graft failure was 1.32 when acute rejection involved the kidney graft only, while the RR was 1.53 when the rejection involved both organs. We conclude that acute rejection episodes have a negative impact on long-term kidney graft survival in the SKPT population similar to that in the cadaver kidney transplant population. Patients who had acute rejection episodes of both kidney and pancreas have the worst long-term graft survival.
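For context, the relative risks quoted above compare the probability of kidney graft failure in a given rejection group with that in the no-rejection reference group; in its simplest unadjusted form, RR = P(graft failure | rejection in that organ combination) / P(graft failure | no acute rejection). The published values (1.32 and 1.53) are adjusted for the other risk factors in the registry model, so they cannot be recovered directly from the 5-year survival percentages alone.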
Clinical Transplantation | 2001
Thomas D. Johnston; Kunam S. Reddy; Michael J. Mastrangelo; Bruce A. Lucas; Dinesh Ranjan
Transplant International | 2008
Roberto Gedaly; Timothy M. Clifford; Patrick P. McHugh; Hoonbae Jeon; Thomas D. Johnston; Dinesh Ranjan