Warren R. Maley
Thomas Jefferson University
Publications
Featured research published by Warren R. Maley.
Transplantation | 2000
Robert A. Montgomery; Andrea A. Zachary; Lorraine C. Racusen; Mary S. Leffell; Karen E. King; James F. Burdick; Warren R. Maley; Lloyd E. Ratner
Background. Hyperacute rejection (HAR) and acute humoral rejection (AHR) remain recalcitrant conditions without effective treatments, and usually result in graft loss. Plasmapheresis (PP) has been shown to remove HLA-specific antibody (Ab) in many different clinical settings. Intravenous gamma globulin (IVIG) has been used to suppress alloantibody and modulate immune responses. Our hypothesis was that a combination of PP and IVIG could effectively and durably remove donor-specific anti-HLA Ab, rescuing patients with established AHR and preemptively desensitizing recipients who had positive cross-matches with a potential live donor. Methods. The study patients consisted of seven live donor kidney transplant recipients who experienced AHR and had donor-specific Ab (DSA) for one or more mismatched donor HLA antigens. The patients segregated into two groups: three patients were treated for established AHR (rescue group) and four cross-match-positive patients received therapy before transplantation (preemptive group). Results. Using PP/IVIG we have successfully reversed established AHR in three patients. Four patients who were cross-match-positive (3 by flow cytometry and 1 by cytotoxic assay) and had DSA before treatment underwent successful renal transplantation utilizing their live donor. The overall mean creatinine for both treatment groups is 1.4±0.8 with a mean follow-up of 58±40 weeks (range 17–116 weeks). Conclusions. In this study, we present seven patients for whom the combined therapies of PP/IVIG were successful in reversing AHR mediated by Ab specific for donor HLA antigens. Furthermore, this protocol shows promise for eliminating DSA preemptively among patients with low-titer positive antihuman globulin-enhanced, complement-dependent cytotoxicity (AHG-CDC) cross-matches, allowing the successful transplantation of these patients using a live donor without any cases of HAR.
The Lancet | 2006
Robert A. Montgomery; Sommer E. Gentry; William H. Marks; Daniel S. Warren; Janet Hiller; Julie A. Houp; Andrea A. Zachary; J. Keith Melancon; Warren R. Maley; Hamid Rabb; Christopher E. Simpkins; Dorry L. Segev
Current models for allocation of kidneys from living non-directed donors
Living non-directed (LND) donors, also known as altruistic, good Samaritan, anonymous, or benevolent community donors, are a new and rapidly growing source of solid organs for transplantation. The willingness of individuals to donate organs without a designated recipient has been unexpected, but has probably developed as a societal response to the growing crisis in organ availability. In the context of this shortage, health professionals have attempted to make the best use of kidneys from LND donors. We present a novel application of paired donation that has the potential to multiply the number of recipients who can benefit from each LND donation. At present, there is no universally accepted system for allocation of organs from LND donors. Selection of recipients has been at the discretion of the transplant centres where LND donors have presented and has generally been guided by one of three models: donor-centric, recipient-centric, or sociocentric allocation. Each of these models is supported by valid ethical arguments. The main goal of donor-centric allocation is to ensure a successful outcome for the recipient. A good outcome provides justification for medical professionals to assist a person who is not ill to put themselves in harm's way to aid another. A positive result also gives an LND donor a sense that their effort was fruitful and worthwhile. However, this model dictates allocation to the healthiest patient on the transplant waiting list. These recipients are the most likely to have good outcomes on dialysis or with organs from deceased donors, and therefore are arguably the least in need. Recipient-centric allocation is based on the belief that society has a responsibility to protect its most vulnerable and disadvantaged members.
Under this model, organs from LND donors are given to those patients in the greatest need, those for whom a kidney transplant might be truly life saving, or those disadvantaged under the existing system for allocation of kidneys from deceased donors. This model mainly benefits children, patients who have no vascular access, highly sensitised patients, and those with life-threatening medical illnesses related to dialysis. However, because the recipient-centric model accords priority to such patients, it tends to yield unacceptably poor transplant outcomes, and could lead to a negative public perception of LND donation. Under the third model, of sociocentric allocation, the LND donated organ is treated as a public resource that should be allocated in the fairest and most equitable way, irrespective of outcome or need. This rationale dictates that the recipient should be the patient at the top of the transplant waiting list administered by the United Network for Organ Sharing (UNOS). UNOS oversees the allocation of deceased donor organs in the USA, using a so-called match run algorithm that ranks potential recipients according to agreed criteria. The limitations of this model are that a patient at the top of the list will probably receive a kidney from a deceased donor in the near future, and that they will have already incurred the costs, and exposure to comorbidity, that result from a long period on dialysis. The waiting list for deceased donor kidneys can be circumvented by patients who find a willing live donor. But direct donation might be complicated by differences in blood type and by HLA sensitivity. Some incompatible donor-recipient pairs enter into programmes that facilitate paired donation, also known as kidney paired donation. A donor and recipient who have incompatible blood groups or HLA sensitivity can be matched with another incompatible pair, to result in two compatible transplants (figure).
Although there are many ways to match up a pool of incompatible pairs, the mathematical technique of optimisation helps to find out which matches will yield the best results. Nevertheless, even in paired-donation programmes in which mathematical optimisation is applied, more than 50% of the incompatible pairs in the pool remain unmatched. In many cases, pools of incompatible donor-recipient pairs have a high proportion of patients with blood types that are hard to match and those with HLA sensitisation.
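The two-way exchange described above can be framed as a maximum-matching problem over the pool of incompatible pairs. The sketch below is illustrative only: the pool is hypothetical, only ABO compatibility is modelled (HLA sensitisation is ignored), and a small brute-force search stands in for the optimisation techniques real programmes use.

```python
from itertools import combinations

# ABO compatibility: donor blood type -> recipient types that can accept it
ABO_OK = {"O": {"O", "A", "B", "AB"}, "A": {"A", "AB"},
          "B": {"B", "AB"}, "AB": {"AB"}}

def compatible(donor, recipient):
    return recipient in ABO_OK[donor]

def max_two_way_swaps(pairs):
    """pairs[i] = (donor_type, recipient_type) for an incompatible pair.
    Brute-force the largest set of disjoint two-way exchanges."""
    # An exchange between pairs i and j works when each donor suits
    # the other pair's recipient.
    edges = [(i, j) for i, j in combinations(range(len(pairs)), 2)
             if compatible(pairs[i][0], pairs[j][1])
             and compatible(pairs[j][0], pairs[i][1])]
    best = []

    def search(start, used, chosen):
        nonlocal best
        if len(chosen) > len(best):
            best = list(chosen)
        for k in range(start, len(edges)):
            i, j = edges[k]
            if i not in used and j not in used:
                search(k + 1, used | {i, j}, chosen + [(i, j)])

    search(0, set(), [])
    return best

# Hypothetical pool: four swappable ABO-incompatible pairs plus one A-to-O pair
pool = [("A", "B"), ("B", "A"), ("A", "B"), ("B", "A"), ("A", "O")]
print(max_two_way_swaps(pool))  # → [(0, 1), (2, 3)]
```

Pairs 0–3 resolve into two exchanges, while the A-to-O pair stays unmatched, mirroring the point above that a substantial fraction of the pool can remain without a match.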
Annals of Surgery | 2007
Timothy M. Pawlik; Ana L. Gleisner; Robert A. Anders; Lia Assumpcao; Warren R. Maley; Michael A. Choti
Objective: To examine the diagnostic agreement of preoperative needle core biopsy (NCB) grading of hepatocellular carcinoma (HCC) compared with the final surgical pathologic tumor grade. Summary Background Data: Some centers have adopted protocols for selecting patients with HCC for transplantation based on tumor grade as determined by preoperative NCB. The validity of NCB to predict final tumor grade has not been previously assessed. Methods: A total of 211 patients who underwent hepatic resection, open radiofrequency ablation, or transplantation for HCC between 1998 and 2004 were identified. Clinicopathologic, NCB, and surgical data were collected and analyzed using χ2 and κ statistics. Results: A total of 120 (67.4%) of the 178 patients who underwent resection or transplantation had an NCB. On preoperative NCB, the majority of HCC cases were classified as well-differentiated (n = 35; 37.6%) or moderately differentiated (n = 44; 47.3%), while 14 (15.1%) cases were categorized as poorly differentiated. In contrast, when tumor grading was based on the final surgical specimen, there was a significantly higher proportion of HCC cases graded as poorly differentiated (well-differentiated, n = 34; 36.6%; moderately differentiated, n = 33; 35.5%; poorly differentiated, n = 26; 27.9%) (P < 0.05). The overall percent agreement of NCB and surgical pathology to determine tumor grade was poor (κ = 0.18, P < 0.0001). Whereas final pathologic tumor grade predicted the presence of microscopic vascular invasion (well, 15.7%; moderate, 31.9%; poor, 58.4%; P = 0.001), NCB grade did not (well, 23.7%; moderate, 28.0%; poor, 25.4%; P = 0.65). Conclusions: Selection of candidates for transplantation based on NCB tumor grade may be misleading, as NCB tumor grade often did not correlate with grade or presence of microscopic vascular invasion on final pathology. Clinicomorphologic criteria (tumor size, number) should remain the major determinants of eligibility for transplantation.
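The agreement statistic reported above is Cohen's kappa, computed from a grade-by-grade cross-tabulation of NCB versus final pathology. A minimal sketch follows; the 3×3 counts are invented for illustration, not the study's actual data.

```python
def cohens_kappa(table):
    """Cohen's kappa from a square cross-tabulation:
    table[i][j] = cases graded i on one method and j on the other."""
    n = sum(sum(row) for row in table)
    # Observed agreement: fraction of cases on the diagonal
    p_obs = sum(table[i][i] for i in range(len(table))) / n
    # Expected agreement under independent marginal distributions
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    p_exp = sum(r * c for r, c in zip(row_tot, col_tot)) / (n * n)
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical counts (well / moderate / poor), NOT the study's table
table = [[20, 12, 3],
         [10, 25, 9],
         [5, 7, 2]]
print(round(cohens_kappa(table), 2))  # → 0.19
```

Kappa near 0 means agreement is barely better than chance, which is why the paper's κ = 0.18 is characterized as poor despite a moderate raw percent agreement.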
Liver Transplantation | 2007
Anurag Maheshwari; Warren R. Maley; Zhiping Li; Paul J. Thuluvath
Biliary complications after liver transplantation (LT) using organs retrieved from donors after cardiac death are not well characterized. The aim of this study was to evaluate the severity of biliary complications and outcomes after donation after cardiac death liver transplantation (DCD-LT). A retrospective evaluation of 20 DCD-LTs from 1997–2006 was performed. The recipient age was 53 ± 8.7 years, and the donor age was 35 ± 11 years. The warm ischemia time, cold ischemia time, peak alanine aminotransferase level, and peak aspartate aminotransferase level were 33 ± 12 minutes, 8.7 ± 2.7 hours, 1757 ± 1477 U/L, and 4020 ± 3693 U/L, respectively. The bilirubin and alkaline phosphatase levels at hospital discharge after LT were 3.2 ± 5.4 mg/dL and 248 ± 200 U/L, respectively. During a median follow-up of 7.5 months (range: 1–73), 5 patients (25%; 1 death after re-LT) died (3 from sepsis, 1 from recurrent hepatocellular carcinoma at 4 months, and 1 from a cardiac event at 46 months), and additionally, 4 patients (20%) required re-LT (1 because of hepatic artery thrombosis, 1 because of primary graft nonfunction, and 2 because of biliary strictures). Twelve patients (60%) developed biliary complications, and of these, 11 (55%) had serious biliary complications. The biliary complications were as follows: a major bile leak for 2 patients (10%; both eventually underwent retransplantation), anastomotic strictures for 5 patients (25%), hilar strictures for 7 patients (35%), extrahepatic donor duct strictures for 9 patients (45%), intrahepatic strictures for 10 patients (50%), stones for 1 patient (5%), casts for 7 patients (35%), and debris for 2 patients (10%). More than 1 biliary complication was seen in most patients, and these were unpredictable and required multiple diagnostic or therapeutic procedures. Serious biliary complications are common after DCD-LT, and research should focus on identifying donor and recipient factors that predict and prevent serious biliary complications.
Liver Transpl 13:1645–1653, 2007.
Liver Transplantation | 2008
Dorry L. Segev; Stephen M. Sozio; Eun Ji Shin; Susanna M. Nazarian; Hari Nathan; Paul J. Thuluvath; Robert A. Montgomery; Andrew M. Cameron; Warren R. Maley
Steroid use after liver transplantation (LT) has been associated with diabetes, hypertension, hyperlipidemia, obesity, and hepatitis C (HCV) recurrence. We performed meta‐analysis and meta‐regression of 30 publications representing 19 randomized trials that compared steroid‐free with steroid‐based immunosuppression (IS). There were no differences in death, graft loss, and infection. Steroid‐free recipients demonstrated a trend toward reduced hypertension [relative risk (RR) 0.84, P = 0.08], and statistically significant decreases in cholesterol (standard mean difference −0.41, P < 0.001) and cytomegalovirus (RR 0.52, P = 0.001). In studies where steroids were replaced by another IS agent, the risks of diabetes (RR 0.29, P < 0.001), rejection (RR 0.68, P = 0.03), and severe rejection (RR 0.37, P = 0.001) were markedly lower in steroid‐free arms. In studies in which steroids were not replaced, rejection rates were higher in steroid‐free arms (RR 1.31, P = 0.02) and reduction of diabetes was attenuated (RR 0.74, P = 0.2). HCV recurrence was lower with steroid avoidance and, although no individual trial reached statistical significance, meta‐analysis demonstrated this important effect (RR 0.90, P = 0.03). However, we emphasize the heterogeneity of trials performed to date and, as such, do not recommend basing clinical guidelines on our conclusions. We believe that a large, multicenter trial will better define the role of steroid‐free regimens in LT. Liver Transpl 14:512–525, 2008.
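The pooled relative risks above come from meta-analysis across the 19 trials; the abstract does not specify the pooling model, so the following is only a sketch of the standard fixed-effect (inverse-variance) pooling of log relative risks, with hypothetical trial counts rather than the review's data.

```python
import math

def pooled_rr(studies):
    """Fixed-effect (inverse-variance) pooled relative risk.
    Each study: (events_tx, n_tx, events_ctrl, n_ctrl)."""
    w_sum = wlog_sum = 0.0
    for e1, n1, e0, n0 in studies:
        log_rr = math.log((e1 / n1) / (e0 / n0))
        # Approximate variance of the log relative risk
        var = 1 / e1 - 1 / n1 + 1 / e0 - 1 / n0
        w = 1 / var                 # inverse-variance weight
        w_sum += w
        wlog_sum += w * log_rr
    return math.exp(wlog_sum / w_sum)

# Hypothetical counts (steroid-free arm vs steroid-based arm)
studies = [(8, 50, 12, 50), (5, 40, 9, 42), (11, 60, 13, 58)]
print(round(pooled_rr(studies), 2))
```

A pooled RR below 1 with a significant P value is how an effect like the HCV-recurrence finding (RR 0.90, P = 0.03) can emerge from trials that are individually non-significant.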
American Journal of Transplantation | 2005
Dorry L. Segev; Christopher E. Simpkins; Daniel S. Warren; K King; R. Sue Shirey; Warren R. Maley; J. Keith Melancon; Matthew Cooper; Tomasz Kozlowski; Robert A. Montgomery
Most successful protocols for renal transplantation across ABO incompatible (ABOi) barriers have utilized splenectomy as part of the pre‐conditioning process. We recently described successful ABOi transplantation using anti‐CD20 monoclonal antibody in lieu of splenectomy. In the current study, we hypothesized that plasmapheresis (PP) and low dose CMV hyper‐immunoglobulin (CMVIg) alone would be sufficient to achieve successful engraftment of ABOi kidneys. We describe four blood type incompatible patients who received live donor renal transplants from A1 (two patients), A2 (one patient), and B (one patient) donors. All patients started with antihuman globulin (AHG) phase titers of 64 or higher and were pre‐conditioned with PP/CMVIg but not splenectomy or anti‐CD20. All 4 patients underwent successful transplantation and have a mean current serum creatinine of 1.1 (range: 0.9–1.2). There were no episodes of antibody mediated rejection. Rapid allograft accommodation may limit the need for long‐term antibody suppression provided by splenectomy or anti‐CD20, thereby eliminating the added infectious risk of these modalities and removing another disincentive to ABOi transplantation.
Transplantation | 1998
Lloyd E. Ratner; Robert A. Montgomery; Warren R. Maley; Cynthia Cohen; James F. Burdick; Kenneth D. Chavin; Dilip S. Kittur; Paul M. Colombani; Andrew S. Klein; Edward S. Kraus; Louis R. Kavoussi
BACKGROUND Laparoscopic live donor nephrectomy offers advantages to the donor in terms of decreased pain and shorter recuperation. Heretofore no detailed analysis of the recipient of laparoscopically procured kidneys has been performed. The purpose of this study was to determine whether laparoscopic donor nephrectomy had any deleterious effect on the recipient. METHODS A retrospective review was conducted of all live donor renal transplantations performed from January 1995 through April 1998. The control group received kidneys procured via a standard flank approach (Open). Rejection was diagnosed histologically. Creatinine clearance was calculated using the Cockcroft-Gault formula. RESULTS A total of 110 patients received kidneys from laparoscopic (Lap) and 48 from open donors. One-year recipient (100% vs. 97.0%) and graft (93.5% vs. 91.1%) survival rates were similar for the Open and Lap groups, respectively. A similar incidence of vascular thrombosis (3.4% vs. 2.1%, P=NS) and ureteral complications (9.1% vs. 6.3%, P=NS) was seen in the Lap and Open groups, respectively. The incidence of acute rejection for the first month was 30.1% for the Lap group and 31.9% for the Open group (P=NS). The rate of decline of serum creatinine level in the early posttransplantation period was initially greater in the Open group, but by postoperative day 4 no significant difference existed. No difference was observed in long-term allograft function. The median length of hospital stay was 7.0 days for both groups. CONCLUSIONS Laparoscopic live donor nephrectomy does not adversely affect recipient outcome. The previously demonstrated benefits to the donor, and the increased willingness of individuals to undergo live kidney donation, coupled with the acceptable outcomes experienced by recipients of laparoscopically procured kidneys, justify the continued development and adoption of this operation.
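The Cockcroft-Gault formula named in the Methods estimates creatinine clearance as (140 − age) × weight / (72 × serum creatinine), with a 0.85 multiplier for women. A minimal sketch:

```python
def cockcroft_gault(age_years, weight_kg, serum_cr_mg_dl, female=False):
    """Estimated creatinine clearance (mL/min) by the Cockcroft-Gault formula."""
    crcl = (140 - age_years) * weight_kg / (72 * serum_cr_mg_dl)
    return 0.85 * crcl if female else crcl

print(cockcroft_gault(40, 72, 1.0))               # → 100.0
print(cockcroft_gault(40, 72, 1.0, female=True))
```

Note the formula uses actual body weight and serum creatinine in mg/dL; the example patient values are arbitrary.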
Transplantation | 2006
Kwang Woong Lee; Christopher E. Simpkins; Robert A. Montgomery; Jayme E. Locke; Dorry L. Segev; Warren R. Maley
Background. Liver transplantation from donation after cardiac death (DCD) donors is an increasingly common approach for expansion of the donor organ supply. However, transplantation with DCD livers results in inferior graft survival. In this study, we examined donor and recipient characteristics that are associated with poor allograft outcomes and present a set of criteria that permit allograft survival that is comparable to that of donation after brain death (DBD) grafts in both low- and high-risk recipients. Methods. The United Network for Organ Sharing/Organ Procurement and Transplantation Network Liver Transplantation Registry between January 1996 and March 2006 was investigated. Adult DCD liver transplants (n=874) were included. Results. A DCD risk index was developed using the statistically significant factors from a multivariate Cox model: history of previous transplantation, life support status at transplantation, donor age, donor warm ischemia time (DWIT), and cold ischemia time (CIT). Favorable DCD donor criteria were donor age ≤45 years, DWIT ≤15 min, and CIT ≤10 hr. Four risk groups were developed based upon index scores that showed different graft survival. Graft survival of the favorable DCD group (84.9% at 1 year, 75.2% at 3 years, and 69.4% at 5 years) was comparable to that for DBD liver transplantation irrespective of recipient condition. Increasing donor age was more highly predictive of poor outcomes in DCD compared to DBD, especially in recipients in poor preoperative condition. Conclusions. DCD livers from young donors with short DWIT and CIT should be given greater consideration in order to expand the number of available donor organs.
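The favorable-donor criteria reported above (donor age ≤45 years, DWIT ≤15 min, CIT ≤10 hr) amount to a simple screen; this sketch just encodes those three thresholds, without the full risk-index weighting from the Cox model.

```python
def favorable_dcd(donor_age_years, dwit_min, cit_hr):
    """Favorable DCD donor screen per the study's reported criteria:
    donor age <= 45 years, donor warm ischemia time <= 15 min,
    cold ischemia time <= 10 hr."""
    return donor_age_years <= 45 and dwit_min <= 15 and cit_hr <= 10

print(favorable_dcd(38, 12, 8.5))  # → True
print(favorable_dcd(52, 12, 8.5))  # → False
```

Grafts passing this screen showed survival comparable to DBD transplantation in the study; the full four-group risk index additionally weights retransplantation history and life-support status.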
Transplantation | 2008
Jayme E. Locke; Daniel S. Warren; Andrew L. Singer; Dorry L. Segev; Christopher E. Simpkins; Warren R. Maley; Robert A. Montgomery; Gabriel M. Danovitch; Andrew M. Cameron
Background. When the United Network for Organ Sharing changed its algorithm for liver allocation to the model for end-stage liver disease (MELD) system in 2002, highest priority shifted to patients with renal insufficiency as a major component of their end-stage liver disease. An unintended consequence of the new system was a rapid increase in the number of simultaneous liver–kidney transplants (SLK) being performed yearly. Methods. Adult recipients of deceased donor liver transplants (LT, n=19,137), kidney transplants (n=33,712), and SLK transplants (n=1,032) between 1987 and 2006 were evaluated based on United Network for Organ Sharing data. Recipients were stratified by donor subgroup, MELD score, pre- versus post-MELD era, and length of time on dialysis. Matched-control analyses were performed, and graft and patient survival were analyzed by Kaplan–Meier and Cox proportional hazards analyses. Results. MELD era outcomes demonstrate a decline in patient survival after SLK. Using matched-control analysis, we are unable to demonstrate a benefit in the SLK cohort compared with LT, despite the fact that higher quality allografts are being used for SLK. Subgroup analysis of the SLK cohort did demonstrate an increase in overall 1-year patient and liver graft survival only in those patients on long-term dialysis (≥3 months) compared with LT (84.5% vs. 70.8%, P=0.008; hazards ratio 0.57 [95% CI 0.34, 0.95], P=0.03). Conclusion. These findings suggest that SLK may be overused in the MELD era and that the current prioritization of kidney grafts to liver failure patients results in the waste of limited resources.
The American Journal of Gastroenterology | 2001
Satheesh Nair; D B Cohen; C Cohen; H Tan; Warren R. Maley; Paul J. Thuluvath
OBJECTIVE: Severely obese patients who undergo orthotopic liver transplantation are likely to have higher morbidity, mortality, and costs, and a lower long-term survival. METHODS: This case-control study was done at a university hospital. One hundred twenty-one consecutive patients who underwent liver transplantation between 1994 and 1996 were studied. Severe obesity was defined as a body mass index (BMI) above the 95th percentile (>32.3 for women and >31.1 for men), and moderate obesity was defined as a BMI between 27.3 and 32.3 for women and between 27.8 and 31.1 for men. The outcome measures were intraoperative complications, postoperative complications (wound infections, bile leak, vascular complications), length of hospital stay, costs of transplantation, and long-term survival. RESULTS: The baseline characteristics, UNOS status, and cause of liver disease at the time of transplantation were similar in severely obese (n = 21, BMI = 37.4 ± 4.8 kg/m2), obese (n = 36, BMI 28.7 ± 0.9 kg/m2), and nonobese patients (n = 64, BMI 23.8 ± 2.5 kg/m2). The intraoperative complications and transfusion requirements were similar in all three groups. Postoperative complications such as respiratory failure (p = 0.009) and systemic vascular complications (p = 0.04) were significantly higher in severely obese patients. The overall perioperative complication rate was 0.61 (39 complications in 64 patients) in nonobese patients, 0.77 (28 complications in 36 patients) in obese patients, and 1.43 (30 complications in 21 patients) in severely obese patients (p = 0.01). Infections were the leading cause of death in all groups, accounting for 57–66% of deaths. The length of hospital stay was significantly higher in obese patients. The hospital costs of transplantation were also higher.
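The BMI cutoffs defined in the Methods can be encoded directly. The boundary handling below (inclusive lower bound for moderate obesity) is an assumption, since the abstract does not state how edge values were classified.

```python
def obesity_class(bmi, female):
    """Study definitions: severe obesity above the 95th percentile
    (>32.3 for women, >31.1 for men); moderate obesity 27.3-32.3
    (women) or 27.8-31.1 (men). Lower bounds assumed inclusive."""
    severe_cutoff = 32.3 if female else 31.1
    moderate_cutoff = 27.3 if female else 27.8
    if bmi > severe_cutoff:
        return "severe"
    if bmi >= moderate_cutoff:
        return "moderate"
    return "nonobese"

print(obesity_class(37.4, female=True))   # → severe
print(obesity_class(23.8, female=False))  # → nonobese
```

The example inputs echo the mean BMIs of the severely obese and nonobese groups reported above.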