Publications


Featured research published by Peter L. Abt.


Annals of Surgery | 2004

Survival following liver transplantation from non-heart-beating donors

Peter L. Abt; Niraj M. Desai; Michael D. Crawford; Lisa M. Forman; Joseph W. Markmann; Kim M. Olthoff; James F. Markmann

Objective: To determine whether patient and graft survival following transplantation with non-heart-beating donor (NHBD) hepatic allografts is equivalent to heart-beating donor (HBD) allografts. Summary Background Data: With the growing disparity between the number of patients awaiting liver transplantation and a limited supply of cadaveric organs, there is renewed interest in the use of hepatic allografts from NHBDs. Limited outcome data addressing this issue exist. Methods: Retrospective evaluation of graft and patient survival among adult recipients of NHBD hepatic allografts compared with recipients of HBD livers between 1993 and 2001 using the United Network for Organ Sharing database. Results: Graft survival was significantly shorter for NHBD grafts (N = 144) than for HBD grafts (N = 26,856). One- and 3-year graft survival was 70.2% and 63.3% for NHBD recipients versus 80.4% and 72.1% (P = 0.003 and P = 0.012) for HBD recipients. Recipients of an NHBD graft had a greater incidence of primary nonfunction (11.8% vs. 6.4%, P = 0.008) and retransplantation (13.9% vs. 8.3%, P = 0.04) compared with HBD recipients. Prolonged cold ischemic time and recipient life support were predictors of early graft failure among recipients of NHBD livers. Although differences in patient survival following NHBD versus HBD transplant did not meet statistical significance, a strong trend was evident that likely has relevant clinical implications. Conclusions: Graft and patient survival is inferior among recipients of NHBD livers. NHBD donors remain an important source of hepatic grafts; however, judicious use is warranted, including minimization of cold ischemia and use in stable recipients.
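As a rough illustration of the kind of survival comparison described above, the sketch below sets up Kaplan-Meier curves and a log-rank test for two donor groups. It assumes the Python lifelines library; the data frame, column names, and values are hypothetical placeholders, not the UNOS extract analyzed in the paper.

```python
# Sketch: comparing graft survival between two donor groups, in the spirit of the
# NHBD vs. HBD analysis above. All data and column names are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical registry extract: one row per graft, follow-up in years,
# an event flag (1 = graft failure), and the donor type.
df = pd.DataFrame({
    "years":  [0.5, 1.2, 3.4, 2.0, 4.1, 0.3, 2.8, 5.0],
    "failed": [1,   0,   1,   0,   0,   1,   0,   0],
    "donor":  ["NHBD", "NHBD", "NHBD", "NHBD", "HBD", "HBD", "HBD", "HBD"],
})

nhbd = df[df.donor == "NHBD"]
hbd = df[df.donor == "HBD"]

km = KaplanMeierFitter()
km.fit(nhbd.years, event_observed=nhbd.failed, label="NHBD")
print(km.survival_function_at_times([1, 3]))  # 1- and 3-year graft survival

# Log-rank test for a difference between the two survival curves.
result = logrank_test(nhbd.years, hbd.years,
                      event_observed_A=nhbd.failed, event_observed_B=hbd.failed)
print(f"log-rank p-value: {result.p_value:.3f}")
```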


Transplantation | 2003

Liver transplantation from controlled non-heart-beating donors: an increased incidence of biliary complications.

Peter L. Abt; Michael J. Crawford; Niraj M. Desai; James F. Markmann; Kim M. Olthoff; Abraham Shaked

Background. Hepatic allografts from non–heart-beating donors (NHBD) have been cited as a means to expand the supply of donor livers. Concern exists that donor warm ischemic time, in addition to subsequent cold ischemia-reperfusion injury, may result in damage to sensitive cell populations within the liver. Because the biliary epithelium is sensitive to ischemia-reperfusion injury, the authors surmised that an increased incidence of biliary complications might occur among recipients of an NHBD allograft. Methods. This study was a retrospective evaluation of NHBD recipients compared with a group of heart-beating donor (HBD) recipients from a single institution. Results. Fifteen patients received a hepatic allograft from a controlled NHBD. NHBD and HBD (n = 221) graft survival did not differ at 1 year (71.8% vs. 85.4%, P = 0.23) or 3 years (71.8% vs. 73.9%, P = 0.68). Patient survival at 1 year (79.0% vs. 90.9%, P = 0.16) and 3 years (79.0% vs. 77.7%, P = 0.8) was also similar. Major biliary complications occurred in five (33.3%) NHBD recipients; 66.6% of the NHBD biliary complications consisted of intrahepatic strictures versus 19.2% among HBD recipients (P < 0.01). Major biliary complications in the NHBD recipients resulted in multiple interventional procedures, retransplantation, and death. Conclusions. Donor warm ischemic time may predispose hepatic allografts to an increased incidence of ischemic-type strictures. Although graft and patient survival was similar to a cohort of HBD recipients, caution is urged with the use of these organs.
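The biliary-complication comparison above rests on contrasting proportions between a small NHBD cohort and a larger HBD cohort. The sketch below shows one way such a comparison could be run with Fisher's exact test in SciPy; the 2x2 counts are illustrative rather than the study data, and the paper does not state which test was used.

```python
# Sketch: comparing complication proportions between two small cohorts, as in the
# biliary-stricture comparison above. The counts below are illustrative only.
from scipy.stats import fisher_exact

# 2x2 table: rows = donor type (NHBD, HBD), columns = (stricture, no stricture).
table = [[4, 11],    # hypothetical NHBD recipients
         [5, 216]]   # hypothetical HBD recipients

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```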


American Journal of Transplantation | 2009

ASTS recommended practice guidelines for controlled donation after cardiac death organ procurement and transplantation

David J. Reich; David C. Mulligan; Peter L. Abt; Timothy L. Pruett; Michael Abecassis; Anthony M. D'Alessandro; Elizabeth A. Pomfret; Richard B. Freeman; James F. Markmann; Douglas W. Hanto; Arthur J. Matas; John P. Roberts; Robert M. Merion; Goran B. Klintmalm

The American Society of Transplant Surgeons (ASTS) champions efforts to increase organ donation. Controlled donation after cardiac death (DCD) offers the family and the patient with a hopeless prognosis the option to donate when brain death criteria will not be met. Although DCD is increasing, this endeavor is still in the midst of development. DCD protocols, recovery techniques and organ acceptance criteria vary among organ procurement organizations and transplant centers. Growing enthusiasm for DCD has been tempered by the decreased yield of transplantable organs and less favorable posttransplant outcomes compared with donation after brain death. Logistics and ethics relevant to DCD engender discussion and debate among lay and medical communities. Regulatory oversight of the mandate to increase DCD and a recent lawsuit involving professional behavior during an attempted DCD have fueled scrutiny of this activity. Within this setting, the ASTS Council sought best‐practice guidelines for controlled DCD organ donation and transplantation. The proposed guidelines are evidence based when possible. They cover many aspects of DCD kidney, liver and pancreas transplantation, including donor characteristics, consent, withdrawal of ventilatory support, operative technique, ischemia times, machine perfusion, recipient considerations and biliary issues. DCD organ transplantation involves unique challenges that these recommendations seek to address.


Transplantation | 2004

Predicting outcome after liver transplantation: utility of the model for end-stage liver disease and a newly derived discrimination function

Niraj M. Desai; Kevin C. Mange; Michael D. Crawford; Peter L. Abt; Adam Frank; Joseph W. Markmann; Ergun Velidedeoglu; William C. Chapman; James F. Markmann

Background. The Model for End-Stage Liver Disease (MELD) has been found to accurately predict pretransplant mortality and is a valuable system for ranking patients in greatest need of liver transplantation. It is unknown whether a higher MELD score also predicts decreased posttransplant survival. Methods. We examined a cohort of patients from the United Network for Organ Sharing (UNOS) database for whom the critical pretransplant recipient values needed to calculate the MELD score were available (international normalized ratio of prothrombin time, total bilirubin, and creatinine). In these 2,565 patients, we analyzed whether the MELD score predicted graft and patient survival and length of posttransplant hospitalization. Results. In contrast with its ability to predict survival in patients with chronic liver disease awaiting liver transplant, the MELD score was found to be poor at predicting posttransplant outcome except for patients with the highest 20% of MELD scores. We developed a model with four variables not included in MELD that had greater ability to predict 3-month posttransplant patient survival, with a c-statistic of 0.65, compared with 0.54 for the pretransplant MELD score. These pretransplant variables were recipient age, mechanical ventilation, dialysis, and retransplantation. Recipients with any two of the three latter variables showed a markedly diminished posttransplant survival rate. Conclusions. The MELD score is a relatively poor predictor of posttransplant outcome. In contrast, a model based on four pretransplant variables (recipient age, mechanical ventilation, dialysis, and retransplantation) had a better ability to predict outcome. Our results support the use of MELD for liver allocation and indicate that statistical modeling, such as reported in this article, can be used to identify futile cases in which expected outcome is too poor to justify transplantation.
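The MELD score referenced above is computed from the three laboratory values the authors list (INR, total bilirubin, creatinine). The sketch below implements the commonly cited UNOS form of the formula; the clamping and rounding rules are standard conventions assumed here, not details taken from this paper.

```python
# Sketch of the MELD calculation from INR, total bilirubin, and creatinine,
# in its commonly cited UNOS form. Laboratory values below 1.0 are floored at
# 1.0 and creatinine is capped at 4.0 mg/dL; scores are capped at 40.
import math

def meld_score(bilirubin_mg_dl: float, inr: float, creatinine_mg_dl: float) -> int:
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    creat = min(max(creatinine_mg_dl, 1.0), 4.0)
    raw = (3.78 * math.log(bili)
           + 11.2 * math.log(inr)
           + 9.57 * math.log(creat)
           + 6.43)
    return min(round(raw), 40)

print(meld_score(bilirubin_mg_dl=2.5, inr=1.8, creatinine_mg_dl=1.2))
```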


American Journal of Transplantation | 2006

Nephrogenic Systemic Fibrosis Among Liver Transplant Recipients: A Single Institution Experience and Topic Update

Manoj Maloo; Peter L. Abt; Randeep Kashyap; D. Younan; Martin S. Zand; Mark S. Orloff; A. Jain; A. Pentland; G. Scott; Adel Bozorgzadeh

Nephrogenic systemic fibrosis (NSF) is a recently characterized systemic fibrosing disorder that develops in the setting of renal insufficiency. NSF's rapidly progressive nature, resulting in disability within weeks of onset, makes early diagnosis important. Two prior reports of NSF after liver transplantation are known. We present three cases of NSF developing within a few months after liver transplantation and review the current literature. Loss of regulatory control of the circulating fibrocyte and its aberrant recruitment, in a milieu of renal failure and a recent vascular procedure, appear important in its development. Current therapies lack consistent efficacy; an improvement in renal function offers the greatest likelihood of NSF resolution. Delayed recognition may pose a significant barrier to functional recovery in the ubiquitously deconditioned liver transplant patient. Early recognition and implementation of aggressive physical therapy appear to have the greatest impact on halting its progression.


American Journal of Transplantation | 2004

Allograft Survival Following Adult‐to‐Adult Living Donor Liver Transplantation

Peter L. Abt; Kevin C. Mange; Kim M. Olthoff; James F. Markmann; K. Rajender Reddy; Abraham Shaked

Adult-to-adult living donor liver transplantation (AALDLT) is emerging as a method to treat patients with end-stage liver disease. The aims of this study were to identify donor and recipient characteristics of AALDLT, to determine variables that affect allograft survival, and to examine outcomes compared with those achieved following cadaveric transplantation. Cox proportional hazards models were fit to examine characteristics associated with the survival of AALDLT grafts. Survival of AALDLT grafts was then compared with cadaveric allografts in multivariable Cox models. Older donor age (>44 years), female-to-male donor-to-recipient relationship, recipient race, and the recipient's medical condition before transplant were factors related to allograft failure among 731 AALDLT. Despite favorable donor and recipient characteristics, the rate of allograft failure, specifically the need for retransplantation, was increased among AALDLT (hazard ratio 1.66, 95% CI 1.30-2.11) compared with cadaveric recipients. In conclusion, among AALDLT recipients, selecting younger donors and placing the allografts in recipients who have not had a prior transplant and are not in the ICU may enhance allograft survival. Analysis of this early experience with AALDLT suggests that allograft failure may be higher than among recipients of a cadaveric liver.
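For readers unfamiliar with the modeling approach named above, the sketch below shows a minimal Cox proportional hazards fit for allograft survival with a few donor and recipient covariates. It assumes the Python lifelines library; the data frame and covariates are hypothetical stand-ins for the registry variables used in the study.

```python
# Sketch: Cox proportional hazards model for allograft survival, in the spirit of
# the AALDLT analysis above. Toy data; column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "years_to_failure": [1.0, 2.5, 0.4, 3.2, 4.8, 0.9, 2.2, 5.1, 1.7, 3.9],
    "graft_failed":     [1,   0,   1,   0,   0,   1,   1,   0,   0,   1],
    "donor_age":        [52,  31,  60,  28,  45,  55,  38,  33,  47,  58],
    "living_donor":     [1,   1,   1,   0,   0,   0,   1,   0,   1,   0],  # 1 = AALDLT
    "recipient_in_icu": [0,   0,   1,   0,   0,   1,   0,   0,   1,   0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_failure", event_col="graft_failed")
cph.print_summary()  # hazard ratios (exp(coef)) with 95% confidence intervals
```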


Liver Transplantation | 2012

Increasing disparity in waitlist mortality rates with increased model for end‐stage liver disease scores for candidates with hepatocellular carcinoma versus candidates without hepatocellular carcinoma

David J. Goldberg; Benjamin French; Peter L. Abt; Sandy Feng; Andrew M. Cameron

Candidates with hepatocellular carcinoma (HCC) within the Milan criteria (MC) receive standardized Model for End-Stage Liver Disease (MELD) exception points because of the projected risk of tumor expansion beyond the MC. Exception points at listing are meant to be equivalent to a 15% risk of 90-day mortality, with additional points granted every 3 months, equivalent to a 10% increased mortality risk. We analyzed the United Network for Organ Sharing database (January 1, 2005 to May 31, 2009) to compare the 90-day waitlist outcomes of HCC candidates and non-HCC candidates with similar MELD scores. Two hundred fifty-nine HCC candidates (4.1%) who were initially listed with 22 MELD exception points were removed because of death or clinical deterioration within 90 days of listing, whereas 283 non-HCC candidates (11.0%) with initial laboratory MELD scores of 21 to 23 were removed. Ninety-three HCC candidates (4.6%) with 25 exception points (after 3-6 months of waiting) were removed because of death or clinical deterioration within 90 days, whereas 805 non-HCC candidates (17.3%) with laboratory MELD scores of 24 to 26 were removed. Twenty HCC candidates (3.0%) with 28 exception points (after 6-9 months of waiting) were removed for death or clinical deterioration within 90 days, whereas 646 non-HCC candidates (23.6%) with laboratory MELD scores of 27 to 29 were removed. In multivariate logistic regression models, HCC candidates had significantly lower 90-day odds of waitlist removal for death or clinical deterioration (P < 0.001). Over time, the risk of waitlist removal for death or clinical deterioration was unchanged for HCC candidates (P = 0.17), whereas it increased significantly for non-HCC candidates. The current allotment of HCC exception points should be re-evaluated because of the stable risk of waitlist dropout for these candidates. Liver Transpl 18:434-443, 2012.
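The stratified comparisons above contrast 90-day removal proportions between HCC-exception and non-HCC candidates at comparable MELD levels. The sketch below shows one way a single such contrast could be tested with a two-sample proportions test; the removal counts come from the abstract's first stratum, the denominators are back-calculated from the reported percentages and therefore approximate, and the choice of test is illustrative rather than the authors' stated method.

```python
# Sketch: two-sample proportions test for 90-day waitlist removal, HCC-exception
# candidates (22 points) vs. non-HCC candidates (laboratory MELD 21-23).
from statsmodels.stats.proportion import proportions_ztest

removed = [259, 283]                                # removals for death/deterioration
listed = [round(259 / 0.041), round(283 / 0.110)]   # approximate denominators from 4.1% and 11.0%

stat, p_value = proportions_ztest(removed, listed)
print(f"z = {stat:.2f}, p = {p_value:.4g}")
```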


Transplantation | 2004

Factors differentially correlated with the outcome of liver transplantation in HCV+ and HCV- recipients.

Ergun Velidedeoglu; Kevin C. Mange; Adam Frank; Peter L. Abt; Niraj M. Desai; Joseph W. Markmann; Rajender Reddy; James F. Markmann

Background. Survival following liver transplantation for hepatitis C virus (HCV) is significantly poorer than for liver transplants performed for other causes of chronic liver disease. The factors responsible for the inferior outcome in HCV+ recipients, and whether they differ from factors associated with survival in HCV- recipients, are unknown. Methods. The UNOS database was analyzed to identify factors associated with outcome in HCV+ and HCV- recipients. Kaplan-Meier graft and patient survival and Cox proportional hazards analyses were conducted on 13,026 liver transplants to identify the variables that were differentially associated with survival in HCV- and HCV+ recipients. Results. Of the 13,026 recipients, 7386 (56.7%) were HCV- and 5640 were HCV+. In the HCV- and HCV+ recipient populations, five-year patient survival rates were 83.5% vs. 74.6% (P<0.00001) and five-year graft survival rates were 80.6% vs. 69.9% (P<0.00001), respectively. In a multivariate regression model, donor age and recipient creatinine were significant covariates in both groups, while donor race, cold ischemia time (CIT), female-to-male transplants, and recipient albumin were independent predictors of survival in HCV- recipients. In the HCV+ cohort, recipient race, warm ischemia time (WIT), and diabetes also independently predicted graft survival. Conclusions. A number of parameters are differentially correlated with outcome in HCV- and HCV+ recipients of orthotopic liver transplantation. These findings may not only have practical implications in the selection and management of liver transplant patients, but may also shed new insight into the biology of HCV infection posttransplant.


Annals of Surgery | 2004

Transplantation for Type I Diabetes: Comparison of Vascularized Whole-Organ Pancreas With Isolated Pancreatic Islets

Adam M. Frank; Shaoping Deng; Xiaolun Huang; Ergun Velidedeoglu; Yong-Suk Bae; Chengyang Liu; Peter L. Abt; Robert Stephenson; Muhammad Mohiuddin; Thav Thambipillai; Eileen Markmann; Maral Palanjian; Marty T. Sellers; Ali Naji; Clyde F. Barker; James F. Markmann

Objective: We sought to compare the efficacy, risks, and costs of whole-organ pancreas transplantation (WOP) with those of isolated islet transplantation (IIT) in the treatment of patients with type I diabetes mellitus. Summary Background Data: A striking improvement has taken place in the results of IIT with regard to attaining normoglycemia and insulin independence in type I diabetic recipients. Theoretically, this minimally invasive therapy should replace WOP because its risks and expense should be less. To date, however, no systematic comparisons of these 2 options have been reported. Methods: We conducted a retrospective analysis of a consecutive series of WOP and IIT performed at the University of Pennsylvania between September 2001 and February 2004. We compared a variety of parameters, including patient and graft survival, degree and duration of glucose homeostasis, procedural and immunosuppressive complications, and resource utilization. Results: Both WOP and IIT proved highly successful at establishing insulin independence in type I diabetic patients. Whole-organ pancreas recipients experienced longer lengths of stay, more readmissions, and more complications, but they exhibited a more durable state of normoglycemia with greater insulin reserves. Achieving insulin independence by IIT proved surprisingly more expensive, despite shorter initial hospital and readmission stays. Conclusion: Despite recent improvement in the success of IIT, WOP provides a more reliable and durable restoration of normoglycemia. Although IIT was associated with less procedure-related morbidity and shorter hospital stays, we unexpectedly found IIT to be more costly than WOP. This was largely due to IIT requiring islets from multiple donors to gain insulin independence. Because donor pancreata that are unsuitable for WOP can often be used successfully for IIT, we suggest that as IIT evolves, it should continue to be evaluated as a complementary alternative to, rather than as a replacement for, the better-established method of WOP.


American Journal of Transplantation | 2007

Living-Donor Liver Transplantation in the United States: Identifying Donors at Risk for Perioperative Complications

Siddharth A. Patel; Mark S. Orloff; Georgious Tsoulfas; Randeep Kashyap; Ashokkumar Jain; Adel Bozorgzadeh; Peter L. Abt

Donor safety has been scrutinized by both the medical community and the media. Variability exists in reported donor complications, and associated risk factors are ill defined. Use of administrative data can overcome the bias of single-center studies and explore variables associated with untoward events. A retrospective cohort study identifying living liver donors in two large healthcare registries yielded 433 right- and left-lobe donors from 13 centers between 2001 and 2005. Perioperative complications were identified using International Classification of Diseases, 9th Revision (ICD-9) coding data and classified according to the Clavien system. Logistic regression models identified factors associated with complications. There was one perioperative death (0.23%). The overall complication rate was 29.1%, and the major complication rate, defined by a Clavien grade ≥3, was 3.5%. Center living-donor volume (OR = 0.97, 95% CI = 0.95-0.99) and the ratio of living donors to all donors (living and deceased) (OR = 0.94, 95% CI = 0.92-0.96) were associated with a lower risk of all complications. Donor age >50 years (OR = 4.25, 95% CI = 1.22-14.87) was associated with a higher risk of major complications. Living liver donation is currently performed with a low risk of major morbidity. Use of administrative data represents an important tool to facilitate a better understanding of donor risk factors.
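The odds ratios with 95% confidence intervals reported above are the standard output of a logistic regression. The sketch below shows how such estimates might be derived with statsmodels; the data frame, variable names, and simulated values are hypothetical stand-ins for the administrative registry variables.

```python
# Sketch: odds ratios with 95% confidence intervals for donor complications,
# in the spirit of the administrative-data analysis above. Simulated toy data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400
donors = pd.DataFrame({
    "center_ldlt_volume": rng.integers(5, 60, n),   # hypothetical living-donor volume
    "donor_age_over_50":  rng.integers(0, 2, n),
    "any_complication":   rng.integers(0, 2, n),    # placeholder outcome
})

fit = smf.logit("any_complication ~ center_ldlt_volume + donor_age_over_50",
                data=donors).fit(disp=False)

# Exponentiate coefficients and confidence limits to report ORs with 95% CIs.
or_ci = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
or_ci.columns = ["odds_ratio", "ci_2.5%", "ci_97.5%"]
print(or_ci)
```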

Collaboration


Dive into Peter L. Abt's collaborations.

Top Co-Authors

Matthew H. Levine (University of Pennsylvania)

Peter P. Reese (University of Pennsylvania)

David S. Goldberg (University of Pennsylvania)

Kim M. Olthoff (University of California)

Roy D. Bloom (University of Pennsylvania)

Mark S. Orloff (University of Rochester Medical Center)

Abraham Shaked (University of California)

Adel Bozorgzadeh (University of Massachusetts Medical School)

Ali Naji (University of Pennsylvania)

Randeep Kashyap (University of Rochester Medical Center)