Abhinav Humar
University of Pittsburgh
Publications
Featured research published by Abhinav Humar.
Annals of Surgery | 2001
David E. R. Sutherland; Rainer W. G. Gruessner; David L. Dunn; Arthur J. Matas; Abhinav Humar; Raja Kandaswamy; S. M. Mauer; William R. Kennedy; Frederick C. Goetz; R. P. Robertson; Angelika C. Gruessner; John S. Najarian
Objective: To determine outcome in diabetic pancreas transplant recipients according to risk factors and the surgical techniques and immunosuppressive protocols that evolved during a 33-year period at a single institution. Summary Background Data: Insulin-dependent diabetes mellitus is associated with a high incidence of management problems and secondary complications. Clinical pancreas transplantation began at the University of Minnesota in 1966, initially with a high failure rate, but outcome improved in parallel with other organ transplants. The authors retrospectively analyzed the factors associated with the increased success rate of pancreas transplants. Methods: From December 16, 1966, to March 31, 2000, the authors performed 1,194 pancreas transplants (111 from living donors; 191 retransplants): 498 simultaneous pancreas–kidney (SPK) and 1 simultaneous pancreas–liver transplant; 404 pancreas after kidney (PAK) transplants; and 291 pancreas transplants alone (PTA). The analyses were divided into five eras: era 0, 1966 to 1973 (n = 14), historical; era 1, 1978 to 1986 (n = 148), transition to cyclosporine for immunosuppression, multiple duct management techniques, and only solitary (PAK and PTA) transplants; era 2, 1986 to 1994 (n = 461), all categories (SPK, PAK, and PTA), predominately bladder drainage for graft duct management, and primarily triple therapy (cyclosporine, azathioprine, and prednisone) for maintenance immunosuppression; era 3, 1994 to 1998 (n = 286), tacrolimus and mycophenolate mofetil used; and era 4, 1998 to 2000 (n = 275), use of daclizumab for induction immunosuppression, primarily enteric drainage for SPK transplants, and pretransplant immunosuppression in candidates awaiting PTA.
Results: Patient and primary cadaver pancreas graft functional (insulin-independence) survival rates at 1 year by category and era were as follows: SPK, era 2 (n = 214) versus eras 3 and 4 combined (n = 212), 85% and 64% versus 92% and 79%, respectively; PAK, era 1 (n = 36) versus 2 (n = 61) versus 3 (n = 84) versus 4 (n = 92), 86% and 17%, 98% and 59%, 98% and 76%, and 98% and 81%, respectively; PTA, era 1 (n = 36) versus 2 (n = 72) versus 3 (n = 30) versus 4 (n = 40), 77% and 31%, 99% and 50%, 90% and 67%, and 100% and 88%, respectively. In eras 3 and 4 combined for primary cadaver SPK transplants, pancreas graft survival rates were significantly higher with bladder drainage (n = 136) than enteric drainage (n = 70), 82% versus 74% at 1 year (P = .03). Increasing recipient age had an adverse effect on outcome only in SPK recipients. Vascular disease was common (in eras 3 and 4, 27% of SPK recipients had a pretransplant myocardial infarction and 40% had a coronary artery bypass); those with no vascular disease had significantly higher patient and graft survival rates in the SPK and PAK categories. Living donor segmental pancreas transplants were associated with higher technically successful graft survival rates in each era, predominately solitary (PAK and PTA) in eras 1 and 2 and SPK in eras 3 and 4. Diabetic secondary complications were ameliorated in some recipients, and quality of life studies showed significant gains after the transplant in all recipient categories. Conclusions: Patient and graft survival rates have significantly improved over time as surgical techniques and immunosuppressive protocols have evolved. Eventually, islet transplants will replace pancreas transplants for suitable candidates, but currently pancreas transplants can be applied and should be an option at all stages of diabetes. Early transplants are preferable for labile diabetes, but even patients with advanced complications can benefit.
Transplantation | 1999
Eric Johnson; J. Kyle Anderson; Cheryl L. Jacobs; Gina Suh; Abhinav Humar; Benjamin D. Suhr; Stephen R. Kerr; Arthur J. Matas
The University of Minnesota has been a strong advocate of living donor kidney transplants. The benefits for living donor recipients have been well documented. The relatively low risk of physical complications during donation has also been well documented. Less well understood is the psychosocial risk to donors. Most published reports have indicated an improved sense of well-being and a boost in self-esteem for living kidney donors. However, there have been some reports of depression and disrupted family relationships after donation, even suicide after a recipient's death. To determine the quality of life of our donors, we sent a questionnaire to 979 individuals who had donated a kidney between August 1, 1984, and December 31, 1996. Of the 60% who responded, the vast majority had an excellent quality of life. As a group, they scored higher than the national norm on the SF-36, a standardized quality of life health questionnaire. However, 4% were dissatisfied and regretted the decision to donate. Further, 4% found the experience extremely stressful and 8% very stressful. We used multivariate analysis to identify risk factors for this poor psychosocial outcome and found that donors who were relatives other than first degree (odds ratio=3.5, P=0.06) and donors whose recipient died within 1 year of transplant (odds ratio=3.3, P=0.014) were more likely to say they would not donate again if it were possible. Further, donors who had perioperative complications (odds ratio=3.5, P=0.007) and female donors (odds ratio=1.8, P=0.1) were more likely to find the overall experience more stressful. Overall, the results of this study are overwhelmingly positive and have encouraged us to continue living donor kidney transplants.
Annals of Surgery | 2009
Kareem Abu-Elmagd; Guilherme Costa; Geoffrey Bond; Kyle Soltys; Rakesh Sindhi; Tong Wu; Darlene Koritsky; Bonita Schuster; L Martin; Ruy J. Cruz; Noriko Murase; Adriana Zeevi; William Irish; Maher O. Ayyash; Laura E. Matarese; Abhinav Humar; George V. Mazariegos
Objective: To assess the evolution of visceral transplantation in the milieu of surgical technical modifications, new immunosuppressive protocols, and other management strategies. Summary Background Data: With the clinical feasibility of intestinal and multivisceral transplantation in 1990, multifaceted innovative tactics were required to improve outcome and increase procedural practicality. Methods: Divided into 3 eras, 453 patients received 500 visceral transplants. The primary immunosuppression used was tacrolimus-steroid-only during Era I (5/90–5/94), adjunct induction with multiple drug therapy during Era II (1/95–6/01), and recipient pretreatment with tacrolimus monotherapy during Era III (7/01–11/08). During Eras II/III, donor bone marrow was given (n = 79), intestine was ex vivo irradiated (n = 44), and Epstein-Barr virus (EBV) and cytomegalovirus (CMV) loads were monitored. Results: Actuarial patient survival was 85% at 1 year, 61% at 5 years, 42% at 10 years, and 35% at 15 years, with respective graft survival of 80%, 50%, 33%, and 29%. With a 10% retransplantation rate, second/third graft survival was 69% at 1 year and 47% at 5 years. The best outcome was with intestine-liver allografts. The Era III rabbit antithymocyte globulin or alemtuzumab pretreatment-based strategy was associated with significant (P < 0.0001) improvement in outcome, with 1- and 5-year patient survival of 92% and 70%. Conclusion: Survival has greatly improved over time as management strategies evolved. The current results clearly justify elevating the procedure level to that of other abdominal organs with the privilege to permanently reside in a respected place in the surgical armamentarium. Meanwhile, innovative tactics are still required to conquer the long-term hazards of chronic rejection of liver-free allografts and infection of multivisceral recipients.
Annals of Surgery | 2000
Abhinav Humar; Raja Kandaswamy; Darla K. Granger; Rainer W. G. Gruessner; Angelika C. Gruessner; David E. R. Sutherland
OBJECTIVE To document the decreased incidence of surgical complications after pancreas transplantation in recent times. SUMMARY BACKGROUND DATA Compared with other abdominal transplants, pancreas transplants have historically had the highest incidence of surgical complications. However, over the past few years, the authors have noted a significant decrease in the incidence of surgical complications. METHODS The authors studied the incidence of early (<3 months after transplant) surgical complications (e.g., relaparotomy, thrombosis, infections, leaks) after 580 pancreas transplants performed during a 12-year period. Patients were analyzed and compared in two time groups: era 1 (June 1, 1985, to April 30, 1994, n = 367) and era 2 (May 1, 1994, to June 30, 1997, n = 213). RESULTS Overall, surgical complications were significantly reduced in era 2 compared with era 1. The relaparotomy rate decreased from 32.4% in era 1 to 18.8% in era 2. Significant risk factors for early relaparotomy were donor age older than 40 years and recipient obesity. Recipients with relaparotomy had significantly lower graft survival rates than those without relaparotomy, but patient survival rates were not significantly different. A major factor contributing to the lower relaparotomy rate in era 2 was a significant decrease in the incidence of graft thrombosis; the authors believe this lower incidence is due to the routine use of postoperative low-dose intravenous heparin and acetylsalicylic acid. The incidence of bleeding requiring relaparotomy did not differ between the two eras. Older donor age was the most significant risk factor for graft thrombosis. The incidence of intraabdominal infections significantly decreased between the two eras; this decrease may be due to improved prophylaxis regimens in the first postoperative week. 
CONCLUSIONS Although a retrospective study has its limits, the results of this study, the largest single-center experience to date, show a significant decrease in the surgical risk associated with pancreas transplants. Reasons for this decrease are identification of donor and recipient risk factors, better prophylaxis regimens, refinements in surgical technique, and improved immunosuppressive regimens. These improved results suggest that more widespread application of pancreas transplantation is warranted.
Transplantation | 2004
Abhinav Humar; Thiagarajan Ramcharan; Raja Kandaswamy; Rainer W. G. Gruessner; Angelika C. Gruessner; David E. R. Sutherland
Background. Technical failure (TF) rates remain high after pancreas transplants; while rates have decreased over the last decade, more than 10% of all pancreas grafts continue to be lost due to technical reasons. We performed a multivariate analysis to determine causes and risk factors for TF of pancreas grafts. Results. Between 1994 and 2003, 937 pancreas transplants were performed at our center in the following transplant categories: simultaneous pancreas-kidney (SPK) (n=327), pancreas after kidney (PAK) (n=399), and pancreas transplant alone (PTA) (n=211). Of these, 123 (13.1%) grafts were lost due to technical reasons (thrombosis, leaks, infections). TF rates were higher for SPK (15.3%) versus PAK (12.2%) or PTA (11.4%), though this difference was not statistically significant. Thrombosis accounted for 52.0% of all TFs. Other causes were pancreatitis (20.3%), infections (18.7%), leaks (6.5%), and bleeding (2.4%). Thrombosis was the most common cause of TF in all three transplant categories. By multivariate analysis, the following were significant risk factors for TF of the graft: recipient body mass index (BMI) >30 kg/m2 (relative risk [RR]=2.42, P=0.0003), preservation time >24 hr (RR=1.87, P=0.04), cause of donor death other than trauma (RR=1.58, P=0.04), enteric versus bladder drainage (RR=1.68, P=0.06), and donor BMI >30 kg/m2 (RR=1.66, P=0.06). Not significant were donor or recipient age, retransplantation, and the category of transplant. Conclusions. TFs remain significant after pancreas transplants. In SPK recipients, TF represents the most common cause of pancreas graft loss. For isolated pancreas transplants, TF is second only to rejection as a cause of graft loss. Increased preservation times and donor or recipient obesity appear to be risk factors. Minimizing these risk factors would be important for decreasing TF.
Transplantation | 2001
Abhinav Humar; Thiagarajan Ramcharan; Roger Denny; Kristen J. Gillingham; William D. Payne; Arthur J. Matas
Background. The most common surgical complication after a kidney transplant is likely related to the wound. The purpose of this analysis was to determine the incidence of, and risk factors for, wound complications (e.g., infections, hernias) in kidney recipients and to assess whether newer immunosuppressive drugs increase the risk for such complications. Methods. Between January 1, 1984 and September 30, 1998, we performed 2013 adult kidney transplants. Of these 2013 recipients, 97 (4.8%) developed either a superficial or a deep wound infection. Additionally, 73 (3.6%) recipients developed either a fascial dehiscence or a hernia of the wound. We used univariate and multivariate techniques to determine significant risk factors and outcomes. Results. Mean time to development of a superficial infection (defined as located above the fascia) was 11.9 days posttransplant; to development of a deep infection (defined as located below the fascia), 39.2 days; and to development of a hernia or fascial dehiscence, 12.8 months. By multivariate analysis, the most significant risk factor for a superficial or deep wound infection was obesity (defined as body mass index >30 kg/m2) (RR=4.4, P=0.0001). Other significant risk factors were a urine leak posttransplant, any reoperation through the transplant incision, diabetes, and the use of mycophenolate mofetil (MMF) (vs. azathioprine) for maintenance immunosuppression (RR=2.43, P=0.0001). Significant risk factors for a hernia or fascial dehiscence were any reoperation through the transplant incision, increased recipient age, obesity, and the use of MMF (vs. azathioprine) for maintenance immunosuppression (RR=3.54, P=0.0004). Use of antibody induction and treatment for acute rejection were not significant risk factors for either infections or hernias. Death-censored graft survival was lower in recipients who developed a wound infection (vs. those who did not); it was not lower in recipients who developed an incisional hernia or fascial dehiscence (vs. those who did not). Conclusions. Despite immunosuppression including chronic steroids, the incidence of wound infections, incisional hernias, and fascial dehiscence is low in kidney recipients. As with other types of surgery, the main risk factors for postoperative complications are obesity, reoperation, and increased age. However, in kidney recipients, use of MMF (vs. azathioprine) is an additional risk factor, one that could potentially be altered, especially in high-risk recipients.
American Journal of Transplantation | 2005
Arthur J. Matas; Raja Kandaswamy; Kristen J. Gillingham; Lois McHugh; Hassan N. Ibrahim; Bertram L. Kasiske; Abhinav Humar
Concern persists that prednisone-free maintenance immunosuppression in kidney transplant recipients will be associated with an increase in late allograft dysfunction and graft loss. We herein report 5-year follow-up of a trial of prednisone-free maintenance immunosuppression. From October 1, 1999, through January 31, 2005, at our center, 589 kidney transplant recipients were treated with a protocol incorporating discontinuation of their prednisone on postoperative day 6. At 5 years, actuarial patient survival was 91%; graft survival, 84%; death-censored graft survival, 92%; acute rejection-free graft survival, 84%; and chronic rejection-free graft survival, 87%. The mean serum creatinine level (±SD) at 1 year was 1.6 ± 0.6 mg/dl; at 5 years, 1.7 ± 0.8 mg/dl. In all, 86% of kidney recipients with functioning grafts remained prednisone-free as of April 30, 2005.
Transplantation | 1999
Abhinav Humar; Kristen J. Gillingham; William D. Payne; David L. Dunn; David E. R. Sutherland; Arthur J. Matas
BACKGROUND It has long been suggested that cytomegalovirus (CMV) disease plays a role in the pathogenesis of chronic rejection (CR). However, its role has been difficult to prove, given the strong association between acute rejection and CMV, and the even stronger association between acute rejection and CR. To try to isolate the relative contribution of CMV infection to the pathogenesis of CR, we used multivariate techniques to examine risk factors for CR, including CMV disease. METHODS Our study population consisted of adult recipients of a first kidney graft who underwent transplantation at a single center between 1/1/85 and 6/30/97 (n = 1339). RESULTS Multivariate analysis using time to CR as the dependent variable demonstrated acute rejection to be the strongest risk factor (relative risk [RR] = 17.8, P = 0.0001), followed by older donor age (RR = 1.46, P = 0.01). The presence of CMV disease showed a trend toward increased risk for CR (RR = 1.30, P = 0.10), although the association was not as strong as with the other two variables. Comparing only those recipients with acute rejection and CMV disease versus those with acute rejection but no CMV disease, the relative risk of developing CR was 1.37 times higher in the former group. Recipients with acute rejection and CMV developed CR sooner and with a higher incidence versus those with acute rejection but no CMV (P = 0.002). It is interesting, however, that CMV disease was only a risk factor for CR in the presence of acute rejection. Recipients with no acute rejection and CMV disease did not have a higher incidence of CR versus those with no acute rejection and no CMV (P = NS). CONCLUSION CMV disease seems to play some role in the pathogenesis of CR, but only in the presence of acute rejection. Reasons may include (i) the inability to adequately treat acute rejection due to the presence of CMV disease or (ii) the increased virulence of latent CMV in recipients being treated for acute rejection.
Our data may suggest a role for more aggressive prophylaxis against CMV disease, especially at the time of treatment for acute rejection.
Clinical Transplantation | 2002
Abhinav Humar; Thiagarajan Ramcharan; Raja Kandaswamy; K. J. Gillingham; William D. Payne; Arthur J. Matas
Humar A, Ramcharan T, Kandaswamy R, Gillingham K, Payne WD, Matas AJ. Risk factors for slow graft function after kidney transplants: a multivariate analysis. Clin Transplant 2002: 16: 425–429.
Seminars in Dialysis | 2005
Abhinav Humar; Arthur J. Matas
Kidney transplants have become common surgical procedures, with thousands performed yearly around the world. The surgical techniques for the transplant are well established and the procedure is associated with high success rates. The complication rate associated with the procedure is low, especially when compared to other abdominal organ transplants such as liver and pancreas transplants. Nonetheless, the detection, accurate diagnosis, and timely management of surgical complications occurring after kidney transplant are important tasks of the team managing these patients. A delay in the diagnosis or management of these complications can result in significant morbidity to the recipient, with risk of graft loss and mortality. Most surgical complications involve either the wound or one of the three anastomoses (renal artery, renal vein, or ureter). Examples include wound infection, renal artery or vein thrombosis, and urine leak. Most of these complications will require surgical or radiologic intervention for appropriate management.