
Publication


Featured research published by John G. Lunz.


Journal of Heart and Lung Transplantation | 2013

Persistent strong anti-HLA antibody at high titer is complement binding and associated with increased risk of antibody-mediated rejection in heart transplant recipients

Adriana Zeevi; John G. Lunz; Brian Feingold; M.A. Shullo; C. Bermudez; J.J. Teuteberg; Steven A. Webber

BACKGROUND Sensitized heart transplant candidates are evaluated for donor-specific anti-HLA IgG antibody (DSA) by Luminex single-antigen bead (SAB) testing (SAB-IgG) to determine donor suitability and help predict a positive complement-dependent cytotoxicity crossmatch (CDC-XM) by virtual crossmatching (VXM). However, SAB testing used for VXM does not correlate perfectly with CDC-XM results, and individual transplant programs have center-specific permissible thresholds to predict crossmatch positivity. A novel Luminex SAB-based assay detecting C1q-binding HLA antibodies (SAB-C1q) contributes functional information to SAB testing, but the relationship between SAB strength and complement-binding ability is unclear. METHODS In this retrospective study, we identified 15 pediatric and adult heart allograft candidates with calculated panel-reactive antibody (cPRA) >50% by SAB-IgG and compared conventional SAB-IgG results with SAB-C1q testing. RESULTS Pre- and post-transplant DSA by SAB-C1q correlated with DSA by SAB-IgG and also with CDC-XM results and early post-transplant endomyocardial biopsy findings. Individual HLA antibodies by SAB-IgG in undiluted sera correlated poorly with SAB-C1q; however, when sera were diluted 1:16, SAB-IgG results were well correlated with SAB-C1q. In some sera, HLA antibodies with low mean fluorescence intensity (MFI) by SAB-IgG exhibited high SAB-C1q MFIs for the same HLA antigens. Diluting or heat-treating these sera increased SAB-IgG MFI, consistent with SAB-C1q results. In 13 recipients, SAB-C1q-positive DSA was associated with positive CDC-XM and with early clinical post-transplant antibody-mediated rejection (cAMR). CONCLUSIONS Risk assessment for positive CDC-XM and early cAMR in sensitized heart allograft recipients is correlated with SAB-C1q reactivity.


Annals of Surgery | 2013

Upper-extremity transplantation using a cell-based protocol to minimize immunosuppression.

Stefan Schneeberger; Vijay S. Gorantla; Gerald Brandacher; Adriana Zeevi; Anthony J. Demetris; John G. Lunz; Albert D. Donnenberg; Jaimie T. Shores; Andrea F. DiMartini; Joseph E. Kiss; Joseph E. Imbriglia; Kodi Azari; Robert J. Goitz; Ernest K. Manders; Vu T. Nguyen; Damon S. Cooney; Galen S. Wachtman; Jonathan D. Keith; Derek R. Fletcher; Camila Macedo; Raymond M. Planinsic; Joseph E. Losee; Ron Shapiro; Thomas E. Starzl; W. P. Andrew Lee

Objective: To minimize maintenance immunosuppression in upper-extremity transplantation to favor the risk-benefit balance of this procedure. Background: Despite favorable outcomes, broad clinical application of reconstructive transplantation is limited by the risks and side effects of multidrug immunosuppression. We present our experience with upper-extremity transplantation under a novel, donor bone marrow (BM) cell-based treatment protocol (“Pittsburgh protocol”). Methods: Between March 2009 and September 2010, 5 patients received a bilateral hand (n = 2), a bilateral hand/forearm (n = 1), or a unilateral (n = 2) hand transplant. Patients were treated with alemtuzumab and methylprednisolone for induction, followed by tacrolimus monotherapy. On day 14, patients received an infusion of donor BM cells isolated from 9 vertebral bodies. Comprehensive follow-up included functional evaluation, imaging, and immunomonitoring. Results: All patients are maintained on tacrolimus monotherapy with trough levels ranging between 4 and 12 ng/mL. Skin rejections were infrequent and reversible. Patients demonstrated sustained improvements in motor function and sensory return correlating with time after transplantation and level of amputation. Side effects included transient increase in serum creatinine, hyperglycemia managed with oral hypoglycemics, minor wound infection, and hyperuricemia but no infections. Immunomonitoring revealed transient moderate levels of donor-specific antibodies, adequate immunocompetence, and no peripheral blood chimerism. Imaging demonstrated patent vessels with only mild luminal narrowing/occlusion in 1 case. Protocol skin biopsies showed absent or minimal perivascular cellular infiltrates. Conclusions: Our data suggest that this BM cell-based treatment protocol is safe, is well tolerated, and allows upper-extremity transplantation using low-dose tacrolimus monotherapy.


American Journal of Transplantation | 2013

Comprehensive assessment and standardization of solid phase multiplex-bead arrays for the detection of antibodies to HLA.

Elaine F. Reed; Ping Rao; Zilu Zhang; Howard M. Gebel; Robert A. Bray; Indira Guleria; John G. Lunz; Thalachallour Mohanakumar; Peter Nickerson; Anat R. Tambur; Adriana Zeevi; Peter S. Heeger; David W. Gjertson

Solid phase multiplex‐bead arrays for the detection and characterization of HLA antibodies provide increased sensitivity and specificity compared to conventional lymphocyte‐based assays. Assay variability due to inconsistencies in commercial kits and differences in standard operating procedures (SOP) hamper comparison of results between laboratories. The Clinical Trials in Organ Transplantation Antibody Core Laboratories investigated sources of assay variation and determined if reproducibility improved through utilization of SOP, common reagents and normalization algorithms. Ten commercial kits from two manufacturers were assessed in each of seven laboratories using 20 HLA reference sera. Implementation of a standardized (vs. a nonstandardized) operating procedure greatly reduced MFI variation from 62% to 25%. Although laboratory agreements exceeded 90% (R2), small systematic differences were observed suggesting center specific factors still contribute to variation. MFI varied according to manufacturer, kit, bead type and lot. ROC analyses showed excellent consistency in antibody assignments between manufacturers (AUC > 0.9) and suggested optimal cutoffs from 1000 to 1500 MFI. Global normalization further reduced MFI variation to levels near 20%. Standardization and normalization of solid phase HLA antibody tests will enable comparison of data across laboratories for clinical trials and diagnostic testing.


Annals of Surgery | 2012

Long-term survival, nutritional autonomy, and quality of life after intestinal and multivisceral transplantation.

Kareem Abu-Elmagd; Beverly Kosmach-Park; Guilherme Costa; Mazen S. Zenati; L Martin; Darlene A. Koritsky; Maureen Emerling; Noriko Murase; Geoffrey Bond; Kyle Soltys; Hiroshi Sogawa; John G. Lunz; Motaz Al Samman; Nico Shaefer; Rakesh Sindhi; George V. Mazariegos

Objective: To assess long-term survival, graft function, and health-related quality of life (QOL) after visceral transplantation. Background: Despite continual improvement in early survival, the long-term therapeutic efficacy of visceral transplantation has yet to be defined. Methods: A prospective cross-sectional study was performed on 227 visceral allograft recipients who survived beyond the 5-year milestone. Clinical data were used to assess outcome including graft function and long-term survival predictors. The socioeconomic milestones and QOL measures were assessed by clinical evaluation, professional consultation, and validated QOL inventory. Results: Of 376 recipients, 227 survived beyond 5 years, with conditional survival of 75% at 10 years and 61% at 15 years. With a mean follow-up of 10 ± 4 years, 177 (92 adults, 85 children) are alive, with 118 (67%) recipients 18 years or older. Nonfunctional social support and noninclusion of the liver in the visceral allograft are the most significant survival risk factors. Nutritional autonomy was achievable in 160 (90%) survivors, with current serum albumin level of 3.7 ± 0.5 gm/dL and body mass index of 25 ± 6 kg/m2. Despite coexistence or development of neuropsychiatric disorders, most survivors were reintegrated to society with self-sustained socioeconomic status. In parallel, most of the psychological, emotional, and social QOL measures significantly (P < 0.05) improved after transplantation. Current morbidities with potential impact on global health included dysmotility (59%), hypertension (37%), osteoporosis (22%), and diabetes (11%), with significantly (P < 0.05) higher incidence among adult recipients. Conclusions: With new tactics to further improve long-term survival including social support measures, visceral transplantation has achieved excellent nutritional autonomy and good QOL.


American Journal of Transplantation | 2012

Preformed and de novo donor specific antibodies in visceral transplantation: long-term outcome with special reference to the liver.

Kareem Abu-Elmagd; G. Wu; Guilherme Costa; John G. Lunz; L Martin; Darlene A. Koritsky; Noriko Murase; William Irish; A. Zeevi

Despite improvement in early outcome, rejection, particularly chronic allograft enteropathy, continues to be a major barrier to long‐term visceral engraftment. The potential role of donor specific antibodies (DSA) was examined in 194 primary adult recipients. All underwent complement‐dependent lymphocytotoxic crossmatch (CDC‐XM) with pre‐ and posttransplant solid phase HLA–DSA assay in 156 (80%). Grafts were ABO‐identical with random HLA‐match. Liver was included in 71 (37%) allografts. Immunosuppression was tacrolimus‐based with antilymphocyte recipient pretreatment in 150 (77%). CDC‐XM was positive in 55 (28%). HLA–DSA was detectable before transplant in 49 (31%) recipients, with 19 continuing to have circulating antibodies. Another 19 (18%) developed de novo DSA. Ninety percent of patients with preformed DSA harbored HLA Class‐I, whereas 74% of recipients with de novo antibodies had Class‐II. Gender, age, ABO blood‐type, cold ischemia, splenectomy and allograft type were significant DSA predictors. Preformed DSA significantly (p < 0.05) increased risk of acute rejection. Persistent and de novo HLA–DSA significantly (p < 0.001) increased risk of chronic rejection and associated graft loss. Inclusion of the liver was a significant predictor of better outcome (p = 0.004, HR = 0.347), with significant clearance of preformed antibodies (p = 0.04, OR = 56) and lower induction of de novo DSA (p = 0.07, OR = 24). Innovative multifaceted anti‐DSA strategies are required to further improve long‐term survival, particularly of liver‐free allografts.


American Journal of Pathology | 2001

Replicative Senescence of Biliary Epithelial Cells Precedes Bile Duct Loss in Chronic Liver Allograft Rejection: Increased Expression of p21WAF1/Cip1 as a Disease Marker and the Influence of Immunosuppressive Drugs

John G. Lunz; Sarah Contrucci; Kris Ruppert; Noriko Murase; John J. Fung; Thomas E. Starzl; Anthony J. Demetris

Early chronic liver allograft rejection (CR) is characterized by distinctive cytological changes in biliary epithelial cells (BECs) that resemble cellular senescence, in vitro, and precede bile duct loss. If patients suffering from early CR are treated aggressively, the clinical and histopathological manifestations of CR can be completely reversed and bile duct loss can be prevented. We first tested whether the senescence-related p21(WAF1/Cip1) protein is increased in BECs during early CR, and whether treatment reversed the expression. The percentage of p21+ BECs and the number of p21+ BECs per portal tract is significantly increased in early CR (26 +/- 17% and 3.6 +/- 3.1) compared to BECs in normal liver allograft biopsies or those with nonspecific changes (1 +/- 1% and 0.1 +/- 0.3; P < 0.0001 and P < 0.02), chronic hepatitis C (2 +/- 3% and 0.7 +/- 1; P < 0.0001 and P < 0.04) or obstructive cholangiopathy (7 +/- 7% and 0.7 +/- 0.6; P < 0.006 and P = 0.04). Successful treatment of early CR is associated with a decrease in the percentage of p21+ BECs and the number of p21+ BECs per portal tract. In vitro, nuclear p21(WAF1/Cip1) expression is increased in large and multinucleated BECs, and is induced by transforming growth factor (TGF)-beta. TGF-beta1 also increases expression of TGF-beta receptor II, causes phosphorylation of SMAD-2 and nuclear translocation of p21(WAF1/Cip1), which inhibits BEC growth. Because conversion from cyclosporine to tacrolimus is an effective treatment for early CR, we next tested whether these two immunosuppressive drugs directly influenced BEC growth in vitro. The results show that cyclosporine, but not tacrolimus, stimulates BEC TGF-beta1 production, which in turn, causes BEC mito-inhibition and up-regulation of nuclear p21(WAF1/Cip1). In conclusion, expression of the senescence-related p21(WAF1/Cip1) protein is increased in BECs during early CR and decreases with successful recovery. Replicative senescence accounts for the characteristic BEC cytological alterations used for the diagnosis of early CR and lack of a proliferative response to injury. The ability of cyclosporine to inhibit the growth of damaged BECs likely accounts for the relative duct sparing properties of tacrolimus.


Transplant International | 2009

Monitoring of human liver and kidney allograft tolerance: a tissue/histopathology perspective

Anthony J. Demetris; John G. Lunz; Parmjeet Randhawa; Tong Wu; Michael A. Nalesnik; Angus W. Thomson

Several factors acting together have recently enabled clinicians to seriously consider whether chronic immunosuppression is needed in all solid organ allograft recipients. This has prompted a dozen or so centers throughout the world to prospectively wean immunosuppression from conventionally treated liver allograft recipients. The goal is to lessen the impact of chronic immunosuppression and empirically identify occasional recipients who show operational tolerance, defined as gross phenotype of tolerance in the presence of an immune response and/or immune deficit that has little or no significant clinical impact. Rare operationally tolerant kidney allograft recipients have also been identified, usually by single case reports, but only a couple of prospective weaning trials in conventionally treated kidney allograft recipients have been attempted and reported. Pre‐ and postweaning allograft biopsy monitoring of recipients adds a critical dimension to these trials, not only for patient safety but also for determining whether events in the allografts can contribute to a mechanistic understanding of allograft acceptance. The following is based on a literature review and personal experience regarding the practical and scientific aspects of biopsy monitoring of potential or actual operationally tolerant human liver and kidney allograft recipients where the goal, intended or attained, was complete withdrawal of immunosuppression.


Human Immunology | 2009

Emerging role of donor-specific anti-human leukocyte antigen antibody determination for clinical management after solid organ transplantation

Adriana Zeevi; John G. Lunz; Ron Shapiro; Parmjeet Randhawa; George V. Mazariegos; Steven A. Webber; Alin Girnita

Preformed and de novo donor-specific HLA antibodies (DSA) have been associated with allograft dysfunction and failure. The application of solid-phase methods has increased the sensitivity and specificity of antibody detection; however, the clinical significance of these DSA is under evaluation. In the present study, we summarize six cases (four renal transplant recipients, one multivisceral recipient, and one heart-and-lung transplant recipient) to illustrate the role of the histocompatibility laboratory in providing the most comprehensive workup to assess the risk of graft dysfunction associated with antibody-mediated rejection (AMR). These cases illustrate the potential risk assessment for AMR in various situations: (1) in patients exhibiting low levels of DSA pretransplantation; (2) protocol immunosuppression minimization during stepwise weaning; and (3) desensitization protocols. Furthermore, increased sensitivity of DSA determination is indicated for the interpretation of focal C4d and its clinical significance. The clinical relevance of monitoring for circulating DSA with solid-phase single-antigen assays is also discussed. These cases exemplify the rationale for all patients to be monitored for DSA post-transplantation, with the frequency adjusted based on the individual risk for AMR.


Hepatology | 2005

An inhibitor of cyclin‐dependent kinase, stress‐induced p21Waf‐1/Cip‐1, mediates hepatocyte mito‐inhibition during the evolution of cirrhosis

John G. Lunz; Hirokazu Tsuji; Isao Nozaki; Noriko Murase; Anthony J. Demetris

During the evolution of cirrhosis, there is a relative decrease in volume percentage of hepatocytes and a relative increase in biliary epithelial cells and myofibroblasts. This is recognized histopathologically as a ductular reaction and leads to gradual distortion of the normal hepatic architecture. The final or decompensated stage of cirrhosis is characterized by a further decline in hepatocyte proliferation and loss of functional liver mass that manifests clinically as ascites, encephalopathy, and other signs of liver failure. In this report, we tested the hypothesis that p21‐mediated hepatocyte mito‐inhibition accelerates the evolution of cirrhosis using an established mouse model of decompensated biliary cirrhosis, p21‐deficient mice, and liver tissue from humans awaiting liver replacement. Despite the same insult of long‐term (12‐week) bile duct ligation, mice prone to decompensation showed significantly more oxidative stress and hepatocyte nuclear p21 expression, which resulted in less hepatocyte proliferation, an exaggerated ductular reaction, and more advanced disease compared with compensation‐prone controls. Mice deficient in p21 were better able than wild‐type controls to compensate for long‐term bile duct ligation because of significantly greater hepatocyte proliferation, which led to a larger liver mass and less architectural distortion. Mito‐inhibitory hepatocyte nuclear p21 expression in humans awaiting liver replacement directly correlated with pathological disease stage and Model for End‐Stage Liver Disease (MELD) score. In conclusion, stress‐induced upregulation of hepatocyte p21 inhibits hepatocyte proliferation during the evolution of cirrhosis. These findings have implications for understanding the evolution of cirrhosis and associated carcinogenesis. Supplementary material for this article can be found on the HEPATOLOGY website (http://interscience.wiley.com/jpages/0270‐9139/suppmat/index.html). (HEPATOLOGY 2005.)


Hepatology | 2007

Gut‐derived commensal bacterial products inhibit liver dendritic cell maturation by stimulating hepatic interleukin‐6/signal transducer and activator of transcription 3 activity

John G. Lunz; Susan Specht; Noriko Murase; Kumiko Isse; Anthony J. Demetris

Intraorgan dendritic cells (DCs) monitor the environment and help translate triggers of innate immunity into adaptive immune responses. Liver‐based DCs are continually exposed, via gut‐derived portal venous blood, to potential antigens and bacterial products that can trigger innate immunity. However, somehow the liver avoids a state of perpetual inflammation and protects central immune organs from overstimulation. In this study, we tested the hypothesis that hepatic interleukin‐6 (IL‐6)/signal transducer and activator of transcription 3 (STAT3) activity increases the activation/maturation threshold of hepatic DCs toward innate immune signals. The results show that the liver nuclear STAT3 activity is significantly higher than that of other organs and is IL‐6–dependent. Hepatic DCs in normal IL‐6 wild‐type (IL‐6+/+) mice are phenotypically and functionally less mature than DCs from IL‐6–deficient (IL‐6−/−) or STAT3‐inhibited IL‐6+/+ mice, as determined by surface marker expression, proinflammatory cytokine secretion, and allogeneic T‐cell stimulation. IL‐6+/+ liver DCs produce IL‐6 in response to exposure to lipopolysaccharide (LPS) and cytidine phosphate guanosine oligonucleotides (CpG) but are resistant to maturation compared with IL‐6−/− liver DCs. Conversely, exogenous IL‐6 inhibits LPS‐induced IL‐6−/− liver DC maturation. IL‐6/STAT3 signaling influences the liver DC expression of toll‐like receptor 9 and IL‐1 receptor associated kinase‐M. The depletion of gut commensal bacteria in IL‐6+/+ mice with oral antibiotics decreased portal blood endotoxin levels, lowered the expression of IL‐6 and phospho‐STAT3, and significantly increased liver DC maturation. Conclusion: Gut‐derived bacterial products, by stimulating hepatic IL‐6/STAT3 signaling, inhibit hepatic DC activation/maturation and thereby elevate the threshold needed for translating triggers of innate immunity into adaptive immune responses. 
Manipulating gut bacteria may therefore be an effective strategy for altering intrahepatic immune responses. (HEPATOLOGY 2007.)

Collaboration


Dive into John G. Lunz's collaborations.

Top Co-Authors

Adriana Zeevi, University of Pittsburgh
Susan Specht, University of Pittsburgh
Noriko Murase, University of Pittsburgh
A. Zeevi, University of Pittsburgh
Brian Feingold, University of Pittsburgh
Ron Shapiro, University of Pittsburgh