Richard J. Knight
Cornell University
Publications
Featured research published by Richard J. Knight.
Transplantation | 2005
Barry D. Kahan; Yarkin K. Yakupoglu; L. Schoenberg; Richard J. Knight; Stephen M. Katz; Deijan Lai; Charles T. Van Buren
Background. Malignancies, a well-known complication of immunosuppressive therapy in renal transplant recipients, represent an important cause of long-term morbidity and mortality. One approach to addressing this problem is identifying agents that display antineoplastic properties concomitant with their immunosuppressive effects. Methods. We examined the neoplasms among 1008 renal transplant recipients treated at a single center with sirolimus-cyclosporine ± prednisone. Results. Clinical and laboratory data, including 62.3±26.1 months of follow-up (range 27.1–131), revealed 36 tumors in 35 patients (3.6%) presenting at 32.5±29.8 months. The 2.4% incidence of skin tumors, the most common neoplasms, was 1.58-fold greater than that in the general U.S. population. In addition to a 0.4% incidence of posttransplant lymphoproliferative disorders (PTLD) and a 0.2% incidence of renal cell carcinomas, we observed single cases of breast, bladder, endometrial, lung, and brain neoplasms as well as leukemia. The mean trough drug concentrations at the time of diagnosis in affected recipients were within our putative target ranges. In addition to eleven graft losses due to death with a functioning kidney, two were related to chronic rejection following reduced immunosuppression and one to therapeutic nephrectomy for PTLD. Five of twelve deaths were caused by malignancies; four others among the 1008 patients over the entire follow-up were attributed to cardiovascular events; one to respiratory failure; and two, at distant locations, to unknown causes. Conclusions. The sirolimus-cyclosporine ± prednisone combination appears to be associated with a reduced incidence of tumors.
Clinical Transplantation | 2007
Richard J. Knight; Martin Villa; Robert Laskey; Carlos Benavides; L. Schoenberg; Maria Welsh; Ronald H. Kerman; Hemangshu Podder; Charles T. Van Buren; Stephen M. Katz; Barry D. Kahan
Aim: As sirolimus has been implicated in impaired wound healing, this study evaluated risk factors for wound complications after renal transplantation in patients treated with the drug de novo.
Transplantation | 2007
Carlos Benavides; Vida B. Pollard; Shamila Mauiyyedi; Hemangshu Podder; Richard J. Knight; Barry D. Kahan
Background. Because the course of polyoma virus–associated nephropathy (PVAN) has not been evaluated in a large cohort of patients receiving sirolimus (SRL)-based regimens, we herein present the incidence, clinical characteristics, and outcomes of 378 renal transplant recipients treated with SRL-based immunosuppression. Methods. This retrospective single-center study evaluated 344 kidney-alone (KTX) and 34 simultaneous pancreas-kidney (SPK) transplantations performed between June 2000 and December 2004. Results. At a mean follow-up of 43.3 months, six kidney-alone (1.7%) and three kidney-pancreas (9.0%) recipients displayed biopsy-proven PVAN. The mean time to diagnosis after transplantation was 18.2 months (range: 3.5–31.1 months), with a higher incidence among patients exposed (4.23%) versus not exposed to rabbit antithymocyte globulin (rATG; 0.53%; P=0.019) and among SPK (9.0%) versus KTX (1.7%) recipients (odds ratio: 5.43; confidence interval: 1.29–22.8; P=0.038). Despite treatment with cidofovir, reduced immunosuppression, and maintenance therapy with no agents other than SRL (C0=10.2±2.7 ng/mL) plus modest doses of prednisone (≤5 mg), five patients (55.5%) experienced renal allograft failure. No rejection episodes were documented during PVAN treatment, and pancreatic function remained excellent among the SPK patients. Conclusions. Patients treated with SRL-based immunosuppression showed an incidence at the lower end of the range described with various other contemporaneous immunosuppressive regimens and with other cohorts not undergoing BK virus polymerase chain reaction surveillance. Exposure to rATG and SPK transplantation represented risk factors for the occurrence of PVAN, which followed a pernicious course despite withdrawal of calcineurin antagonists and/or mycophenolate mofetil.
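The reported SPK-versus-KTX odds ratio can be checked against the counts implied by the abstract. Below is a minimal sketch, assuming PVAN occurred in 3 of 34 SPK and 6 of 344 KTX recipients; under that assumption the Wald interval closely reproduces the published 5.43 (1.29–22.8).

```python
import math

# Counts implied by the abstract (assumption): PVAN in 3 of 34 SPK
# recipients and 6 of 344 kidney-alone (KTX) recipients.
a, b = 3, 34 - 3      # SPK: PVAN, no PVAN
c, d = 6, 344 - 6     # KTX: PVAN, no PVAN

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of ln(OR), Wald method
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
# -> OR = 5.45, 95% CI 1.30-22.86 (abstract reports 5.43, 1.29-22.8)
```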
Transplantation | 1997
Richard J. Knight; Steven Dikman; Hui Liu; Giorgio P. Martinelli
BACKGROUND The pathogenesis of chronic rejection likely involves an interplay between immunogenic and nonimmunogenic factors. The objective of this study was to determine the influence of cold ischemic preservation injury on the rate of progression to chronic rejection in the Lewis to F344 cardiac allograft model. METHODS To induce an ischemic injury, donor hearts were stored for 3 hr at 4°C in University of Wisconsin solution before transplantation. Allografts were excised at 1, 7, and 90 days after transplantation or at rejection. Vasculopathy was graded for degree of intimal thickening based on the involvement of the vascular perimeter and luminal compromise. RESULTS The degree of vessel injury in ischemically injured allografts at 90 days was significantly greater than in nonischemic allografts (2.8±0.4 vs. 1.6±0.5, P<0.05). Ischemic injury in syngeneic grafts did not induce a vasculopathy. Immunoperoxidase staining with R73 (anti-T cell) and ED1 (anti-macrophage) monoclonal antibodies revealed that, in ischemically injured allografts at 90 days after transplantation, the infiltrate was composed predominantly of T cells and macrophages. Additionally, ischemically injured allografts excised at 7 days after transplantation showed cellular infiltrates composed of R73-positive T cells and rare interleukin-2 receptor-positive cells, which were not observed in nonischemic allografts or ischemic syngeneic grafts. CONCLUSIONS The progression to chronic vasculopathy in this model is principally an immunologic process, which is accelerated by an ischemic insult to the allograft. The vascular injury is mediated in part by T cells and macrophages.
Transplantation | 2001
Efsevia Albanis; Qingsheng Jiao; Huong Tran; Carol Bodian; Richard J. Knight; Edgar L. Milford; Thomas D. Schiano; Yaron Tomer; Barbara Murphy
BACKGROUND Cytotoxic T-lymphocyte antigen 4 (CTLA4) has been shown to play a critical role in the down-regulation of the immune response. We retrospectively examined the association between acute rejection and two polymorphisms in the CTLA4 gene, the dinucleotide (AT)n repeat polymorphism in exon 3 and the single nucleotide polymorphism A/G at position 49 in exon 1, in a cohort of liver and kidney transplant recipients. METHODS AND RESULTS A total of 207 liver and 167 renal transplant recipients were analyzed. In the case of the (AT)n repeat polymorphism, we found an increased incidence of acute rejection in association with alleles 3 and 4 in both liver and kidney recipients (P=0.002 and 0.05, respectively). In addition, in liver transplant recipients, allele 7 was associated with acute rejection independent of ethnicity (P<0.05). Allele 1 was less frequently observed in African American as compared with Caucasian liver and kidney transplant recipients, with frequencies of 33.8% and 69%, respectively (P<0.0001). Patients with allele 1 had a tendency toward a lower rate of rejection, 42% versus 57.8% (P=0.058), suggesting a potential protective effect of allele 1. Analysis of the A/G single nucleotide polymorphism demonstrated no association between either allele and the incidence of acute rejection in the patients studied. CONCLUSION These initial observations provide the necessary basis to further investigate the risk stratification of transplant recipients based on specific CTLA4 gene polymorphisms.
Transplantation | 2004
Richard J. Knight; Ronald H. Kerman; L. Schoenberg; Hemangshu Podder; Charles T. Van Buren; Stephen M. Katz; Barry D. Kahan
Background. We previously reported that the use of basiliximab together with sirolimus permits a window of recovery from delayed graft function before the introduction of reduced-dose cyclosporine. The present study reviews our experience with the substitution of thymoglobulin for basiliximab as induction therapy for recipients at increased risk for early acute rejection episodes. Methods. We retrospectively reviewed 145 cadaveric renal allograft recipients who received either basiliximab (n=115) or thymoglobulin (n=30) in combination with sirolimus and prednisone, followed by delayed introduction of reduced doses of cyclosporine. Recipients were stratified as high immune responders if they were African American, a retransplant recipient, or a recipient with a panel-reactive antibody greater than 50%. All other recipients were considered low immune responders. Results. Basiliximab-treated high immune responders exhibited a higher incidence of acute rejection episodes (26%) than either basiliximab-treated low immune responders (10%, P=0.04) or thymoglobulin-treated high immune responders (3%, P=0.01). The median time to initiation of cyclosporine was 12 days; cyclosporine was initiated when the serum creatinine level was 2.5 mg/dL or less. Patients with early return of renal function displayed a lower incidence of acute rejection episodes than those with later recovery of function (P=0.003). High immune responders treated with basiliximab expressed a higher mean serum creatinine level at 3 months (P<0.01), 6 months (P=0.02) and 12 months (P=0.01) than either low immune responders treated with basiliximab or high immune responders treated with thymoglobulin. Conclusion. A strategy combining sirolimus with basiliximab for low-immunologic risk recipients and thymoglobulin for high-risk recipients leads to prompt recovery of renal function with a low risk of acute rejection episodes.
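The stratification rule above is simple enough to express directly. The following is a hypothetical sketch of that classification; the function name and signature are assumptions for illustration, not the study's code.

```python
# Hypothetical encoding of the study's stratification rule: a recipient is a
# high immune responder if African American, a retransplant recipient, or
# panel-reactive antibody (PRA) > 50%.
def immune_risk(african_american: bool, retransplant: bool, pra_percent: float) -> str:
    if african_american or retransplant or pra_percent > 50:
        return "high"  # per the study, induction with thymoglobulin
    return "low"       # per the study, induction with basiliximab

assert immune_risk(False, True, 10) == "high"
assert immune_risk(False, False, 30) == "low"
```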
Liver Transplantation | 2010
Thomas A. Aloia; Richard J. Knight; A. Osama Gaber; R. Mark Ghobrial; John A. Goss
Older recipient age is associated with worse posttransplant survival. Although the median age of liver disease patients undergoing orthotopic liver transplantation (OLT) continues to rise, prognostic factors for posttransplant survival specific to older patients have not been defined. To address this issue, the United Network for Organ Sharing/Organ Procurement and Transplantation Network outcome database was searched to identify prognostic factors for the 8070 liver recipients 60 years old or older who underwent transplantation from 1994 to 2005. Prognostic factors were assessed with univariate analysis and multivariate modeling. The 5 strongest prognostic variables (ventilator status, diabetes mellitus, hepatitis C virus, creatinine levels ≥1.6 mg/dL, and combined recipient and donor age ≥120 years) were aggregated to define a novel older recipient prognostic score (ORPS). The overall 1- and 5-year posttransplant survival rates were 83% and 67%, respectively. The risk model, created by the assignment of 1 point to each ORPS factor, stratified patient outcomes into distinct prognostic groups at the 1-, 3-, and 5-year posttransplant time points (P < 0.001). The 5-year survival rates for patients with ORPS values of 0, 1, and 2 points were 75%, 69%, and 58%, respectively. Patients who underwent transplantation with an ORPS > 2 points consistently experienced 5-year survival rates of less than 50%. In conclusion, in liver transplant recipients 60 years old or older, the ORPS was able to predict significant and clinically relevant differences in posttransplant survival. By optimizing donor selection for recipients over the age of 60 years, clinical use of the ORPS model may enhance organ utilization for all patients awaiting OLT.
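Because the ORPS assigns one point per factor, it reduces to a few comparisons. The sketch below assumes hypothetical field names; the thresholds are those stated in the abstract.

```python
# Minimal ORPS sketch: one point per prognostic factor. The abstract reports
# 5-year survival of 75%/69%/58% for ORPS 0/1/2 and <50% for ORPS > 2.
def orps(ventilated: bool, diabetes: bool, hcv: bool,
         creatinine_mg_dl: float, recipient_age: int, donor_age: int) -> int:
    return sum([
        ventilated,
        diabetes,
        hcv,
        creatinine_mg_dl >= 1.6,
        recipient_age + donor_age >= 120,  # combined-age factor
    ])

print(orps(ventilated=False, diabetes=True, hcv=False,
           creatinine_mg_dl=1.8, recipient_age=62, donor_age=55))  # -> 2
```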
Transplantation | 2001
Richard J. Knight; Lewis Burrows; Carol Bodian
BACKGROUND We investigated whether recipients of living donor grafts who suffer an acute rejection progress to graft loss because of chronic rejection at a slower rate than recipients of cadaveric grafts. METHODS A retrospective review was made of 296 renal transplantations performed at Mount Sinai Hospital. Only grafts functioning for at least 3 months were included in this analysis. Demographic variables of donor and recipient age, race, sex, and serum creatinine at 3 months after transplantation were compared between groups. RESULTS Among the acute rejection-free cohort, the estimated 5-year graft survival was 90% for those receiving transplants from living relatives and 88% for those receiving cadaveric transplants (P=0.76). However, in grafts with early acute rejection, the 5-year survival was 40% for cadaveric recipients compared with 73% for living related graft recipients (P<0.014). Using the proportional hazards model, cadaveric donor source, older donor age, African American recipient race, and elevated 3-month serum creatinine were independent predictors of long-term graft loss caused by chronic rejection. The severity of acute rejection and recipient age had no impact on the risk of graft loss because of chronic rejection. CONCLUSION These data indicate that the benefit of living related transplantation results from the fact that a living related graft progresses from acute to chronic rejection at a slower rate than a cadaveric graft. Furthermore, a cadaveric graft that is free of acute rejection 3 months after transplantation has an equal likelihood of functioning at 5 years as that of a graft from a living related donor.
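The multivariate step described above is a standard Cox proportional hazards analysis. A minimal sketch using the lifelines library follows; the DataFrame and its column names are invented stand-ins, not the study's data.

```python
import pandas as pd
from lifelines import CoxPHFitter  # pip install lifelines

# Toy stand-in for the study's predictors: donor source, donor age,
# recipient race, and 3-month serum creatinine.
df = pd.DataFrame({
    "months_followed":  [60, 24, 48, 12, 60, 36, 54, 18],
    "graft_lost":       [0, 1, 0, 1, 0, 1, 0, 1],   # loss from chronic rejection
    "cadaveric_donor":  [0, 1, 0, 1, 1, 0, 0, 1],
    "donor_age":        [35, 58, 42, 61, 50, 29, 38, 63],
    "african_american": [0, 0, 1, 1, 0, 1, 0, 1],
    "creat_3mo_mg_dl":  [1.2, 2.1, 1.4, 2.6, 1.5, 1.9, 1.3, 2.4],
})

# A small penalizer keeps the fit stable on tiny illustrative data.
cph = CoxPHFitter(penalizer=0.1)
cph.fit(df, duration_col="months_followed", event_col="graft_lost")
cph.print_summary()  # hazard ratio per predictor
```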
Transplantation | 1992
Barry D. Kahan; Maria Welsh; L. Rutzky; R. M. Lewis; Richard J. Knight; Stephen M. Katz; Kimberly L. Napoli; Joachim Grevel; Charles T. Van Buren
Pretransplant test-dose pharmacokinetic profiles were used to determine individual cyclosporine (CsA) bioavailability and clearance rates in renal transplant patients. Assuming a linear relation between dose and area under the concentration curve (AUC), starting i.v. and p.o. CsA doses were computed from the test-dose results. Target values were 400 ng/ml steady-state concentration (Css) during continuous intravenous infusion and 500 ng/ml average drug concentration (Cavss = AUC/dosing interval) after oral administration, based upon measurements with the specific monoclonal antibody 3H-tracer radioimmunoassay. The outcomes after dose individualization with a 1-hr (n=32), 2-hr (n=38), or 3-hr (n=41) i.v. infusion test dose and a p.o. test dose (n=111) were compared with those of 228 historical control patients who received a uniform protocol of CsA i.v. at 2.5 mg/kg/day and p.o. at 14 mg/kg/day. The observed Css after i.v. CsA was within 10% of the target concentration in 73% of recipients tested with the 3-hr protocol, a significantly greater fraction than achieved with either the uniform dose (14%) or the 1-hr (34%) and 2-hr (25%) protocols. Patients in the 3-hr protocol group showed reduced incidences of delayed graft function, early graft loss, and rejection episodes, and a lower mean serum creatinine value, particularly at 7 but also at 30 days posttransplantation. Administration of the predicted oral dose produced a peak concentration of ≥700 ng/ml in 60% of recipients at 3 days, 90% at 5 days, and 98% at 7 days. The test-dose method less effectively predicted the appropriate oral CsA dose to produce the target Cavss and failed to reduce the 90-day rejection incidence. Despite its limitations with the more-complicated p.o. route, the test-dose method successfully predicts i.v. CsA doses, thereby reducing the incidence of early adverse events.
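Under the stated assumption that AUC scales linearly with dose, the predicted dose is a proportional rescaling of the test dose. The sketch below uses invented numbers to illustrate the principle; it is not the study's dosing algorithm.

```python
# Linear dose-AUC scaling: if average concentration is proportional to dose,
# the dose needed to hit a target Cavss is a simple rescale of the test dose.
def predicted_dose(test_dose_mg: float, observed_cavss_ng_ml: float,
                   target_cavss_ng_ml: float) -> float:
    return test_dose_mg * (target_cavss_ng_ml / observed_cavss_ng_ml)

# Invented example: a 200 mg oral test dose yielding Cavss = 320 ng/ml
# suggests ~312 mg to reach the study's 500 ng/ml oral target.
print(round(predicted_dose(200, 320, 500)))  # -> 312
```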
Transplantation | 2014
J. DeVos; A. O. Gaber; Larry D. Teeter; Edward A. Graviss; Samir J. Patel; Geoffrey A. Land; Linda W. Moore; Richard J. Knight
Background Renal transplant recipients with de novo donor-specific antibodies (dDSA) experience higher rates of rejection and worse graft survival than dDSA-free recipients. This study presents a single-center review of dDSA monitoring in a large, multi-ethnic cohort of renal transplant recipients. Methods The authors performed a nested case-control study of adult kidney and kidney-pancreas recipients from July 2007 through July 2011. Cases were defined as dDSA-positive recipients, whereas controls were all DSA-negative transplant recipients. DSA were determined at 1, 3, 6, 9, and 12 months posttransplant, and every 6 months thereafter. Results Of 503 recipients in the analysis, 24% developed a dDSA, of whom 73% had dDSA against a DQ antigen. Median time to dDSA was 6.1 months (range 0.2–44.6 months). On multivariate analysis, African American race, kidney-pancreas transplantation, and an increasing number of human leukocyte antigen mismatches were independent risk factors for dDSA. Recipients with dDSA were more likely than dDSA-negative recipients to suffer an acute rejection (AR) (35% vs. 10%, P<0.001), an antibody-mediated AR (16% vs. 0.3%, P<0.001), an AR ascribed to noncompliance (8% vs. 2%, P=0.001), and a recurrent AR (6% vs. 1%, P=0.002). At a median follow-up of 31 months, death-censored actuarial graft survival of dDSA recipients was worse than that of the DSA-free cohort (P=0.002). Yet, among AR-free recipients, there was no difference in graft survival between cohorts (P=0.66). Conclusions Development of dDSA was associated with an increased incidence of graft loss, yet the detrimental effect of dDSA was limited in the intermediate term to recipients with AR.