R. Michael Hofmann
University of Wisconsin-Madison
Publications
Featured research published by R. Michael Hofmann.
Clinical Journal of The American Society of Nephrology | 2006
Arjang Djamali; Millie Samaniego; Brenda Muth; Rebecca J. Muehrer; R. Michael Hofmann; John D. Pirsch; A. Howard; Georges Mourad; Bryan N. Becker
Kidney transplantation is the treatment of choice for patients with ESRD. Despite improvements in short-term patient and graft outcomes, there has been no major improvement in long-term outcomes. The use of kidney allografts from expanded-criteria donors, polyoma virus nephropathy, underimmunosuppression, and incomplete functional recovery after rejection episodes may play a role in the lack of improvement in long-term outcomes. Other factors, including cardiovascular disease, infections, and malignancies, also shorten patient survival and therefore reduce the functional life of an allograft. There is a need for interventions that improve long-term outcomes in kidney transplant recipients. These patients are a unique subset of patients with chronic kidney disease. Therefore, interventions need to address disease progression, comorbid conditions, and patient mortality through a multifaceted approach. The Kidney Disease Outcomes Quality Initiative from the National Kidney Foundation, the European Best Practice Guidelines, and the forthcoming Kidney Disease: Improving Global Outcomes clinical practice guidelines can serve as a cornerstone of this approach. The unique aspects of chronic kidney disease in the transplant recipient require the integration of specific transplant-oriented problems into this care schema and a concrete partnership among transplant centers, community nephrologists, and primary care physicians. This article reviews the contemporary aspects of care for these patients.
Transplantation | 2010
Neeraj Singh; Arjang Djamali; David Lorentzen; John D. Pirsch; Glen Leverson; Nikole Neidlinger; Barbara Voss; Jose Torrealba; R. Michael Hofmann; Jon S. Odorico; Luis A. Fernandez; Hans W. Sollinger; Milagros Samaniego
Background. The clinical significance of pretransplant donor-specific antibodies (pre-Tx DSAs) detected by single-antigen bead flow cytometry (SAB-FC) remains unclear. Methods. To investigate the impact of pre-Tx DSAs detected by SAB-FC on early clinical outcomes, we tested pre-Tx sera from all consecutive deceased-donor kidney transplants performed between January 2005 and July 2006 (n=237). Results. In the study population, 66% of which had high immunologic risk, mean fluorescence intensity (MFI) ≥100 for class I and ≥200 for class II were the lowest DSA thresholds associated with inferior antibody-mediated rejection-free graft survival (75% vs. 90%, P=0.004 and 76% vs. 87%, P=0.017, respectively). The hazard ratio for antibody-mediated rejection increased linearly with higher class II DSA from MFI 100 to 800 (1.7 [0.8–3.2], P=0.1 for MFI ≥100 vs. 4.7 [2.4–8.8], P<0.001 for MFI ≥800). Differences in graft function were evident only in patients with class II MFI ≥500 (estimated glomerular filtration rate: 47.6 vs. 54.3, P=0.02; proteinuria: 0.6±0.6 vs. 0.4±0.3, P=0.03). A difference in death-censored graft survival was detected in patients with class II MFI ≥1000 (75% vs. 91.9%, P=0.055). Conclusion. High pre-Tx DSAs detected by SAB-FC are associated with incrementally poorer graft outcomes in deceased-donor kidney transplants with high immunologic risk.
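The MFI cutoffs reported above can be summarized as a simple screening rule. The sketch below is illustrative only, not software from the study; the function name and return format are hypothetical, and the thresholds (class I ≥100, class II ≥200) are the lowest cutoffs the abstract associates with inferior AMR-free graft survival.

```python
# Hypothetical illustration of the study's lowest risk-associated MFI cutoffs.
# Not the authors' code; names and structure are invented for this sketch.

CLASS_I_THRESHOLD = 100   # lowest class I MFI linked to worse AMR-free survival
CLASS_II_THRESHOLD = 200  # lowest class II MFI linked to worse AMR-free survival

def flag_dsa(class_i_mfi: float, class_ii_mfi: float) -> list:
    """Return which DSA classes exceed the study's lowest risk thresholds."""
    flags = []
    if class_i_mfi >= CLASS_I_THRESHOLD:
        flags.append("class I")
    if class_ii_mfi >= CLASS_II_THRESHOLD:
        flags.append("class II")
    return flags
```

For example, a patient with class I MFI 150 and class II MFI 50 would be flagged for class I only.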
Seminars in Dialysis | 2007
Alexander S. Yevzlin; Robert J. Sanchez; Jeanne G. Hiatt; Marilyn H. Washington; Maureen Wakeen; R. Michael Hofmann; Yolanda T. Becker
Vascular access complications, including thrombosis, are associated with significant patient morbidity and mortality. Currently, up to 60% of new patients and 30% of prevalent patients use a catheter for dialysis. To prevent interdialytic catheter thrombosis, these devices are routinely locked with concentrated heparin solutions. Several recent studies have shown that abnormal coagulation markers (aPTT) may arise from this practice. This abnormal elevation in aPTT may be explained by significant early and late leakage from the catheter after a catheter lock is performed. To date, no study has evaluated the impact of this practice, or of the resulting elevation in aPTT, on bleeding complication rates. We conducted a retrospective analysis comparing bleeding rates in subjects who received a concentrated heparin catheter lock (5000 u/ml) [group 1, n = 52] with those who received a citrate or dilute heparin catheter lock (1000 u/ml) [group 2, n = 91] immediately after tunneled hemodialysis catheter insertion. Baseline characteristics did not differ between the groups except for the preprocedure INR, which was higher in the postpolicy group (group 2) than in the prepolicy group (group 1) (1.29 vs. 1.21, p = 0.04). Logistic regression analyses revealed that the likelihood of a composite bleeding event in group 1 was 11.9 times that in group 2 (p = 0.04). Concentrated heparin (5000 u/ml) is associated with more major bleeding complications after tunneled catheter placement than low-dose heparin (1000 u/ml) or citrate catheter lock solution (p = 0.02). Given these findings, a randomized controlled trial comparing the safety and efficacy of common anticoagulation lock solutions is warranted.
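The reported 11.9-fold likelihood comes from logistic regression, but the underlying odds-ratio arithmetic can be sketched directly from a 2×2 table. The counts in the example below are invented for illustration and are not the study's data; only the group sizes (n = 52 and n = 91) come from the abstract.

```python
# Sketch of the odds-ratio arithmetic behind a comparison like the study's:
# bleeding events under concentrated heparin lock (group A) vs. citrate or
# dilute heparin lock (group B). Event counts here are hypothetical.

def odds_ratio(events_a: int, n_a: int, events_b: int, n_b: int) -> float:
    """Odds of an event in group A divided by the odds in group B."""
    odds_a = events_a / (n_a - events_a)
    odds_b = events_b / (n_b - events_b)
    return odds_a / odds_b

# Hypothetical counts: 10 bleeds among 52 in group A vs. 2 among 91 in
# group B give an odds ratio of about 10.6.
```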
Renal Failure | 2002
R. Michael Hofmann; Christine Maloney, R.N.; David M. Ward; Bryan N. Becker
Background: Continuous renal replacement therapy (CRRT) is increasingly used in managing acute renal failure (ARF) as it offers hemodynamic stability and significant solute clearance in this setting. However, it also requires anticoagulation. Traditionally, heparin has been the anticoagulant of choice but this increases hemorrhagic risk in already high-risk ARF patients. Regional citrate anticoagulation offsets this risk. However, it can be difficult to manipulate regional anticoagulation in CRRT. Moreover, citrate CRRT has been plagued by short optimal filter patency times. Methods: We designed a novel citrate-based anticoagulation schema for continuous venovenous hemofiltration (CVVHF). We implemented this schema prospectively in caring for 24 individuals admitted to the intensive care unit with ARF requiring CRRT. Each individual had a contraindication to systemic anticoagulation. We evaluated filter patency using Kaplan-Meier methodology, comparing the effect of this citrate-CVVHF system to historical, saline-flush control CVVHF systems. Results: 58 filters ran for a total of 2637.5 h. Average filter patency time was 45.4 ± 25.5 h. At 48 h, 70% of the CVVHF-citrate system filters remained patent compared to only 16% of historical control saline-flush systems (p = 0.0001). The average filtered urea nitrogen/blood urea nitrogen ratio was 0.84 ± 0.06 with an average urea clearance of 28.5 ± 4.1 mL/min for CVVHF-citrate-treated individuals. Only three patients experienced transient complications related to CVVHF-citrate with resolution of these complications within 24 h. Ultimately, 58.3% of the CVVHF-citrate-treated patients survived to ICU discharge. Conclusions: This novel CVVHF-citrate system achieved excellent clearance and dramatically improved filter patency compared to saline-flush systems. Moreover, it did so with minimal toxicity.
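Filter patency above was compared using Kaplan-Meier methodology; a minimal product-limit estimator can be sketched as follows. This is not the study's code, and the filter lifetimes and censoring flags in the example are invented.

```python
# Minimal sketch of the Kaplan-Meier product-limit estimator used to compare
# filter patency. Times and censoring flags in the usage example are invented.

def kaplan_meier(times, events):
    """Return (time, survival) pairs at each failure time.

    events[i] is True if the filter failed (e.g. clotted) and False if it
    was censored (e.g. electively discontinued). Failures are processed
    before censorings at tied times, per the usual convention.
    """
    at_risk = len(times)
    surv = 1.0
    curve = []
    for t, failed in sorted(zip(times, events), key=lambda te: (te[0], not te[1])):
        if failed:
            surv *= (at_risk - 1) / at_risk
            curve.append((t, surv))
        at_risk -= 1
    return curve
```

For example, hypothetical filter lifetimes of 10, 20, 20, and 30 h, with the third filter censored at 20 h, give estimated survival of 0.75 after 10 h, 0.5 after 20 h, and 0 after 30 h.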
Clinical Journal of The American Society of Nephrology | 2014
Belinda T. Lee; Steven Gabardi; Monica Grafals; R. Michael Hofmann; Enver Akalin; Aws Aljanabi; Didier A. Mandelbrot; Deborah B. Adey; Eliot Heher; Pang Yen Fan; Sarah Conte; Christine Dyer-Ward; Anil Chandraker
BACKGROUND AND OBJECTIVES BK virus reactivation in kidney transplant recipients can lead to progressive allograft injury. Reduction of immunosuppression remains the cornerstone of treatment for active BK infection. Fluoroquinolone antibiotics are known to have in vitro antiviral properties, but the evidence for their use in patients with BK viremia is inconclusive. The objective of the study was to determine the efficacy of levofloxacin in the treatment of BK viremia. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS Enrollment in this prospective, multicenter, double-blinded, placebo-controlled trial occurred from July 2009 to March 2012. Thirty-nine kidney transplant recipients with BK viremia were randomly assigned to receive levofloxacin, 500 mg daily, or placebo for 30 days. Immunosuppression in all patients was adjusted on the basis of standard clinical practices at each institution. Plasma BK viral load and serum creatinine were measured monthly for 3 months and at 6 months. RESULTS At the 3-month follow-up, the percentage reductions in BK viral load were 70.3% and 69.1% in the levofloxacin group and the placebo group, respectively (P=0.93). The percentage reductions in BK viral load were also equivalent at 1 month (58% versus 67.1%; P=0.47) and 6 months (82.1% versus 90.5%; P=0.38). Linear regression analysis of serum creatinine versus time showed no difference in allograft function between the two study groups during the follow-up period. CONCLUSIONS A 30-day course of levofloxacin does not significantly improve BK viral load reduction or allograft function when used in addition to overall reduction of immunosuppression.
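The percentage reductions in viral load follow the usual formula, (baseline − follow-up) / baseline × 100; a one-line helper makes the arithmetic explicit. This is an illustration, not the trial's analysis code, and the viral-load values in the example comment are hypothetical.

```python
# Illustrative helper (not from the study): percentage reduction in BK viral
# load from baseline to follow-up, both in copies/ml.

def percent_reduction(baseline: float, followup: float) -> float:
    """Percentage fall in viral load relative to baseline."""
    return (baseline - followup) / baseline * 100

# e.g. a hypothetical fall from 100,000 to 29,700 copies/ml is a
# 70.3% reduction, matching the scale of the values reported above.
```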
Transplant International | 2012
John C. LaMattina; Joshua D. Mezrich; R. Michael Hofmann; David P. Foley; Anthony M. D’Alessandro; Hans W. Sollinger; John D. Pirsch
Between 1 January 2002 and 31 December 2007, our center performed 1687 adult renal transplants. A retrospective analysis was performed to compare outcomes between patients receiving alemtuzumab (n = 632) and those receiving either basiliximab (n = 690) or thymoglobulin (n = 125). Patients receiving alemtuzumab were younger (49 vs. 51 years, P = 0.02), had fewer HLA matches (1.7 vs. 2.0, P < 0.0001), were more likely to have a cytomegalovirus (CMV) donor(+)/recipient(−) transplant (22% vs. 17%, P = 0.03) and were less likely to receive a living donor allograft (32% vs. 37%, P = 0.04). Alemtuzumab recipients were less likely to receive tacrolimus (35% vs. 47%, P < 0.0001). The 1‐, 3‐, and 5‐year cumulative incidence of antibody‐mediated rejection (AMR) in alemtuzumab‐treated patients was 19%, 24%, and 27%, vs. 11%, 15%, and 18% for the other group (P < 0.0001). The 1‐, 3‐, and 5‐year allograft survival in the alemtuzumab group was 88%, 75%, and 67%, vs. 91%, 82%, and 74% for the other group (P < 0.0001). Patient survival was equivalent. Alemtuzumab was an independent risk factor for living donor allograft loss (HR 2.0, P = 0.004), opportunistic infections (HR 1.3, P = 0.01), CMV infections (HR 1.6, P = 0.001), and AMR (HR 1.5, P = 0.002). The significantly worse graft survival in the alemtuzumab cohort may be due to the increased rates of AMR and infectious complications.
Kidney International | 2013
Arjang Djamali; Brenda Muth; Thomas M. Ellis; Maha Mohamed; Luis A. Fernandez; Karen Miller; Janet M. Bellingham; Jon S. Odorico; Joshua D. Mezrich; John D. Pirsch; Tony M. D'Alessandro; Vijay Vidyasagar; R. Michael Hofmann; Jose Torrealba; Dixon B. Kaufman; David P. Foley
In order to define the intensity of immunosuppression, we examined risk factors for acute rejection in desensitization protocols that use baseline donor-specific antibody levels measured as mean fluorescence intensity (MFImax). The study included 146 patients transplanted with a negative flow crossmatch and a mean follow-up of 18 months, with the majority (83%) followed for at least 1 year. At the time of transplant, mean calculated panel-reactive antibody and MFImax ranged from 10.3% to 57.2% and from 262 to 1691, respectively, between low- and high-risk protocols. Mean MFImax increased significantly from transplant to 1 week and 1 year. The incidence of acute rejection (mean time to rejection 1.65 months), combining clinical and subclinical rejection, was 32%, comprising 14% cellular, 12% antibody-mediated, and 6% mixed rejection. In regression analyses, only C4d staining in post-reperfusion biopsies (hazard ratio 3.3, confidence interval 1.71 to 6.45) and increased donor-specific antibodies at 1 week post-transplant were significant predictors of rejection. A rise in MFImax of 500 was associated with a 2.8-fold risk of rejection. Thus, C4d staining in post-reperfusion biopsies and an early rise in donor-specific antibodies after transplantation are risk factors for rejection in moderately sensitized patients.
Kidney International | 2013
Brad C. Astor; Brenda Muth; Dixon B. Kaufman; John D. Pirsch; R. Michael Hofmann; Arjang Djamali
Serum β2-microglobulin (β2M), a novel marker of kidney function, predicts mortality and kidney failure in the general population, and its elevation following transplantation is a marker of acute rejection. The association between post-transplant serum β2M and outcomes following kidney transplantation, however, is unknown. To help determine this, we conducted a retrospective cohort study of 2190 individuals receiving a primary kidney transplant with serum β2M measured at discharge. A total of 452 deaths and 347 graft failures before death (669 total graft losses) occurred over a median of 4.1 years of follow-up. After adjustment, the highest quintile of β2M (5.0 mg/l and above), compared with the lowest quintile (<2.3 mg/l), was associated with a hazard ratio of 4.6 (95% confidence interval 2.8, 7.5) for death, 4.1 (2.4, 7.0) for death-censored graft loss, and 3.8 (2.5, 5.6) for total graft loss. Serum β2M was more strongly associated with each outcome than was serum creatinine. Higher serum β2M at discharge was independently associated with each outcome in models stratified by the presence of delayed graft function, donor type, or estimated glomerular filtration rate at discharge. Thus, serum β2M at discharge is a potent predictor of long-term mortality and graft loss in kidney transplant recipients, providing information on allograft function beyond that of serum creatinine.
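The quintile comparison above can be expressed as a small classifier. Only the extreme bins are coded because the abstract reports just the top (≥5.0 mg/l) and bottom (<2.3 mg/l) cutoffs; the intermediate quintile boundaries are not given, and the function name is a hypothetical illustration.

```python
# Illustrative sketch (not the study's code): bin a discharge serum
# beta-2-microglobulin value using the two cutoffs the abstract reports.
# Intermediate quintile boundaries are not given, so they are not coded.

def b2m_quintile_group(b2m_mg_per_l: float) -> str:
    """Classify a discharge beta-2M value against the reported extremes."""
    if b2m_mg_per_l >= 5.0:
        return "highest quintile"   # HR 4.6 for death vs. lowest quintile
    if b2m_mg_per_l < 2.3:
        return "lowest quintile"    # reference group
    return "intermediate"           # exact quintile not recoverable
```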
Nutrition in Clinical Practice | 2011
Laura Maursetter; Cassandra E. Kight; Judy Mennig; R. Michael Hofmann
Continuous renal replacement therapy (CRRT) is a common treatment modality in the intensive care unit for patients with acute kidney injury requiring renal replacement therapy. It offers hemodynamic stability while maintaining excellent control of solute and extracellular fluid. To those outside of nephrology, continuous dialysis is often a confusing and poorly understood form of renal replacement therapy. This review aims to provide an overview of CRRT as well as address some of the nutrition concerns surrounding this complex group of patients.
American Journal of Transplantation | 2008
R. Michael Hofmann
Renal transplantation has been shown to improve not only quality of life but also survival in recipients (1). However, with the increasing number of people joining the waiting list, there is growing pressure to carefully screen patients for their suitability as potential recipients. Patients with end-stage renal disease (ESRD) have an excess risk of dying of cardiovascular causes compared with the general population (2), and this seems to mirror the increased incidence of coronary artery disease (CAD) in patients with ESRD. Consequently, cardiac evaluation has been a pillar of the pretransplant evaluation process. But the best method for screening potential recipients remains a subject of debate.