Ann M. Fieberg
University of Minnesota
Publications
Featured research published by Ann M. Fieberg.
Transplantation | 2010
Robert S. Gaston; J. Michael Cecka; B. L. Kasiske; Ann M. Fieberg; Robert E. Leduc; F. Cosio; Sita Gourishankar; Joseph P. Grande; Philip F. Halloran; Lawrence G. Hunsicker; Roslyn B. Mannon; David Rush; Arthur J. Matas
Background. Late graft failure (LGF) is believed to be the consequence of immunologic and nonimmunologic insults leading to progressive deterioration in kidney function. We studied recipients with new onset late kidney graft dysfunction (n=173) to determine the importance of C4d staining and circulating donor-specific antibody (DSA) in subsequent LGF. Methods. One hundred seventy-three subjects transplanted before October 1, 2005 (mean time after transplant 7.3±6.0 years) had a baseline serum creatinine level of 1.4±0.3 mg/dL before January 1, 2006 and underwent biopsy for new onset graft dysfunction after that date (mean creatinine at biopsy 2.7±1.6 mg/dL). Statistical analysis was based on central DSA and blinded pathology determinations. Results. Subjects were divided into four groups based on C4d and DSA: no C4d, no DSA (group A; n=74); only DSA (group B; n=31); only C4d (group C; n=28); and both C4d and DSA (group D; n=40). Among DSA+ recipients (groups B and D), group D had broader reactivity and a stronger DSA response. After 2 years, groups C and D (C4d+) were at significantly greater risk for LGF than groups A and B. Adjusting for inflammation (Banff i, t, g, and ptc scores) did not change the outcome. Local diagnosis of calcineurin inhibitor nephrotoxicity was spread across all four subgroups and did not impact risk of LGF. Conclusions. Evidence of antibody-mediated injury (DSA or C4d) is common (57%) in patients with new onset late kidney allograft dysfunction. The risk of subsequent graft failure is significantly worse in the presence of C4d+ staining.
American Journal of Transplantation | 2010
Sita Gourishankar; Robert E. Leduc; John E. Connett; J. M. Cecka; F. Cosio; Ann M. Fieberg; Robert S. Gaston; Philip F. Halloran; Lawrence G. Hunsicker; B. L. Kasiske; David Rush; Joseph P. Grande; Roslyn B. Mannon; Arthur J. Matas
We are studying two cohorts of kidney transplant recipients, with the goal of defining specific clinicopathologic entities that cause late graft dysfunction: (1) prevalent patients with new onset late graft dysfunction (cross‐sectional cohort); and (2) newly transplanted patients (prospective cohort). For the cross‐sectional cohort (n = 440), mean time from transplant to biopsy was 7.5 ± 6.1 years. Local pathology diagnoses included CAN (48%), CNI toxicity (30%), and perhaps surprisingly, acute rejection (cellular‐ or Ab‐mediated) (23%). Actuarial rate of death‐censored graft loss at 1 year postbiopsy was 17.7%; at 2 years, 29.8%. There was no difference in postbiopsy graft survival for recipients with versus without CAN (p = 0.9). Prospective cohort patients (n = 2427) developing graft dysfunction >3 months posttransplant undergo ‘index’ biopsy. The rate of index biopsy was 8.8% between 3 and 12 months, and 18.2% by 2 years. Mean time from transplant to index biopsy was 1.0 ± 0.6 years. Local pathology diagnoses included CAN (27%), and acute rejection (39%). Intervention to halt late graft deterioration cannot be developed in the absence of meaningful diagnostic entities. We found CAN in late posttransplant biopsies to be of no prognostic value. The DeKAF study will provide broadly applicable diagnostic information to serve as the basis for future trials.
American Journal of Transplantation | 2010
Arthur J. Matas; Robert E. Leduc; David Rush; J. M. Cecka; John E. Connett; Ann M. Fieberg; Philip F. Halloran; Lawrence G. Hunsicker; F. Cosio; Joseph P. Grande; Roslyn B. Mannon; Sita Gourishankar; Robert S. Gaston; B. L. Kasiske
The nonspecific diagnoses ‘chronic rejection’, ‘CAN’, or ‘IF/TA’ suggest neither identifiable pathophysiologic mechanisms nor possible treatments. As a first step to developing a more useful taxonomy for causes of new‐onset late kidney allograft dysfunction, we used cluster analysis of individual Banff score components to define subgroups. In this multicenter study, eligibility included being transplanted prior to October 1, 2005, having a ‘baseline’ serum creatinine ≤2.0 mg/dL before January 1, 2006, and subsequently developing deterioration of graft function leading to a biopsy. Mean time from transplant to biopsy was 7.5 ± 6.1 years. Of the 265 biopsies (all with blinded central pathology interpretation), 240 grouped into six large (n > 13) clusters. There were no major differences between clusters in recipient demographics. The actuarial postbiopsy graft survival varied by cluster (p = 0.002). CAN and CNI toxicity were common diagnoses in each cluster (and did not differentiate clusters). Similarly, C4d and presence of donor specific antibody were frequently observed across clusters. We conclude that for recipients with new‐onset late graft dysfunction, cluster analysis of Banff scores distinguishes meaningful subgroups with differing outcomes.
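The grouping step described above — clustering biopsies by their individual Banff component scores — can be sketched with a plain k-means over score vectors. This is a minimal illustration only: the component tuples below are invented toy data, and the study's actual clustering algorithm and distance metric are not specified here.

```python
import math
import random

# Hypothetical Banff component scores (i, t, g, ptc, ci, ct), each 0-3,
# one tuple per biopsy. Purely illustrative, not study data.
biopsies = [
    (2, 2, 0, 0, 1, 1),
    (2, 3, 0, 1, 1, 1),
    (0, 0, 2, 3, 2, 2),
    (0, 1, 3, 3, 2, 2),
    (0, 0, 0, 0, 3, 3),
    (1, 0, 0, 0, 3, 2),
]

def dist(a, b):
    """Euclidean distance between two score vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def kmeans(points, k, iters=50, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each biopsy to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: dist(p, centers[i]))].append(p)
        # Recompute centers as component-wise means (keep old center if empty).
        centers = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
    return clusters

groups = kmeans(biopsies, k=3)
```

Each resulting cluster collects biopsies with similar Banff score profiles, which is the kind of data-driven subgrouping the paper then relates to graft survival.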
Infection Control and Hospital Epidemiology | 2010
Amy C. Weintrob; Mollie P. Roediger; Melissa Barber, MLT; Amy Summers, BS; Ann M. Fieberg; James Dunn, RN; Venus Seldon, RN; Fluryanne Leach; Xiao-Zhe Huang; Mikeljon P. Nikolich; Glenn W. Wortmann
OBJECTIVE: To determine the anatomic sites and natural history of colonization with gram-negative multidrug-resistant organisms (MDROs). DESIGN: Prospective, longitudinal cohort study. SETTING: Walter Reed Army Medical Center, a 236-bed tertiary care center in Washington, DC. PATIENTS: Deployed subjects (ie, inpatients medically evacuated from Iraq or Afghanistan) or nondeployed subjects admitted to the same hospital. METHODS: Consenting patients had 6 anatomic sites cultured every 3 days for 2 weeks and then weekly. Gram-negative organisms resistant to 3 or more classes of antibiotics were considered MDROs. Isolates were genotyped using pulsed-field gel electrophoresis. Clinical data, data on antibiotic use, and clinical culture results were collected. RESULTS: Of 60 deployed subjects, 14 (23%) were colonized with an MDRO at admission, and 13 (22%) had incident colonization during hospitalization. The groin was the most sensitive anatomic site for detecting MDRO colonization, and all but one subject remained colonized for the duration of their hospitalization. Sixty percent of subjects with incident Acinetobacter colonization and 25% of subjects with incident Klebsiella colonization had strains that were related to those isolated from other subjects. Of 60 nondeployed subjects, 5 (8%) were colonized with an MDRO at admission; all had recent healthcare contact, and 1 nondeployed subject had an isolate related to a strain recovered from a deployed subject. CONCLUSIONS: Colonization with gram-negative MDROs is common among patients with war-related trauma admitted to a military hospital and also occurs among nondeployed patients with recent healthcare contact. The groin is the most sensitive anatomic site for active surveillance, and spontaneous decolonization is rare.
American Journal of Transplantation | 2009
Robert S. Gaston; B. L. Kasiske; Ann M. Fieberg; Robert E. Leduc; F. Cosio; Sita Gourishankar; Philip F. Halloran; Lawrence G. Hunsicker; David Rush; Arthur J. Matas
Death with function causes half of late kidney transplant failures, and cardiovascular disease (CVD) is the most common cause of death in these patients. We examined the use of potentially cardioprotective medications in a prospective observational study at seven transplant centers in the United States and Canada. Among 935 patients, 87% received antihypertensive medications at both 1 and 6 months after transplantation. Similar antihypertensive regimens were used for patients with and without diabetes and CVD, but with wide variability among centers. In contrast, while 44% of patients were on angiotensin converting enzyme inhibitors (ACEI) or angiotensin receptor blockers (ARB) at the time of transplantation, the proportion taking these agents dropped to 12% at month 1, then increased to 24% at 6 months. Fewer than 30% with CVD or diabetes received ACEI/ARB therapy 6 months posttransplant. Aspirin use was uncommon (<40% of patients). Even among those with diabetes and/or CVD, fewer than 60% received aspirin and only half received a statin at 1 and 6 months. This study demonstrates marked variability in the use of cardioprotective medications in kidney transplant recipients, a finding that may reflect, among several possible explanations, clinical uncertainty due the lack of randomized trials for these medications in this population.
Clinical Infectious Diseases | 2010
Helen M. Chun; Ann M. Fieberg; Katherine Huppler Hullsiek; Alan R. Lifson; Nancy F. Crum-Cianflone; Amy C. Weintrob; Anuradha Ganesan; Robert V. Barthel; William P. Bradley; Brian K. Agan; Michael L. Landrum
BACKGROUND: The epidemiologic trends of hepatitis B virus (HBV) infection in human immunodeficiency virus (HIV)-infected patients over the past 20 years are largely unknown. METHODS: Prevalence and risk factors for HBV infection overall, at the time of HIV infection, and after HIV infection were examined in an ongoing observational HIV cohort study. Risk factors for HBV infection at the time of diagnosis of HIV infection were evaluated using logistic regression, and risk of incident HBV infection after diagnosis of HIV infection was evaluated using Cox proportional hazards models. RESULTS: Of the 2769 evaluable participants, 1078 (39%) had HBV infection, of whom 117 (11%) had chronic HBV infection. The yearly cross-sectional prevalence of HBV infection decreased from a peak of 49% in 1995 to 36% in 2008 (P < .001). The prevalence of HBV infection at the time of diagnosis of HIV infection decreased during 1989-2008 from 34% to 9% (P < .001). The incidence of HBV infection after diagnosis of HIV infection decreased from 4.0 cases per 100 person-years during the pre-highly active antiretroviral therapy (HAART) era to 1.1 cases per 100 person-years during the HAART era (P < .001); however, this incidence remained unchanged during 2000-2008 (P = .49), with >20% of HBV infections occurring after HIV infection being chronic. Decreased risk of HBV infection after diagnosis of HIV infection was associated with higher CD4 cell count and the use of HBV-active HAART. Receipt of 1 dose of HBV vaccine was not associated with reduced risk of HBV infection after diagnosis of HIV infection. CONCLUSIONS: Although the burden of HBV infection overall is slowly decreasing among HIV-infected individuals, the persistent rate of HBV infection after diagnosis of HIV infection raises concern that more-effective prevention strategies may be needed to significantly reduce the prevalence of HBV infection in this patient population.
Journal of Acquired Immune Deficiency Syndromes | 2008
Amy C. Weintrob; Ann M. Fieberg; Brian K. Agan; Anuradha Ganesan; Nancy F. Crum-Cianflone; Vincent C. Marconi; Mollie P. Roediger; Susan Fraser; Scott Wegner; Glenn W. Wortmann
Background:Studies evaluating the effect of age on response to highly active antiretroviral therapy (HAART) have been limited by their inability to control for duration of human immunodeficiency virus (HIV) infection. We examined the effect of age at HIV seroconversion on response to HAART. Methods:A retrospective analysis of a longitudinal US military cohort of HIV-infected subjects. Time to and maintenance of viral suppression, rate of CD4 cell increase, and rate of progression to acquired immunodeficiency syndrome or death were compared across age groups using time-to-event methods. Results:Five hundred sixty-three HIV-infected adults who seroconverted after January 1, 1996, and started HAART were included. Increasing age at seroconversion was significantly associated with faster time to viral suppression (P = 0.002). Increasing age also correlated with duration of suppression, with a 35% reduction in risk of viral rebound for every 5-year increase in age above 18 years (hazard ratio: 0.65, 95% confidence interval 0.55 to 0.75). The rate of CD4 cell increase from 6 to 84 months post-HAART was significantly greater in those who seroconverted at older ages (P = 0.0002). Rates of progression to acquired immunodeficiency syndrome or death did not differ between groups. Conclusions:Increasing age at seroconversion was associated with shorter time to and longer maintenance of viral suppression and a faster increase in CD4 cell count.
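The reported hazard ratio of 0.65 per 5-year increase in age illustrates a general property of proportional-hazards models: a per-unit hazard ratio scales multiplicatively, so the HR for any other age difference is obtained by exponentiation. The 10-year comparison below is a worked example of that arithmetic, not a result taken from the paper.

```python
import math

# Reported effect: HR 0.65 per 5-year increase in age at seroconversion
# (risk of viral rebound). Under proportional hazards, the HR for an
# age difference of `delta` years is 0.65 ** (delta / 5).
HR_PER_5Y = 0.65

def rebound_hr(age_diff_years):
    """Hazard ratio for viral rebound between two seroconversion ages."""
    return HR_PER_5Y ** (age_diff_years / 5)

# Illustrative comparison: seroconversion at 28 vs 18 (10-year difference).
hr_10y = rebound_hr(10)          # 0.65 ** 2 = 0.4225

# Equivalent route via the underlying per-year log-hazard coefficient.
beta = math.log(HR_PER_5Y) / 5
hr_10y_alt = math.exp(beta * 10)
```

So under the study's fitted model, a 10-year-older seroconverter would have roughly 42% of the rebound hazard of the younger one.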
AIDS | 2010
Michael L. Landrum; Katherine Huppler Hullsiek; Anuradha Ganesan; Amy C. Weintrob; Nancy F. Crum-Cianflone; Robert V. Barthel; Robert J. O'Connell; Ann M. Fieberg; Helen M. Chun; Vincent C. Marconi; Matthew J. Dolan; Brian K. Agan
Objective:To assess the association of hepatitis B virus (HBV) vaccination with risk of HBV infection among HIV-infected patients and HBV infection risk factors among vaccinees. Design:Observational cohort study. Methods:Participants enrolled from 1986 through 2004, unvaccinated and serologically negative for HBV infection at the time of HIV diagnosis, were followed longitudinally through 2007 for the occurrence of HBV infection. Risk factors for HBV infection were evaluated using time to event methods, including Kaplan–Meier survival curves and Cox proportional hazards models. Results:During 11 632 person-years of follow-up, the rate of HBV infection was 2.01 (95% CI 1.75–2.27)/100 person-years. Receipt of at least one dose of vaccine was not associated with reduced risk of HBV (unadjusted hazard ratio 0.86, 95% CI 0.7–1.1; adjusted hazard ratio 1.08, 95% CI 0.8–1.4). Receipt of three or more doses of vaccine was also not associated with reduced risk (hazard ratio 0.96; 95% CI 0.56–1.64). Among 409 vaccinees with HBsAb less than 10 IU/l, 46 (11.2%) developed HBV infection compared with 11 of 217 (5.1%) vaccinees with HBsAb ≥10 IU/l (hazard ratio 0.51; 95% CI 0.3–1.0). In participants with initial HBsAb less than 10 IU/l, 16 of 46 (35%) infections were chronic, compared with none of 11 in those with initial HBsAb at least 10 IU/l (P = 0.02). Conclusion:Overall, HBV vaccination was not associated with reduced risk of HBV infection in our cohort of HIV-infected individuals. However, the small subset of vaccinees with a positive vaccine response may have had reduced HBV infection risk, including chronic disease. Improvements in vaccine delivery and immunogenicity are needed to increase HBV vaccine effectiveness in HIV-infected patients.
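The headline figure above — 2.01 (95% CI 1.75-2.27) infections per 100 person-years over 11,632 person-years of follow-up — is a crude incidence rate, and its CI can be sketched with a normal approximation on the Poisson event count. The event count of 234 used below is back-calculated from the published rate, not taken from the paper.

```python
import math

def incidence_per_100py(events, person_years, z=1.96):
    """Crude incidence rate per 100 person-years with a 95% CI using a
    normal approximation (variance of a Poisson count ~= the count)."""
    rate = 100 * events / person_years
    half_width = 100 * z * math.sqrt(events) / person_years
    return rate, rate - half_width, rate + half_width

# ~234 incident infections over the reported 11,632 person-years
# (assumed, reconstructed from the published rate) reproduces the
# reported 2.01 (95% CI 1.75-2.27) per 100 person-years.
rate, ci_lo, ci_hi = incidence_per_100py(234, 11632)
```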
American Heart Journal | 2012
William T. Abraham; Angel R. Leon; Martin St. John Sutton; Steven J. Keteyian; Ann M. Fieberg; Ed Chinchoy; Garrie J. Haas
BACKGROUND: Cardiac resynchronization therapy (CRT) reduces morbidity and mortality and improves symptoms in patients with systolic heart failure (HF) and ventricular dyssynchrony. This randomized, double-blind, controlled study evaluated whether optimizing the interventricular stimulating interval (V-V) to sequentially activate the ventricles is clinically better than simultaneous V-V stimulation during CRT. METHODS: Patients with New York Heart Association (NYHA) III or IV HF, meeting both CRT and implantable cardioverter-defibrillator indications, randomly received either simultaneous CRT or CRT with optimized V-V settings for 6 months. Patients also underwent echocardiography-guided atrioventricular delay optimization to maximize left ventricular filling. The V-V optimization involved minimizing the left ventricular septal to posterior wall motion delay during CRT. The primary objective was to demonstrate noninferiority using a clinical composite end point that included mortality, HF hospitalization, NYHA functional class, and patient global assessment. Secondary end points included changes in NYHA classification, 6-minute hall walk distance, quality of life, peak VO(2), and event-free survival. RESULTS: The composite score improved in 75 (64.7%) of 116 simultaneous patients and in 92 (75.4%) of 122 optimized patients (P < .001 for noninferiority). A prespecified test of superiority showed that more optimized patients improved (P = .03). New York Heart Association functional class improved in 58.0% of simultaneous patients versus 75.0% of optimized patients (P = .01). No significant differences in exercise capacity, quality of life, peak VO(2), or HF-related event rate between the 2 groups were observed. CONCLUSIONS: These findings demonstrate modest clinical benefit with optimized sequential V-V stimulation during CRT in patients with NYHA class III and IV HF. Optimizing V-V timing may provide an additional tool for increasing the proportion of patients who respond to CRT.
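The noninferiority logic of the primary end point can be sketched as a confidence bound on the difference in improvement proportions: noninferiority holds when the lower bound of the difference stays above a prespecified negative margin. The 10-percentage-point margin and the normal-approximation method below are illustrative assumptions, not the trial's prespecified analysis, so this sketch does not reproduce the published p-values.

```python
import math

def diff_lower_bound(x1, n1, x2, n2, z=1.96):
    """Lower 95% confidence bound for the difference in improvement
    proportions (second group minus first), normal approximation."""
    p1, p2 = x1 / n1, x2 / n2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return (p2 - p1) - z * se

# Composite-score improvement: 75/116 simultaneous vs 92/122 optimized.
lower = diff_lower_bound(75, 116, 92, 122)

# Noninferior if the bound exceeds -margin; a 0.10 margin is assumed
# here purely for illustration.
noninferior = lower > -0.10
```

With these numbers the bound sits just below zero but well above the assumed margin, matching the qualitative conclusion that optimized stimulation is at least noninferior.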
PLOS ONE | 2010
Michael L. Landrum; Ann M. Fieberg; Helen M. Chun; Nancy F. Crum-Cianflone; Vincent C. Marconi; Amy C. Weintrob; Anuradha Ganesan; Robert V. Barthel; Glenn Wortmann; Brian K. Agan
Background: Factors associated with serologic hepatitis B virus (HBV) outcomes in HIV-infected individuals remain incompletely understood, yet such knowledge may lead to improvements in the prevention and treatment of chronic HBV infection. Methods and Findings: HBV-HIV co-infected cohort participants were retrospectively analyzed. HBV serologic outcomes were classified as chronic, resolved, and isolated-HBcAb. Chronic HBV (CHBV) was defined as the presence of HBsAg on two or more occasions at least six months apart. Risk factors for HBV serologic outcome were assessed using logistic regression. Of 2037 participants with HBV infection, 281 (14%) had CHBV. Overall the proportions of HBV infections classified as CHBV were 11%, 16%, and 19% for CD4 cell count strata of ≥500, 200–499, and <200, respectively (p < 0.0001). Risk of CHBV was increased for those with HBV infection occurring after HIV diagnosis (OR 2.62; 95% CI 1.78–3.85). This included the subset with CD4 count ≥500 cells/µL where 21% of those with HBV after HIV diagnosis had CHBV compared with 9% for all other cases of HBV infection in this stratum (p = 0.0004). Prior receipt of HAART was associated with improved HBV serologic outcome overall (p = 0.012), and specifically among those with HBV after HIV (p = 0.002). In those with HBV after HIV, HAART was associated with reduced risk of CHBV overall (OR 0.18; 95% CI 0.04–0.79), including reduced risk in the subsets with CD4 ≥350 cells/µL (p < 0.001) and CD4 ≥500 cells/µL (p = 0.01), where no cases of CHBV were seen in those with a recent history of HAART use. Conclusions: Clinical indicators of immunologic status in HIV-infected individuals, such as CD4 cell count, are associated with HBV serologic outcome.
These data suggest that immunologic preservation through the increased use of HAART to improve functional anti-HBV immunity, whether by improved access to care or earlier initiation of therapy, would likely improve HBV infection outcomes in HIV-infected individuals.