
Publication


Featured research published by Jane Gralla.


Transplantation | 2011

Inferior kidney allograft outcomes in patients with de novo donor-specific antibodies are due to acute rejection episodes.

James E. Cooper; Jane Gralla; Linda R. Cagle; Ryan J. Goldberg; L Chan; Alexander C. Wiseman

Background. Donor-specific antibodies (DSAs) after kidney transplantation have been associated with poor graft outcomes in multiple studies. However, these studies have generally used stored sera or a single cross-sectional screening test to identify patients with DSA. We evaluated the effectiveness of a prospective DSA screening protocol in identifying kidney and kidney/pancreas recipients at risk for poor graft outcomes. Methods. From September 2007 through September 2009, 244 consecutively transplanted kidney and kidney/pancreas recipients without pretransplant DSA were screened for de novo DSA at 1, 6, 12, and 24 months and when clinically indicated. Results. DSA was detected in 27% of all patients by protocol or indication screening. Patients with DSA (DSA+) were significantly more likely to have experienced acute rejection (AR) than patients without DSA (DSA−) (29% vs. 9.5%, P<0.001) and had lower estimated 2-year graft survival (83% vs. 98%, P<0.001). Only 3 of 19 DSA+ patients with AR had DSA detected before the AR episode. When patients with AR were excluded, 2-year graft survival was similar between DSA+ and DSA− patients (100% vs. 99%), as was estimated glomerular filtration rate. Patients with DSA detected by protocol screening had outcomes similar to those of DSA− patients, whereas those with DSA detected by indication screening experienced significantly worse outcomes. Conclusions. Patients with de novo DSA experience worse graft outcomes due to previous/concurrent episodes of AR. A prospective DSA screening protocol failed to identify patients at risk for AR or poor short-term graft outcomes.


Journal of Pediatric Gastroenterology and Nutrition | 2005

Evaluation of infantile acid and nonacid gastroesophageal reflux using combined pH monitoring and impedance measurement.

Adria A. Condino; Judith M. Sondheimer; Zhaoxing Pan; Jane Gralla; Darryl Perry; Judith A. O'Connor

Objective: To characterize the proportion of acid and nonacid esophageal reflux events in young infants with suspected gastroesophageal reflux (GER) using combined pH-multichannel intraluminal impedance (MII) monitoring, and to determine the symptom index correlation with nonacid and acid reflux events. Study Design: Prospective study of children, aged 2 weeks to 1 year, referred to The Children's Hospital of Denver Gastroenterology Clinic for evaluation of GER. Exclusion criteria were congenital anomalies or syndromes, cerebral palsy, mental retardation, and pulmonary or cardiac disease. The children were admitted to The Children's Hospital General Clinical Research Center for a 20-hour pH-MII study. Acid suppression was either never used or discontinued 2 weeks before testing. Results: Thirty-four infants (20 females, 14 males) were enrolled from February 2004 to February 2005. Ages ranged from 2 months to 11 months (median, 7 months). One thousand eight hundred ninety reflux events were detected by MII, and 588 reflux events were detected by pH probe alone. Of the reflux events detected, 47% (888 events) were acid and 53% (1,002 events) were nonacid. The proportion of nonacid reflux decreased with age (P < 0.0001 by Pearson χ2 test) and with increasing time elapsed from the last meal. There were 958 total symptoms evaluated. The most frequently reported symptom was fussiness/pain, which correlated with nonacid reflux events in 24.6% of cases and with acid reflux in 25.2%. The proximal height of a reflux event was predictive for symptoms of fussiness/pain, arching, and burping. Conclusion: MII detects more reflux events than pH monitoring alone. The proportion of nonacid to acid reflux events in infants is more similar to that in adults than previously reported. Combined pH-MII esophageal monitoring identifies more reflux events and improves clinical correlation with symptoms.


Journal of the American Academy of Child and Adolescent Psychiatry | 2011

A Double-Blind, Placebo-Controlled Study of Risperidone for the Treatment of Adolescents and Young Adults with Anorexia Nervosa: A Pilot Study.

Jennifer O. Hagman; Jane Gralla; Eric Sigel; Swan Ellert; Mindy Dodge; Rick M. Gardner; Teri O'Lonergan; Guido K. Frank; Marianne Z. Wamboldt

OBJECTIVE The purpose of this double-blind, placebo-controlled exploratory pilot study was to evaluate the safety and efficacy of risperidone for the treatment of anorexia nervosa. METHOD Forty female subjects 12 to 21 years of age (mean, 16 years) with primary anorexia nervosa in an eating disorders program were randomized to receive risperidone (n = 18) or placebo (n = 22). Subjects completed the Eating Disorder Inventory 2, Color-A-Person Test, Body Image Software, and Multidimensional Anxiety Scale for Children at baseline and regular intervals. Weight, laboratory values, and electrocardiograms were monitored. Study medication was started at 0.5 mg daily and titrated upward weekly in 0.5-mg increments to a maximum dose of 4 mg until the subject reached a study endpoint. RESULTS The mean dose was 2.5 mg for the risperidone group and 3 mg for the placebo group, for a mean duration of 9 weeks. Subjects taking risperidone had a significant decrease on the Eating Disorder Inventory 2 Drive for Thinness subscale over the first 7 weeks (effect size, 0.88; p = .002), but this difference was not sustained to the end of the study (p = .13). The Eating Disorder Inventory 2 Interpersonal Distrust subscale decreased significantly more in subjects taking risperidone (effect size, 0.60; p = .03). Subjects taking risperidone had increased prolactin levels (week 7; p = .001). There were no significant differences between groups at baseline or the end of the study for the other rating scales, change in weight, or laboratory measurements. CONCLUSIONS This study does not demonstrate a benefit for the addition of risperidone in adolescents with anorexia nervosa during the weight-restoration phase of care. Clinical trial registration information: A Double-Blind, Placebo-Controlled Study of Risperidone for the Treatment of Anorexia Nervosa, http://www.clinicaltrials.gov, NCT00140426.


Clinical Journal of The American Society of Nephrology | 2008

Aggressive Immunosuppression Minimization Reduces Graft Loss Following Diagnosis of BK Virus-Associated Nephropathy: A Comparison of Two Reduction Strategies

Andrew S. Weiss; Jane Gralla; Larry Chan; Patrick Klem; Alexander C. Wiseman

BACKGROUND AND OBJECTIVES BK virus-associated nephropathy (BKVAN) has emerged as a leading cause of kidney graft loss, with no known predictors for graft loss and no consensus regarding treatment other than reduction of immunosuppression. DESIGN, SETTING, PARTICIPANTS AND MEASUREMENTS A single-center retrospective analysis was performed of all cases of BKVAN from 1999 to 2005 for clinical predictors of graft loss, with evaluation of the impact of immunosuppression withdrawal (3-drug to 2-drug immunosuppression) within the first month versus reduction of immunosuppression. RESULTS Of 910 kidney transplants, 35 (3.8%) cases of BKVAN were diagnosed at a median of 15 months after transplant (range, 5.5 to 90 months after transplant), 16 (46%) of which progressed to graft failure at a median of 11 months (range, 2 to 36 months) after diagnosis. Depleting antibody induction was a significant risk factor for graft loss on univariate analysis, whereas early drug withdrawal (<1 mo following diagnosis) protected against graft loss. On multivariate analysis, these findings were independent predictors of graft outcomes. Additionally, when patients were comanaged by referring nephrologists and the transplant center before the diagnosis of BKVAN, the risk of graft loss was 11-fold higher (P = 0.03) than if patients were managed solely by the transplant center. CONCLUSIONS Increased awareness and early diagnosis of BKVAN, with aggressive tapering of immunosuppression once established, is critical to preserve kidney graft function. Early drug withdrawal to low-dose two-drug therapy maintenance may be preferable to a general reduction of agents.


Liver Transplantation | 2012

Projected future increase in aging hepatitis C virus–infected liver transplant candidates: A potential effect of hepatocellular carcinoma

Scott W. Biggins; Kiran Bambha; Norah A. Terrault; John M. Inadomi; Stephen Shiboski; Jennifer L. Dodge; Jane Gralla; Hugo R. Rosen; John P. Roberts

In the United States, the peak hepatitis C virus (HCV) antibody prevalence of 4% occurred in persons born in the calendar years 1940‐1965. The goal of this study was to examine observed and projected age‐specific trends in the demand for liver transplantation (LT) among patients with HCV‐associated liver disease stratified by concurrent hepatocellular carcinoma (HCC). All new adult LT candidates registered with the Organ Procurement and Transplantation Network for LT between 1995 and 2010 were identified. Patients who had primary, secondary, or text field diagnoses of HCV with or without HCC were identified. There were 126,862 new primary registrants for LT, and 52,540 (41%) had HCV. The number of new registrants with HCV dramatically differed by the age at calendar year, and this suggested a birth cohort effect. When the candidates were stratified by birth year in 5‐year intervals, the birth cohorts with the highest frequency of HCV were as follows (in decreasing order): 1951‐1955, 1956‐1960, 1946‐1950, and 1941‐1945. These 4 birth cohorts, spanning from 1941 to 1960, accounted for 81% of all new registrants with HCV. A 4‐fold increase in new registrants with HCV and HCC occurred between the calendar years 2000 and 2010 in the 1941‐1960 birth cohorts. By 2015, we anticipate that an increasing proportion of new registrants with HCV will have HCC and be ≥60 years old (born in or before 1955). In conclusion, the greatest demand for LT due to HCV‐associated liver disease is occurring among individuals born between 1941 and 1960. This demand appears to be driven by the development of HCC in patients with HCV. During the coming decade, the projected increase in the demand for LT from an aging HCV‐infected population will challenge the transplant community to reconsider current treatment paradigms. Liver Transpl, 2012.


The Journal of Pediatrics | 2010

Efficacy and Safety of a High Protein, Low Carbohydrate Diet for Weight Loss in Severely Obese Adolescents

Nancy F. Krebs; Dexiang Gao; Jane Gralla; Juliet Collins; Susan L. Johnson

OBJECTIVE To evaluate the efficacy and safety of a carbohydrate restricted versus a low fat diet on weight loss, metabolic markers, body composition, and cardiac function tests in severely obese adolescents. STUDY DESIGN Subjects were randomly assigned to 1 of 2 diets: a high protein, low carbohydrate (20 g/d) diet (HPLC) or a low fat (30% of calories) regimen for 13 weeks; close monitoring was maintained to evaluate safety. After the intervention, no clinical contact was made until follow-up measurements were obtained at 24 and 36 weeks from baseline. The primary outcome was change in body mass index Z-score for age and sex (BMI-Z) at 13, 24, and 36 weeks. RESULTS Forty-six subjects (24 HPLC, 22 low fat) initiated and 33 subjects completed the intervention; follow-up data were available on approximately half of the subjects. Significant reduction in BMI-Z was achieved in both groups during the intervention and was significantly greater for the HPLC group (P = .03). Both groups maintained significant BMI-Z reduction at follow-up; changes were not significantly different between groups. Lean body mass was not spared in the HPLC group. No serious adverse effects were observed related to metabolic profiles, cardiac function, or subjective complaints. CONCLUSIONS The HPLC diet is a safe and effective option for medically supervised weight loss in severely obese adolescents.


Hepatology | 2007

Subcutaneous vitamin E ameliorates liver injury in an in vivo model of steatocholestasis

Jason S. Soden; Michael W. Devereaux; Joel E. Haas; Eric Gumpricht; Rolf Dahl; Jane Gralla; Maret G. Traber; Ronald J. Sokol

Several genetic metabolic liver diseases share the pathological features of combined steatosis and cholestasis, or steatocholestasis. The aims of this study were to develop and characterize an in vivo model for steatocholestasis and to evaluate the effects of an antioxidant treatment on liver injury, oxidative stress, and mitochondrial perturbations in this model. Obese and lean Zucker rats received intravenous (IV) injections of glycochenodeoxycholic acid (GCDC) and were killed 4 hours later. Liver enzymes were measured; the liver histology was assessed, and hepatic mitochondria were analyzed for mitochondrial lipid peroxidation. In separate experiments, rats received daily injections of subcutaneous (SQ) vitamin E before GCDC infusion. Bile acid–induced injury (serum AST and ALT and liver histology) was more severe in the obese rats than in the lean rats, characterized predominantly by extensive cell necrosis with minimal evidence of apoptosis. SQ vitamin E provided significant protection against IV GCDC‐induced hepatic injury, in vitro GCDC‐induced permeability transition, and cytochrome C and apoptosis‐inducing factor release from isolated mitochondria. Conclusion: Steatosis sensitizes the liver to bile acid–induced necrotic hepatocyte injury, which is responsive to vitamin E therapy. (HEPATOLOGY 2007.)


Hepatology | 2005

Comparison of indices of vitamin A status in children with chronic liver disease

Andrew P. Feranchak; Jane Gralla; Robert King; Rebecca O. Ramirez; Mary E. Corkill; Michael R. Narkewicz; Ronald J. Sokol

Malabsorption of fat‐soluble vitamins is a major complication of chronic cholestatic liver disease. The most accurate way to assess vitamin A status in children who have cholestasis is unknown. The goal of this study was to assess the accuracy of noninvasive tests to detect vitamin A deficiency. Children with chronic cholestatic liver disease (n = 23) and noncholestatic liver disease (n = 10) were studied. Ten cholestatic patients were identified as vitamin A–deficient based on the relative dose response (RDR). Compared with the RDR, the sensitivity and specificity to detect vitamin A deficiency for each test was, respectively: serum retinol, 90% and 78%; retinol‐binding protein (RBP), 40% and 91%; retinol/RBP molar ratio, 60% and 74%; conjunctival impression cytology, 44% and 48%; slit‐lamp examination, 20% and 66%; tear film break‐up time, 40% and 69%; and Schirmer's test, 20% and 78%. We developed a modified oral RDR via oral coadministration of d‐alpha tocopheryl polyethylene glycol‐1000 succinate and retinyl palmitate. This test had a sensitivity of 80% and a specificity of 100% to detect vitamin A deficiency. In conclusion, vitamin A deficiency is relatively common in children who have chronic cholestatic liver disease. Our data suggest that serum retinol level as an initial screen followed by confirmation with a modified oral RDR test is the most effective means of identifying vitamin A deficiency in these subjects. (HEPATOLOGY 2005;42:782–792.)


Transplantation | 2013

The impact of human leukocyte antigen mismatching on sensitization rates and subsequent retransplantation after first graft failure in pediatric renal transplant recipients.

Jane Gralla; Suhong Tong; Alexander C. Wiseman

Background U.S. allocation policies currently place less emphasis on human leukocyte antigen (HLA) matching in pediatric kidney transplant candidates to minimize dialysis time. The impact this may have on pediatric recipients after graft failure has not been extensively examined. Methods Using the Scientific Registry of Transplant Recipients database, we examined HLA sensitization after graft loss and regraft survival of all pediatric primary kidney transplant recipients younger than 18 years transplanted between 1990 and 2008, stratified by HLA-DR mismatch (MM) of first and second kidney transplant. Results Of 11,916 pediatric primary kidney transplant recipients, 2704 were relisted after first graft failure; 1847 received a retransplant, and 857 remained on the waiting list. Mean % panel reactive antibody increased from 6% to 45% for those retransplanted and from 8% to 76% for those on the waiting list. The degree of sensitization and waiting time to retransplantation increased with DR MM at first kidney transplantation. A 2 DR MM at first transplant was associated with a 20% reduction in the hazard of receiving a retransplant (hazard ratio, 0.80 for 2 vs. 0–1 DR MM; P<0.001). Five-year retransplant graft survival was associated with the number of HLA MM at first and second kidney transplant. Retransplant graft survival was similar whether a 0–1 DR MM living donor transplant followed a deceased donor transplant or the converse. Conclusion In pediatric recipients, an increasing number of initial HLA-DR MMs is associated with HLA sensitization, longer waiting time, a decreased rate of retransplant, and decreased regraft survival. Consideration of DR matching at first transplant may mitigate these risks.


Clinical Journal of The American Society of Nephrology | 2012

Simultaneous Pancreas Kidney Transplant versus Other Kidney Transplant Options in Patients with Type 2 Diabetes

Alexander C. Wiseman; Jane Gralla

BACKGROUND AND OBJECTIVES Current organ allocation policy prioritizes placement of kidneys (with pancreas) to patients listed for simultaneous pancreas-kidney transplantation (SPK). Patients with type 2 diabetes mellitus (T2DM) may undergo SPK, but it is unknown whether these patients enjoy a survival advantage with SPK versus deceased-donor kidney transplantation alone (DDKA) or living-donor kidney transplantation alone (LDKA). DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS Using the Scientific Registry of Transplant Recipients database, patients with T2DM, age 18-59 years, body mass index 18-30 kg/m(2), who underwent SPK, DDKA, or LDKA from 2000 through 2008 were identified. Five-year patient and kidney graft survival rates were compared, and multivariable analysis was performed to determine donor, recipient, and transplant factors influencing these outcomes. RESULTS Of 6416 patients identified, 4005, 1987, and 424 underwent DDKA, LDKA, and SPK, respectively. On unadjusted analysis, patient and kidney graft survival rates were superior for LDKA versus SPK, whereas patient but not graft survival was higher for SPK versus DDKA. On multivariable analysis, survival advantage for SPK versus DDKA was related not to pancreas transplantation but younger donor and recipient ages in the SPK cohort. CONCLUSIONS Good outcomes can occur with SPK in selected patients with T2DM, but no patient or graft survival advantage is provided by added pancreas transplantation compared with DDKA; outcomes were superior with LDKA. These results support cautious use of SPK in T2DM when LDKA is not an option, with close oversight of the effect of kidney (with pancreas) allocation priority over other transplant candidates.

Collaboration


Dive into Jane Gralla's collaboration.

Top Co-Authors

Scott W. Biggins
University of Colorado Denver

Kiran Bambha
University of Colorado Denver

James E. Cooper
University of Colorado Denver

Hugo R. Rosen
University of Colorado Denver

Patrick Klem
Anschutz Medical Campus

Scott R. Auerbach
Boston Children's Hospital