
Publications


Featured research published by Pratima Sharma.


American Journal of Transplantation | 2009

Survival Benefit‐Based Deceased‐Donor Liver Allocation

Douglas E. Schaubel; Mary K. Guidinger; Scott W. Biggins; John D. Kalbfleisch; Elizabeth A. Pomfret; Pratima Sharma; Robert M. Merion

Currently, patients awaiting deceased‐donor liver transplantation are prioritized by medical urgency. Specifically, wait‐listed chronic liver failure patients are sequenced in decreasing order of Model for End‐stage Liver Disease (MELD) score. To maximize lifetime gained through liver transplantation, posttransplant survival should be considered in prioritizing liver waiting list candidates. We evaluate a survival benefit‐based system for allocating deceased‐donor livers to chronic liver failure patients. Under the proposed system, at the time of offer, the transplant survival benefit score would be computed for each patient active on the waiting list. The proposed score is based on the difference in 5‐year mean lifetime (with vs. without a liver transplant) and accounts for patient and donor characteristics. The rank correlation between benefit score and MELD score is 0.67. There is great overlap in the distribution of benefit scores across MELD categories, since waiting list mortality is significantly affected by several factors other than MELD. Simulation results indicate that over 2,000 life‐years would be saved per year if benefit‐based allocation were implemented. The shortage of donor livers increases the need to maximize the life‐saving capacity of procured livers. Allocation of deceased‐donor livers to chronic liver failure patients would be improved by prioritizing patients by transplant survival benefit.
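The core quantity here, the difference in 5‐year mean lifetime, is the difference in restricted mean survival time between the with‐transplant and without‐transplant survival curves. A minimal Python sketch of that arithmetic; the exponential curves and rates below are illustrative placeholders, not the paper's covariate‐adjusted models:

```python
import numpy as np

def restricted_mean_survival(times, surv, horizon=5.0):
    """Area under a survival curve up to `horizon` years
    (the restricted mean lifetime, in years)."""
    t = np.asarray(times)
    s = np.asarray(surv)
    keep = t <= horizon
    return float(np.trapz(s[keep], t[keep]))

# Hypothetical survival curves for a single candidate over 0-5 years.
t = np.linspace(0.0, 5.0, 61)
s_with_lt = np.exp(-0.05 * t)     # survival with a transplant (illustrative)
s_without_lt = np.exp(-0.30 * t)  # survival on the waiting list (illustrative)

benefit = (restricted_mean_survival(t, s_with_lt)
           - restricted_mean_survival(t, s_without_lt))
print(f"5-year survival benefit: {benefit:.2f} life-years")
```

In the paper, both curves come from models fit to registry data and depend on patient and donor characteristics; the sketch only shows how two such curves reduce to a single benefit score.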


Gastroenterology | 2008

Re-weighting the Model for End-Stage Liver Disease Score Components

Pratima Sharma; Douglas E. Schaubel; Camelia S. Sima; Robert M. Merion; Anna S. Lok

BACKGROUND & AIMS Liver transplant candidates with mild hepatic synthetic dysfunction and marked renal insufficiency may have higher Model for End-Stage Liver Disease (MELD) scores than candidates with severe liver disease and normal renal function. We re-estimated MELD coefficients and evaluated the effect of the updated MELD score on liver transplant waiting list ranking. METHODS Scientific Registry of Transplant Recipients data were analyzed for 38,899 adults wait-listed between September 2001 and December 2006. A time-dependent Cox regression model for waiting list mortality was used to estimate updated MELD component coefficients. Rank correlation between existing and updated MELD scores was computed. RESULTS Existing MELD component coefficients (ln creatinine, 0.957 vs 1.266 [95% confidence interval (CI), 1.21-1.32]; ln bilirubin, 0.378 vs 0.939 [95% CI, 0.91-0.97]; ln international normalized ratio, 1.120 vs 1.658 [95% CI, 1.58-1.74]) were significantly different from their updated counterparts. The index of concordance was higher for updated MELD than existing MELD for predicting overall (0.68 vs. 0.64) and 90-day waiting list mortality (0.77 vs. 0.75). Rank correlation between existing and updated MELD scores was 0.95 for all candidates and 0.72 for candidates with existing MELD ≥20. Among candidates with equal existing MELD, those with lower creatinine and higher bilirubin had significantly higher waiting list mortality. CONCLUSIONS Existing MELD component coefficients are significantly different from those calculated from national waiting list data. Updated MELD assigns lower weight to creatinine and international normalized ratio and higher weight to bilirubin. Updated MELD better predicts waiting list mortality. Using updated MELD for liver allocation would alter waiting list candidate ranking.
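The coefficients quoted in the abstract lend themselves to a small worked example. A minimal Python sketch: the x10 scaling, 6.43 constant, and clamping of labs at 1.0 follow the usual UNOS MELD formula, which the abstract does not restate (assumption); normalizing each coefficient by the coefficient total is my own crude way of visualizing the re-weighting, not the paper's method:

```python
import math

# Coefficients of ln(lab value), taken from the abstract.
EXISTING = {"creatinine": 0.957, "bilirubin": 0.378, "inr": 1.120}
UPDATED  = {"creatinine": 1.266, "bilirubin": 0.939, "inr": 1.658}

def existing_meld(creatinine, bilirubin, inr):
    """Existing MELD on the familiar scale: 10 * (linear predictor + 0.643).
    Scaling, constant, and the lower clamp at 1.0 are the standard UNOS
    conventions, assumed here rather than quoted from the abstract."""
    labs = {"creatinine": creatinine, "bilirubin": bilirubin, "inr": inr}
    raw = sum(EXISTING[k] * math.log(max(v, 1.0)) for k, v in labs.items())
    return 10 * (raw + 0.643)

def relative_weights(coef):
    """Each component's share of total coefficient mass; a rough view of
    how re-estimation shifts weight among the three labs."""
    total = sum(coef.values())
    return {k: round(v / total, 3) for k, v in coef.items()}

print(existing_meld(3.0, 2.0, 1.5))  # candidate with marked renal insufficiency
print(relative_weights(EXISTING))    # creatinine and INR carry most of the score
print(relative_weights(UPDATED))     # bilirubin's share rises; creatinine's and INR's fall
```

The shifting shares mirror the abstract's conclusion: updated MELD moves weight from creatinine and INR toward bilirubin.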


Liver Transplantation | 2009

Renal outcomes after liver transplantation in the model for end-stage liver disease era

Pratima Sharma; Kathy Welch; Richard Eikstadt; Jorge A. Marrero; Robert J. Fontana; Anna S. Lok

The proportion of patients undergoing liver transplantation (LT) with renal insufficiency has significantly increased in the Model for End‐Stage Liver Disease (MELD) era. This study was designed to determine the incidence and predictors of post‐LT chronic renal failure (CRF) and its effect on patient survival in the MELD era. Outcomes of 221 adult LT recipients who had LT between February 2002 and February 2007 were reviewed retrospectively. Patients who were listed as status 1, were granted a MELD exception, or had living‐donor or multiorgan LT were excluded. Renal insufficiency at LT was defined as none to mild [estimated glomerular filtration rate (GFR) ≥ 60 mL/minute], moderate (30–59 mL/minute), or severe (<30 mL/minute). Post‐LT CRF was defined as an estimated GFR < 30 mL/minute persisting for 3 months, initiation of renal replacement therapy, or listing for renal transplantation. The median age was 54 years, 66% were male, 89% were Caucasian, and 43% had hepatitis C. At LT, the median MELD score was 20, and 6.3% were on renal replacement therapy. After a median follow‐up of 2.6 years (range, 0.01–5.99), 31 patients developed CRF, with a 5‐year cumulative incidence of 22%. GFR at LT was the only independent predictor of post‐LT CRF (hazard ratio = 1.33, P < 0.001). The overall post‐LT patient survival was 74% at 5 years. Patients with MELD ≥ 20 at LT had a higher cumulative incidence of post‐LT CRF in comparison with patients with MELD < 20 (P = 0.03). A decrease in post‐LT GFR over time was the only independent predictor of survival. In conclusion, post‐LT CRF is common in the MELD era, with a 5‐year cumulative incidence of 22%. Low GFR at LT was predictive of post‐LT CRF, and a decrease in post‐LT GFR over time was associated with decreased post‐LT survival. Further studies of modifiable preoperative, perioperative, and postoperative factors influencing renal function are needed to improve outcomes following LT. Liver Transpl 15:1142–1148, 2009.
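The study's eGFR cutoffs map directly to a small classifier. A minimal sketch using exactly the thresholds defined above (the function name is mine, not the paper's):

```python
def renal_insufficiency_category(egfr_ml_min: float) -> str:
    """Classify renal insufficiency at transplant by estimated GFR
    (mL/minute), using the study's cutoffs."""
    if egfr_ml_min >= 60:
        return "none to mild"
    if egfr_ml_min >= 30:
        return "moderate"
    return "severe"

for egfr in (75, 45, 20):
    print(egfr, "->", renal_insufficiency_category(egfr))
```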


Liver Transplantation | 2007

Sustained virologic response to therapy of recurrent hepatitis C after liver transplantation is related to early virologic response and dose adherence

Pratima Sharma; Jorge A. Marrero; Robert J. Fontana; Joel K. Greenson; Hari S. Conjeevaram; Grace L. Su; Frederick K. Askari; Patricia Sullivan; Anna S. Lok

Sustained virologic response (SVR) after antiviral therapy for recurrent hepatitis C virus (HCV) infection in liver transplant (LT) recipients is consistently lower than that achieved in non‐LT patients. We evaluated the efficacy and safety of pegylated interferon (IFN) and ribavirin (RBV) therapy in LT recipients with recurrent HCV and the factors associated with SVR. All subjects with histologic evidence of recurrent HCV were intended to be treated for 48 weeks with full‐dose pegylated IFN; the target dose of RBV was 800 mg/day. Thirty‐five LT recipients with recurrent HCV, median age 48.5 years, 77% genotype 1, and median pretreatment HCV RNA 6.4 log10 IU/mL, were treated between January 2000 and February 2006. Antiviral therapy was discontinued prematurely in 15 subjects as a result of adverse events. Median overall treatment duration was 46 weeks. An early virologic response at week 12 was seen in 17 patients (49%) and an end‐of‐treatment virological response in 19 (54%). SVR was achieved in 13 patients (37%), and all 9 patients followed for >1 year after treatment had a durable response. Patients with SVR had significantly lower pretreatment HCV RNA (5.7 vs. 6.5 log10 IU/mL, P = 0.003), were more likely to have a week 12 virological response (85% vs. 27%, P = 0.0009), and received higher cumulative doses of pegylated IFN (75% vs. 33%, P = 0.029) and RBV (90% vs. 26%, P = 0.016) compared with patients whose disease did not respond to therapy. In conclusion, SVR was achieved in 37% of patients with recurrent hepatitis C after LT. As in non‐LT patients, those with lower pretreatment HCV RNA, a week 12 virological response, and pegylated IFN and RBV dose adherence were more likely to achieve SVR. Liver Transpl, 2007.


American Journal of Transplantation | 2011

Impact of MELD-Based Allocation on End-Stage Renal Disease after Liver Transplantation

Pratima Sharma; Douglas E. Schaubel; Mary K. Guidinger; Nathan P. Goodrich; A. O. Ojo; Robert M. Merion

The proportion of patients undergoing liver transplantation (LT) with concomitant renal dysfunction markedly increased after allocation by the model for end‐stage liver disease (MELD) score was introduced. We examined the incidence of subsequent post‐LT end‐stage renal disease (ESRD) before and after the policy was implemented. Data on all adult deceased donor LT recipients between April 27, 1995 and December 31, 2008 (n = 59,242), from the Scientific Registry of Transplant Recipients, were linked with Centers for Medicare & Medicaid Services' ESRD data. Cox regression was used to (i) compare pre‐MELD and MELD eras with respect to post‐LT ESRD incidence, (ii) determine the risk factors for post‐LT ESRD and (iii) quantify the association between ESRD incidence and mortality. Crude rates of post‐LT ESRD were 12.8 and 14.5 per 1000 patient‐years in the pre‐MELD and MELD eras, respectively. Covariate‐adjusted post‐LT ESRD risk was higher in the MELD era (hazard ratio [HR] = 1.15; p = 0.0049). African American race, hepatitis C, pre‐LT diabetes, higher creatinine, lower albumin, lower bilirubin and sodium >141 mmol/L at LT were also significant predictors of post‐LT ESRD. Post‐LT ESRD was associated with higher post‐LT mortality (HR = 3.32; p < 0.0001). The risk of post‐LT ESRD, a strong predictor of post‐LT mortality, is 15% higher in the MELD era. This study identified potentially modifiable risk factors for post‐LT ESRD. Early intervention and modification of these risk factors may reduce the burden of post‐LT ESRD.
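The crude rates quoted above are simple arithmetic: events divided by total follow-up time, scaled to 1,000 patient-years. A sketch with hypothetical counts chosen only to reproduce the reported figures (the abstract reports the rates, not the underlying counts):

```python
def crude_rate_per_1000(events: int, patient_years: float) -> float:
    """Crude incidence rate per 1,000 patient-years of follow-up."""
    return 1000 * events / patient_years

# Hypothetical event counts and follow-up totals, for illustration only.
print(crude_rate_per_1000(events=640, patient_years=50_000))  # 12.8 (pre-MELD era)
print(crude_rate_per_1000(events=725, patient_years=50_000))  # 14.5 (MELD era)
```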


Hepatology | 2012

End-stage liver disease candidates at the highest model for end-stage liver disease scores have higher wait-list mortality than status-1A candidates

Pratima Sharma; Douglas E. Schaubel; Qi Gong; Mary K. Guidinger; Robert M. Merion

Candidates with fulminant hepatic failure (Status‐1A) receive the highest priority for liver transplantation (LT) in the United States. However, no studies have compared wait‐list mortality risk among end‐stage liver disease (ESLD) candidates with high Model for End‐Stage Liver Disease (MELD) scores to those listed as Status‐1A. We aimed to determine if there are MELD scores for ESLD candidates at which their wait‐list mortality risk is higher than that of Status‐1A, and to identify the factors predicting wait‐list mortality among those who are Status‐1A. Data were obtained from the Scientific Registry of Transplant Recipients for adult LT candidates (n = 52,459) listed between September 1, 2001, and December 31, 2007. Candidates listed for repeat LT as Status‐1A were excluded. Starting from the date of wait listing, candidates were followed for 14 days or until the earliest occurrence of death, transplant, or granting of an exception MELD score. ESLD candidates were categorized by MELD score, with a separate category for those with calculated MELD > 40. We compared wait‐list mortality between each MELD category and Status‐1A (reference) using time‐dependent Cox regression. ESLD candidates with MELD > 40 had almost twice the wait‐list mortality risk of Status‐1A candidates, with a covariate‐adjusted hazard ratio of 1.96 (P = 0.004). There was no difference in wait‐list mortality risk for candidates with MELD 36‐40 and Status‐1A, whereas candidates with MELD < 36 had significantly lower mortality risk than Status‐1A candidates. MELD score did not significantly predict wait‐list mortality among Status‐1A candidates (P = 0.18). Among Status‐1A candidates with acetaminophen toxicity, MELD was a significant predictor of wait‐list mortality (P < 0.0009). Posttransplant survival was similar for Status‐1A and ESLD candidates with MELD > 20 (P = 0.6). Conclusion: Candidates with MELD > 40 have significantly higher wait‐list mortality than, and posttransplant survival similar to, candidates who are Status‐1A, and should therefore be assigned higher priority than Status‐1A for allocation. Because ESLD candidates with MELD 36‐40 and Status‐1A have similar wait‐list mortality risk and posttransplant survival, these candidates should be assigned similar rather than sequential priority for deceased donor LT. (Hepatology 2012)


American Journal of Transplantation | 2006

Reduced priority MELD score for hepatocellular carcinoma does not adversely impact candidate survival awaiting liver transplantation

Pratima Sharma; Ann M. Harper; Jose L. Hernandez; Thomas G. Heffron; David C. Mulligan; Russell H. Wiesner; Vijayan Balan

The liver organ allocation policy of the United Network for Organ Sharing (UNOS) is based on the model for end‐stage liver disease (MELD). The policy provides additional priority for candidates with hepatocellular carcinoma (HCC) who are awaiting deceased donor liver transplantation (DDLT). However, this priority was reduced on February 27, 2003 to a MELD of 20 for stage T1 and of 24 for stage T2 HCC. The aim of this study was to determine the impact of the reduced priority on HCC candidate survival while on the waiting list. The UNOS database was reviewed for all HCC candidates listed after February 27, 2002. The HCC candidates were grouped into two time periods: MELD 1 (listed between February 27, 2002 and February 26, 2003) and MELD 2 (listed between February 27, 2003 and February 26, 2004). For the two time periods, the national DDLT incidence rates for HCC patients were 1.44 versus 1.53 DDLT per person‐year (p = NS), and the waiting times were similar (138.0 ± 196.8 vs. 129.0 ± 133.8 days; p = NS). Furthermore, the 3‐, 6‐, and 12‐month candidate survival, patient survival, and dropout rates were also similar nationally. Regional differences in rates of DDLT for HCC were observed during both MELD periods. Consequently, the reduced MELD score for stage T1 and T2 HCC candidates awaiting DDLT has not had an impact nationally, either on their survival on the waiting list or on their ability to obtain a liver transplant within a reasonable time frame. However, regional variations point to the need for reform in how organs are allocated for HCC at the regional level.


Transplant International | 2011

Evidence-based development of liver allocation: a review

Robert M. Merion; Pratima Sharma; Douglas E. Schaubel

Liver transplantation has undergone a rapid evolution from a high‐risk experimental procedure to a mainstream therapy for thousands of patients with a wide range of hepatic diseases. Its increasing success has been accompanied by progressive imbalance between organ donor supply and the patients who might benefit. Where demand outstrips supply in transplantation, a system of organ allocation is inevitably required to make the wisest use of the available, but scarce, organs. Early attempts to rationally allocate donor livers were particularly hampered by lack of available and suitable data, leading to imperfect solutions that created or exacerbated inequities in the system. The advent and maturation of evidence‐based predictors of waiting list mortality risk led to more objective criteria for liver allocation, aided by the increasing availability of data on large numbers of patients. Until now, the vast majority of allocation systems for liver transplantation have relied on estimation of waiting list mortality. Evidence‐based allocation systems that incorporate measures of post‐transplant outcomes are conceptually attractive and these transplant benefit‐based allocation systems have been developed, modeled, and subjected to computer simulation. Future implementations of benefit‐based liver allocation await continued refinement and additional debate in the transplant community.


Liver Transplantation | 2009

Effect of pretransplant serum creatinine on the survival benefit of liver transplantation

Pratima Sharma; Douglas E. Schaubel; Mary K. Guidinger; Robert M. Merion

More candidates with creatinine levels ≥ 2 mg/dL have undergone liver transplantation (LT) since the implementation of Model for End‐Stage Liver Disease (MELD)–based allocation. These candidates have higher posttransplant mortality. This study examined the effect of serum creatinine on survival benefit among candidates undergoing LT. Scientific Registry of Transplant Recipients data were analyzed for adult LT candidates listed between September 2001 and December 2006 (n = 38,899). The effect of serum creatinine on survival benefit (contrast between waitlist and post‐LT mortality rates) was assessed by sequential stratification, an extension of Cox regression. At the same MELD score, serum creatinine was inversely associated with survival benefit within certain defined MELD categories. The survival benefit significantly decreased as creatinine increased for candidates with MELD scores of 15 to 17 or 24 to 40 at LT (MELD scores of 15‐17, P < 0.0001; MELD scores of 24‐40, P = 0.04). Renal replacement therapy at LT was also associated with significantly decreased LT benefit for patients with MELD scores of 21 to 23 (P = 0.04) or 24 to 26 (P = 0.01). In conclusion, serum creatinine at LT significantly affects survival benefit for patients with MELD scores of 15 to 17 or 24 to 40. Given the same MELD score, patients with higher creatinine levels receive less benefit on average, and the relative ranking of a large number of wait‐listed candidates with MELD scores of 15 to 17 or 24 to 40 would be markedly affected if these findings were incorporated into the allocation policy. Liver Transpl 15:1808–1813, 2009.


Clinical Journal of The American Society of Nephrology | 2013

Short-Term Pretransplant Renal Replacement Therapy and Renal Nonrecovery after Liver Transplantation Alone

Pratima Sharma; Nathan P. Goodrich; Min Zhang; Mary K. Guidinger; Douglas E. Schaubel; Robert M. Merion

BACKGROUND AND OBJECTIVES Candidates with AKI, including hepatorenal syndrome, often recover renal function after successful liver transplantation (LT). This study examined the incidence of and risk factors associated with renal nonrecovery within 6 months of LT alone among those receiving acute renal replacement therapy (RRT) before LT. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS Scientific Registry of Transplant Recipients data were linked with Centers for Medicare and Medicaid Services ESRD data for 2112 adult deceased-donor LT-alone recipients who received acute RRT for ≤90 days before LT (February 28, 2002 to August 31, 2010). The primary outcome was renal nonrecovery (post-LT ESRD), defined as transition to chronic dialysis, waitlisting for kidney transplant, or receipt of a kidney transplant within 6 months of LT. Cumulative incidence of renal nonrecovery was calculated using competing risk analysis. Cox regression identified recipient and donor predictors of renal nonrecovery. RESULTS The cumulative incidence of renal nonrecovery after LT alone among those receiving pre-LT acute RRT was 8.9%. Adjusted renal nonrecovery risk increased by 3.6% per day of pre-LT RRT (P<0.001). Age at LT (per 5 years; P=0.02), previous LT (P=0.01), and pre-LT diabetes (P<0.001) were also significant risk factors for renal nonrecovery. Twenty-one percent of recipients died within 6 months of LT. Duration of pretransplant RRT did not predict 6-month post-transplant mortality. CONCLUSIONS Among recipients on acute RRT before LT who survived after LT alone, the majority recovered renal function within 6 months of LT. Longer pre-LT RRT duration, advanced age, diabetes, and repeat LT were significantly associated with increased risk of renal nonrecovery.
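The competing risk analysis mentioned above accounts for the fact that death precludes ever observing renal nonrecovery, so each nonrecovery event is weighted by the probability of still being alive and event-free just before it. A simplified Aalen-Johansen-style sketch with entirely hypothetical data and one-event-at-a-time tie handling; the study's actual analysis used registry follow-up:

```python
import numpy as np

def cumulative_incidence(time, event):
    """Nonparametric cumulative incidence of event type 1 (renal
    nonrecovery) with event type 2 (death) as a competing risk;
    event 0 means censored."""
    order = np.argsort(time)
    time = np.asarray(time)[order]
    event = np.asarray(event)[order]
    at_risk = len(time)
    event_free = 1.0  # probability of no event of any type so far
    cif = 0.0
    curve = []
    for t, e in zip(time, event):
        if e == 1:
            cif += event_free / at_risk      # S(t-) * cause-specific hazard
        if e in (1, 2):
            event_free *= 1.0 - 1.0 / at_risk  # any event lowers event-free survival
        at_risk -= 1                           # censored subjects just leave the risk set
        curve.append((float(t), cif))
    return curve

# Hypothetical follow-up (months) and outcomes for six recipients:
times = [1, 2, 2, 4, 5, 6]
events = [2, 1, 0, 1, 0, 0]  # 1 = renal nonrecovery, 2 = death, 0 = censored
for t, c in cumulative_incidence(times, events):
    print(f"t={t}: CIF={c:.3f}")
```

Note how the death at t=1 reduces the increment contributed by later nonrecovery events, which is exactly what a naive Kaplan-Meier of nonrecovery alone would get wrong.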

Collaboration


Dive into Pratima Sharma's collaborations.

Top Co-Authors

Anna S. Lok (University of Michigan)
Grace L. Su (University of Michigan)