Publication


Featured research published by Linda W. Jennings.


Gastroenterology | 2010

Aging of hepatitis C virus (HCV)-infected persons in the United States: a multiple cohort model of HCV prevalence and disease progression.

Gary L. Davis; Miriam J. Alter; Hashem B. El-Serag; T. Poynard; Linda W. Jennings

BACKGROUND & AIMS: The prevalence of chronic hepatitis C (CH-C) remains high and the complications of infection are common. Our goal was to project the future prevalence of CH-C and its complications.

METHODS: We developed a multicohort natural history model to overcome limitations of previous models for predicting disease outcomes and benefits of therapy.

RESULTS: Prevalence of CH-C peaked in 2001 at 3.6 million. Fibrosis progression was inversely related to age at infection, so cirrhosis and its complications were most common after the age of 60 years, regardless of when infection occurred. The proportion of CH-C with cirrhosis is projected to reach 25% in 2010 and 45% in 2030, although the total number with cirrhosis will peak at 1.0 million (30.5% higher than the current level) in 2020 and then decline. Hepatic decompensation and liver cancer will continue to increase for another 10 to 13 years. Treatment of all infected patients in 2010 could reduce risk of cirrhosis, decompensation, cancer, and liver-related deaths by 16%, 42%, 31%, and 36% by 2020, given current response rates to antiviral therapy.

CONCLUSIONS: Prevalence of hepatitis C cirrhosis and its complications will continue to increase through the next decade and will mostly affect those older than 60 years of age. Current treatment patterns will have little effect on these complications, but wider application of antiviral treatment and better responses with new agents could significantly reduce the impact of this disease in coming years.


Liver Transplantation | 2004

Estimation of glomerular filtration rates before and after orthotopic liver transplantation: evaluation of current equations.

Thomas A. Gonwa; Linda W. Jennings; Martin L. Mai; Paul Stark; Andrew S. Levey; Goran B. Klintmalm

The ability to estimate rather than measure the glomerular filtration rate (GFR) in patients before and after liver transplantation would be helpful in estimating risk, dosing drugs, and assessing long‐term toxicity of calcineurin inhibitors. Currently available equations for estimating the GFR have not been validated in either the pre‐ or post‐liver transplant population. We have evaluated the performance of currently used formulas for the estimation of the GFR in this setting. Data were collected prospectively on patients who underwent liver transplantation between 1984 and 2001. GFR per 1.73 m² was measured by 125I-iothalamate in patients at the pretransplant evaluation and at 3 months, 1 year, and yearly posttransplant thereafter. GFR estimated by the Cockcroft‐Gault equation, the Nankivell equation, and the equations from the Modification of Diet in Renal Disease (MDRD) Study (6, 5, and 4 variables) was compared with the measured GFR. Pretransplant GFR was available in 1,447 patients. The mean GFR was 90.7 ± 40.5 mL/min. Values for r and r² were highest for the MDRD Study 6‐variable equation (0.70 and 0.49, respectively). Only 66% of estimates were within 30% of the measured GFR. At 3 months, 1 year, and 5 years posttransplant, the mean GFR was 59.5 ± 27.1 mL/min, 62.7 ± 27.8 mL/min, and 55.3 ± 26.1 mL/min, respectively. Values for r and r² for the MDRD Study 6‐variable equation at 1 and 5 years posttransplant were 0.74 (0.55) and 0.76 (0.58), respectively. At these time points, however, only 67% and 64% of the estimated GFRs were within 30% of the measured GFR. The MDRD Study equations had greater precision than other equations, but the precision was lower than reported for MDRD estimation of GFR in other populations. Better methods for estimating the GFR are required for evaluation of renal function before and after liver transplantation. (Liver Transpl 2004;10:301–309.)
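For readers unfamiliar with the bedside formulas being compared, two of them can be written out directly. The Python sketch below shows the Cockcroft-Gault creatinine clearance and the 4-variable MDRD Study equation; the function names are ours, and the 186 coefficient is the original pre-IDMS form of MDRD in use around the time of this study (an assumption, not something stated in the abstract):

```python
def cockcroft_gault(age_yr, weight_kg, scr_mg_dl, female):
    """Cockcroft-Gault creatinine clearance, in mL/min (not BSA-normalized)."""
    clearance = (140.0 - age_yr) * weight_kg / (72.0 * scr_mg_dl)
    return clearance * 0.85 if female else clearance

def mdrd4(age_yr, scr_mg_dl, female, black):
    """4-variable MDRD Study eGFR, in mL/min per 1.73 m^2
    (original 186 coefficient)."""
    egfr = 186.0 * scr_mg_dl ** -1.154 * age_yr ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr
```

For example, a 60-year-old, 72-kg man with a serum creatinine of 1.0 mg/dL has a Cockcroft-Gault estimate of exactly 80 mL/min. Note the unit mismatch such a comparison must handle: Cockcroft-Gault yields mL/min, while the MDRD estimate and the measured iothalamate GFR are normalized per 1.73 m² of body surface area.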


Journal of Parenteral and Enteral Nutrition | 1995

Early enteral nutrition support in patients undergoing liver transplantation

Jeanette Hasse; Linda S. Blue; George U. Liepa; Robert M. Goldstein; Linda W. Jennings; Eytan Mor; Bo S. Husberg; Marlon F. Levy; Thomas A. Gonwa; Goran B. Klintmalm

BACKGROUND: The purpose of this study was to determine the effects of early postoperative tube feeding on outcomes of liver transplant recipients.

METHODS: Fifty transplant patients were randomized prospectively to receive enteral formula via nasointestinal feeding tubes (tube-feeding [TF] group) or maintenance i.v. fluid until oral diets were initiated (control group). Thirty-one patients completed the study. Resting energy expenditure, nitrogen balance, and grip strength were measured on days 2, 4, 7, and 12 after liver transplantation. Calorie and protein intakes were calculated for 12 days posttransplant.

RESULTS: Tube feeding was tolerated in the TF group (n = 14). The TF patients had greater cumulative 12-day nutrient intakes (22,464 +/- 3554 kcal, 927 +/- 122 g protein) than did the control patients (15,474 +/- 5265 kcal, 637 +/- 248 g protein) (p < .002). Nitrogen balance was better in the TF group on posttransplant day 4 than in the control group (p < .03). There was a rise in the overall mean resting energy expenditure in the first two posttransplant weeks, from 1487 +/- 338 to 1990 +/- 367 kcal (p = .0002). Viral infections occurred in 17.7% of control patients compared with 0% of TF patients (p = .05). Although other infections tended to occur more frequently in the control group vs the TF group (bacterial, 29.4% vs 14.3%; overall infections, 47.1% vs 21.4%), these differences were not statistically significant. Early posttransplant tube feeding did not influence hospitalization costs, hours on the ventilator, lengths of stay in the intensive care unit and hospital, rehospitalizations, or rejection during the first 21 posttransplant days.

CONCLUSIONS: Early posttransplant tube feeding was tolerated, promoted improvements in some outcomes, and should be considered for all liver transplant patients.


Liver Transplantation | 2007

Expanded criteria for liver transplantation in patients with hepatocellular carcinoma: a report from the International Registry of Hepatic Tumors in Liver Transplantation

Nicholas Onaca; Gary L. Davis; Robert M. Goldstein; Linda W. Jennings; Goran B. Klintmalm

Hepatocellular carcinoma (HCC) is a common indication for liver transplantation (LT). Currently, deceased donor LT is approved by the United Network for Organ Sharing for patients with HCC who meet the Milan criteria of a single tumor up to 5 cm or up to 3 tumors up to 3 cm as determined by imaging studies. We analyzed data in the International Registry of Hepatic Tumors in Liver Transplantation from 1,206 patients with HCC. Tumor size and number were determined by gross pathologic examination. Kaplan‐Meier recurrence‐free survival in patients with a single tumor ≤5 cm or 2‐3 lesions all ≤3 cm in diameter was 84.7% at 1 year and 61.8% at 5 years. Overall, patients whose tumor or tumors exceeded these limits had worse survival (67.2% at 1 year and 42.8% at 5 years, P < 0.001); however, not all patients in this group did poorly. Patients with 2‐4 tumors ≤5 cm or single lesions ≤6 cm had recurrence‐free survival equivalent to patients with a single tumor of 3.1‐5.0 cm or 2‐3 lesions all ≤3 cm in diameter. These data suggest that current criteria for selecting tumor patients for LT may be too restrictive and could be expanded. Liver Transpl 13:391–399, 2007.
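The size-and-number rules being compared reduce to simple predicates on the pathologic tumor measurements. Below is a minimal Python sketch of the Milan criteria and of the expanded limits suggested by this registry analysis (the function names are ours; sizes are maximum tumor diameters in cm):

```python
def meets_milan(tumor_sizes_cm):
    """Milan criteria: a single tumor <= 5 cm, or 2-3 tumors each <= 3 cm."""
    n = len(tumor_sizes_cm)
    if n == 1:
        return tumor_sizes_cm[0] <= 5.0
    return 2 <= n <= 3 and max(tumor_sizes_cm) <= 3.0

def meets_expanded(tumor_sizes_cm):
    """Expanded limits examined in the registry report:
    a single lesion <= 6 cm, or 2-4 tumors each <= 5 cm."""
    n = len(tumor_sizes_cm)
    if n == 1:
        return tumor_sizes_cm[0] <= 6.0
    return 2 <= n <= 4 and max(tumor_sizes_cm) <= 5.0
```

A patient with two 4-cm lesions, for instance, fails Milan but meets the expanded limits; that is the group the registry found to have recurrence-free survival comparable to Milan-eligible patients.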


Liver Transplantation | 2009

Acute kidney injury following liver transplantation: Definition and outcome

Yousri M. Barri; Edmund Q. Sanchez; Linda W. Jennings; Larry Melton; Steven R. Hays; Marlon F. Levy; Goran B. Klintmalm

The incidence of acute kidney injury (AKI) has been reported to vary between 17% and 95% post–orthotopic liver transplantation. This variability may be related to the absence of a uniform definition of AKI in this setting. The purpose of this study was to identify the degree of AKI that is associated with long‐term adverse outcome. Furthermore, to determine the best definition (for use in future studies) of AKI not requiring dialysis in post–liver transplant patients, we retrospectively reviewed the effect of 3 definitions of AKI post–orthotopic liver transplantation on renal and patient outcome between 1997 and 2005. We compared patients with AKI to a control group without AKI by each definition. AKI was defined in 3 groups as an acute rise in serum creatinine, from the pretransplant baseline, of >0.5 mg/dL, >1.0 mg/dL, or >50% above baseline to a value above 2 mg/dL. In all groups, the glomerular filtration rate was significantly lower at both 1 and 2 years post‐transplant, and patient and graft survival were worse. The incidence of AKI was highest in the group with a rise in creatinine of >0.5 mg/dL (78%) and lowest in patients with a rise in creatinine of >50% above 2.0 mg/dL (14%). Even mild AKI, defined as a rise in serum creatinine of >0.5 mg/dL, was associated with reduced patient and graft survival. However, in comparison with the other definitions, the definition of AKI with the greatest impact on patient outcome post–liver transplant was a rise in serum creatinine of >50% above baseline to >2 mg/dL. Liver Transpl 15:475–483, 2009.
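The three creatinine-based definitions compared here are easy to state precisely. A small Python sketch (the function name and result keys are illustrative; creatinine values in mg/dL):

```python
def aki_by_definition(baseline_scr, peak_scr):
    """Apply the study's three AKI definitions to a baseline and peak
    serum creatinine (mg/dL); returns one boolean per definition."""
    rise = peak_scr - baseline_scr
    return {
        "rise > 0.5 mg/dL": rise > 0.5,
        "rise > 1.0 mg/dL": rise > 1.0,
        "> 50% above baseline and > 2 mg/dL":
            rise > 0.5 * baseline_scr and peak_scr > 2.0,
    }
```

A rise from 1.0 to 1.6 mg/dL, for example, counts as AKI only under the loosest definition, which is consistent with that definition capturing 78% of patients while the strictest captured 14%.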


Liver Transplantation | 2010

Nonalcoholic fatty liver disease after liver transplantation for cryptogenic cirrhosis or nonalcoholic fatty liver disease

Kanthi Yalamanchili; Sherif Saadeh; G. Klintmalm; Linda W. Jennings; Gary L. Davis

Nonalcoholic steatohepatitis (NASH) may account for many cases of cryptogenic cirrhosis. If so, then steatosis might recur after liver transplantation. Two thousand fifty‐two patients underwent primary liver transplantation for chronic liver disease between 1986 and 2004. Serial liver biopsy samples were assessed for steatosis and fibrosis. Two hundred fifty‐seven patients (12%) had a pretransplant diagnosis of cryptogenic cirrhosis (239) or NASH (18). Fatty liver developed in 31% and was more common when the pretransplant diagnosis was NASH (45% at 5 years versus 23% for cryptogenic cirrhosis, P = 0.007). NASH developed in only 4% and occurred exclusively when steatosis had already occurred. Steatosis after liver transplantation was associated with the baseline body weight and body mass index by univariate analyses, but no pretransplant or posttransplant characteristic independently predicted steatosis after liver transplantation because obesity was so common in all groups. Five percent and 10% developed bridging fibrosis or cirrhosis after 5 and 10 years, respectively, and this was more common after NASH (31%) than in those who developed steatosis alone (6%) or had no fat (3%, P = 0.002). One‐, 5‐, and 10‐year survival was the same in patients who underwent transplantation for cryptogenic cirrhosis or NASH (86%, 71%, and 56%) and in patients who underwent transplantation for other indications (86%, 71%, and 53%; not significant), but death was more often due to cardiovascular disease and less likely from recurrent liver disease. In conclusion, fatty liver is common after liver transplantation for cryptogenic cirrhosis or NASH but is twice as common in the latter group; this suggests that some cryptogenic cirrhosis, but perhaps not all, is caused by NASH. Posttransplant NASH is unusual, and steatosis appears to be a prerequisite. Advanced fibrosis is uncommon, and survival is the same as that of patients who undergo transplantation for other causes. 
Liver Transpl 16:431‐439, 2010.


American Journal of Transplantation | 2013

De Novo Donor-Specific HLA Antibodies Decrease Patient and Graft Survival in Liver Transplant Recipients

Hugo Kaneku; Jacqueline G. O'Leary; N. Banuelos; Linda W. Jennings; Brian M. Susskind; Goran B. Klintmalm; Paul I. Terasaki

The role of de novo donor‐specific HLA antibodies (DSA) in liver transplantation remains unknown as most of the previous studies have only focused on preformed HLA antibodies. To understand the significance of de novo DSA, we designed a retrospective cohort study of 749 adult liver transplant recipients with pre‐ and posttransplant serum samples that were analyzed for DSA. We found that 8.1% of patients developed de novo DSA 1 year after transplant; almost all de novo DSAs were against HLA class II antigens, and the majority were against DQ antigens. In multivariable modeling, the use of cyclosporine (as opposed to tacrolimus) and low calcineurin inhibitor levels increased the risk of de novo DSA formation, while a calculated MELD score >15 at transplant and recipient age >60 years old reduced the risk. Multivariable analysis also demonstrated that patients with de novo DSA at 1‐year had significantly lower patient and graft survival. In conclusion, we demonstrate that de novo DSA development after liver transplantation is an independent risk factor for patient death and graft loss.


American Journal of Transplantation | 2011

High mean fluorescence intensity donor-specific anti-HLA antibodies associated with chronic rejection post–liver transplant

J. G. O’Leary; Hugo Kaneku; Brian M. Susskind; Linda W. Jennings; Michael A. Neri; Gary L. Davis; Goran B. Klintmalm; Paul I. Terasaki

In contrast to kidney transplantation, where donor‐specific anti‐HLA antibodies (DSA) negatively impact graft survival, the correlation of DSA with clinical outcomes after orthotopic liver transplantation (OLT) has not been clearly established. We hypothesized that DSA are present in patients who develop chronic rejection after OLT. Prospectively collected serial serum samples from 39 primary OLT patients with biopsy‐proven chronic rejection and 39 comparator patients were blinded and analyzed for DSA using the LABScreen® single antigen bead test, with a mean fluorescence intensity (MFI) of 1000 considered positive. In study patients, the median graft survival was 15 months, 74% received at least one retransplant, 20% remain alive, and 87% had at least one episode of acute rejection. This is in contrast to comparator patients, of whom 69% remain alive and none needed retransplant or experienced rejection. Thirty‐six chronic rejection patients (92%) and 24 (61%) comparator patients had DSA (p = 0.003), and chronic rejection patients had higher‐MFI DSA than comparator patients. Although a further study with larger numbers of patients is needed to identify clinically significant thresholds, there is an association of high‐MFI DSA with chronic rejection after OLT.


Transplantation | 1994

HLA compatibility and liver transplant outcome: improved patient survival by HLA and cross-matching

A. Nikaein; L. Backman; Linda W. Jennings; M. F. Levy; Robert M. Goldstein; Thomas A. Gonwa; Marvin J. Stone; Goran B. Klintmalm

In liver transplantation (LTx), numerous studies have failed to demonstrate an adverse effect of HLA-A,B,DR incompatibility or of a donor-specific positive cross-match on recipient survival. In this study, we examined the effect of antidonor cytotoxic antibody and HLA compatibility in 800 LTx recipients with CsA-based immunosuppression. Thirty-four of 482 (7%) recipients were transplanted across a positive donor-specific T cell cross-match. Four-year patient and graft survival was 71% and 67%, respectively, in negative cross-match recipients and 53% and 50%, respectively, in positive cross-match recipients (P=0.0051 and P=0.023). Neither a B cell-positive cross-match nor the presence of panel reactive antibody (PRA) had an adverse impact on liver allograft outcome. Interestingly, 21/58 (36.2%) patients with PRA ≥ 10% had a positive T cell cross-match, whereas only 7/382 (1.8%) patients with PRA < 10% did (P<0.0001). This indicates the predictive value of PRA for cross-match results. B lymphocyte cross-match results also were strongly correlated with the presence of PRA, as 26/57 (45.6%) of the patients with PRA ≥ 10% had a positive cross-match, whereas only 22/394 (5.6%) with PRA < 10% did (P<0.0001). Analysis of HLA compatibility demonstrated a significant impact on patient survival, comparing only 0–2 vs. 6 HLA-A+B+DR mismatches and 0 vs. 1 vs. 2 HLA-DR mismatches. The 4-year patient survival rate for 0 to 2 antigen mismatches was 86%, whereas for 6 antigen mismatches it was 62% (P=0.025). Overall actuarial 4-year patient survival in the HLA-DR-mismatched groups (0 vs. 1 vs. 2) was 84%, 73%, and 64%, respectively (P=0.033). In no mismatch category was the graft survival rate significantly different. Sepsis or rejection was the cause of graft loss in 1/10 (10%), 21/75 (28%), and 34/85 (40%) patients with 0, 1, and 2 HLA-DR mismatches, respectively.
The difference between patient and graft survival was accounted for by survival after retransplantation, which was lower in patients with more HLA-DR mismatches in primary transplants. The latter group received intensive immunosuppressive therapy during the first month after primary transplantation, as compared with those with fewer HLA-DR mismatches (P=0.04).


Annals of Surgery | 2001

The elderly liver transplant recipient: a call for caution

Marlon F. Levy; Ponnandai S. Somasundar; Linda W. Jennings; Ghap Jung; Ernesto P. Molmenti; Carlos G. Fasola; Robert M. Goldstein; Thomas A. Gonwa; Goran B. Klintmalm

Objective: To determine whether liver transplantation is judicious in recipients older than 60 years of age.

Summary Background Data: The prevailing opinion among the transplant community remains that elderly recipients of liver allografts fare as well as their younger counterparts, but our results have in some cases been disappointing. This study was undertaken to review the results of liver transplants in the elderly in a large single-center setting. A secondary goal was to define, if possible, factors that could help the clinician in the prudent allocation of the donor liver.

Methods: A retrospective review of a prospectively maintained single-institution database of 1,446 consecutive liver transplant recipients was conducted. The 241 elderly patients (older than 60 years) were compared with their younger counterparts by preoperative laboratory values, illness severity, nutritional status, and donor age. Survival data were stratified and logistic regression analyses were conducted.

Results: Elderly patients with better-preserved hepatic synthetic function or with lower pretransplant serum bilirubin levels fared as well as younger patients. Elderly patients who had poor hepatic synthetic function or high bilirubin levels or who were admitted to the hospital had much lower survival rates than the sicker younger patients or the less-ill older patients. Recipient age 60 years or older, pretransplant hospital admission, and high bilirubin level were independent risk factors for poorer outcome.

Conclusions: Low-risk elderly patients fare as well as younger patients after liver transplantation. However, unless results can be improved, high-risk patients older than 60 years should probably not undergo liver transplantation.

Collaboration


Dive into Linda W. Jennings's collaborations. Top co-authors:

Goran B. Klintmalm, Baylor University Medical Center
Marlon F. Levy, Baylor University Medical Center
Edmund Q. Sanchez, Baylor University Medical Center
Thomas A. Gonwa, Baylor University Medical Center
Nicholas Onaca, Baylor University Medical Center