Publications


Featured research published by John P. McVicar.


Clinical Pharmacology & Therapeutics | 1996

First-pass metabolism of midazolam by the human intestine

Mary F. Paine; Danny D. Shen; Kent L. Kunze; James D. Perkins; Christopher L. Marsh; John P. McVicar; Darlene Barr; Bruce S. Gillies; Kenneth E. Thummel

The in vivo intestinal metabolism of the CYP3A probe midazolam to its principal metabolite, 1′‐hydroxymidazolam, was investigated during surgery in 10 liver transplant recipients. After removal of the diseased liver, five subjects received 2 mg midazolam intraduodenally, and the other five received 1 mg midazolam intravenously. Simultaneous arterial and hepatic portal venous blood samples were collected during the anhepatic phase; collection of arterial samples continued after reperfusion of the donor liver. Midazolam, 1′‐hydroxymidazolam, and 1′‐hydroxymidazolam glucuronide were measured in plasma. A mass balance approach that considered the net change in midazolam (intravenously) or midazolam and 1′‐hydroxymidazolam (intraduodenally) concentrations across the splanchnic vascular bed during the anhepatic phase was used to quantitate the intestinal extraction of midazolam after each route of administration. For the intraduodenal group, the mean fraction of the absorbed midazolam dose that was metabolized on transit through the intestinal mucosa was 0.43 ± 0.18. For the intravenous group, the mean fraction of midazolam extracted from arterial blood and metabolized during each passage through the splanchnic vascular bed was 0.08 ± 0.11. Although there was significant intersubject variability, the mean intravenous and intraduodenal extraction fractions were statistically different (p = 0.009). Collectively, these results show that the small intestine contributes significantly to the first‐pass oxidative metabolism of midazolam catalyzed by mucosal CYP3A4 and suggest that significant first‐pass metabolism may be a general phenomenon for all high‐turnover CYP3A4 substrates.
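
To make the mass-balance idea concrete, one illustrative formulation (generic notation, not necessarily the exact equations used by the authors) exploits the anhepatic phase: with the liver out of the circulation, any arteriovenous difference across the splanchnic bed reflects intestinal uptake and metabolism. For the intravenous route, the splanchnic extraction fraction of midazolam (MDZ) can be framed as

\[
E_{\mathrm{splanchnic}} \approx \frac{C_{\mathrm{MDZ,\,arterial}} - C_{\mathrm{MDZ,\,portal}}}{C_{\mathrm{MDZ,\,arterial}}},
\]

and for the intraduodenal route, the fraction of the absorbed dose metabolized on transit through the mucosa can be framed from the portal-arterial increments of parent drug and metabolite (in molar equivalents):

\[
f_{\mathrm{gut}} \approx \frac{\Delta C_{1'\text{-OH}}}{\Delta C_{\mathrm{MDZ}} + \Delta C_{1'\text{-OH}}},
\qquad \Delta C_{x} = C_{x,\,\mathrm{portal}} - C_{x,\,\mathrm{arterial}}.
\]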


Gastrointestinal Endoscopy | 1998

Endoscopic management of biliary strictures in liver transplant recipients: Effect on patient and graft survival

Rafat Rizk; John P. McVicar; Mary J. Emond; Charles A. Rohrmann; Kris V. Kowdley; James D. Perkins; Robert L. Carithers; Michael B. Kimmey

BACKGROUND Biliary strictures in liver transplant recipients cause significant morbidity and can lead to reduced patient and graft survival. METHODS Of 251 liver transplant recipients, 22 patients with biliary strictures were categorized into two groups: donor hepatic duct (n = 12) or anastomotic (n = 10). Strictures were dilated and stented. Endoscopic therapy was considered successful if a patient did not require repeat stenting or dilation for 1 year. RESULTS Patient and graft survival did not differ significantly in the 22 patients compared with patients without strictures (relative risk of death and graft survival 1.8 and 1.3). Donor hepatic duct strictures required significantly longer therapy than anastomotic strictures (median days 185 versus 67, p = 0.02). Twenty-two months after the first endoscopic treatment, 73% of the donor hepatic duct stricture group were stent free compared with 90% of the anastomotic group (p = 0.02). The former group had significantly more (p < 0.05) hepatic artery thrombosis (58.3% versus 10%), cholangitis (58.3% versus 30%), choledocholithiasis (91% versus 10%), and endoscopic interventions. No patient undergoing endoscopic treatment required retransplantation or biliary reconstruction during a median follow-up of 35.7 months. CONCLUSION Endoscopic therapy of biliary strictures after liver transplantation is effective and is not accompanied by reduced patient or graft survival.


Liver Transplantation | 2007

De novo nonalcoholic fatty liver disease after liver transplantation

Suk Seo; Kalyani Maganti; Manjit Khehra; Rajendra Ramsamooj; Alex Tsodikov; Christopher L. Bowlus; John P. McVicar; Mark A. Zern; Natalie J. Török

Hepatic steatosis is a recognized problem in patients after orthotopic liver transplant (OLT). However, de novo development of nonalcoholic fatty liver disease (NAFLD) has not been well described. The aim of this study was to determine the prevalence and predictors of de novo NAFLD after OLT. A retrospective analysis was performed on 68 OLT patients with donor liver biopsies and posttransplantation liver biopsies. Individual medical charts were reviewed for demographics, indication for OLT, serial histology reports, genotypes for hepatitis C, comorbid conditions, and medications. Liver biopsies were reviewed blindly and graded according to the Brunt Scoring System. Multivariate logistic regression analysis was used to study the risk factors for developing NAFLD. The interval from OLT to the follow-up liver biopsy was 28 ± 18 months. A total of 12 patients (18%) developed de novo NAFLD, and 6 (9%) developed de novo nonalcoholic steatohepatitis (NASH). The regression model indicated that the use of angiotensin-converting enzyme inhibitors (ACE-I) was associated with a reduced risk of developing NAFLD after OLT (odds ratio, 0.09; 95% confidence interval, 0.010-0.92; P = 0.042). An increase in body mass index (BMI) of greater than 10% after OLT was associated with a higher risk of developing NAFLD (odds ratio, 19.38; 95% confidence interval, 3.50-107.40; P = 0.001). In conclusion, de novo NAFLD is common in the post-OLT setting, with a significant association with weight gain after transplant. The use of an ACE-I may reduce the risk of developing post-OLT NAFLD.
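
For readers less familiar with how such odds ratios are reported, the multivariate logistic model described above relates each covariate's fitted coefficient to an odds ratio and confidence interval roughly as follows (generic notation, not taken from the paper):

\[
\log \frac{p(\mathrm{NAFLD})}{1 - p(\mathrm{NAFLD})}
= \beta_0 + \beta_1 \,\mathbf{1}[\text{ACE-I use}] + \beta_2 \,\mathbf{1}[\Delta \mathrm{BMI} > 10\%] + \cdots,
\qquad
\mathrm{OR}_i = e^{\beta_i},\;
95\%\ \mathrm{CI} = e^{\beta_i \pm 1.96\,\mathrm{SE}(\beta_i)}.
\]

For example, the reported odds ratio of 0.09 for ACE-I use corresponds to a fitted coefficient of roughly ln(0.09) ≈ -2.4.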


Transplantation | 2003

Higher surgical wound complication rates with sirolimus immunosuppression after kidney transplantation: A matched-pair pilot study

Christoph Troppmann; Jonathan L. Pierce; Mehul M. Gandhi; Brian J. Gallay; John P. McVicar; Richard V. Perez

Sirolimus, a potent new immunosuppressant, has been anecdotally associated with surgical wound complications. We studied postoperative surgical wound complications in 15 kidney recipients receiving sirolimus, prednisone, and tacrolimus or cyclosporine (study group) compared with 15 recipients receiving tacrolimus, prednisone, and mycophenolate mofetil who were pair-matched for surgical wound complication risk factors. Surgical wound complications were defined as any complication related to the surgical transplant wound requiring reintervention. Fifty-three percent of the study group and 7% of the control group experienced more than one surgical wound complication (P =0.014), and the relaparotomy incidence was 33% and 7%, respectively. Four graft losses have occurred since the beginning of the study: one chronic rejection and two deaths with function in the study group, and one death with function in the control group. At 1 year, graft survival for study recipients compared with control recipients was 87% and 93%, respectively; patient survival was 93% in both groups. Recipients receiving sirolimus demonstrated a significantly higher surgical wound complication rate, but graft and patient survival were not affected. Peritransplant immunosuppression with sirolimus and steroids warrants careful consideration, particularly in recipients with surgical complication risk factors.
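
The abstract does not name the statistical test behind the P = 0.014 comparison, but a Fisher exact test on the implied 2 x 2 counts (8 of 15 study vs. 1 of 15 control recipients with the wound-complication outcome) yields a very similar P value. The sketch below is purely illustrative and assumes SciPy is available; it is not the authors' actual analysis.

```python
# Reconstruct the 2 x 2 table implied by the percentages in the abstract
# (53% of 15 = 8 study recipients, 7% of 15 = 1 control recipient) and run
# a Fisher exact test. Illustrative only; the paper does not state which test was used.
from scipy.stats import fisher_exact

table = [[8, 15 - 8],   # sirolimus (study) group: complication, no complication
         [1, 15 - 1]]   # control group

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.1f}, two-sided P = {p_value:.3f}")  # ~16.0, ~0.014
```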


Transplantation | 1997

Renal disease in hepatitis C-positive liver transplant recipients.

Elizabeth Kendrick; John P. McVicar; Kris V. Kowdley; Mary P. Bronner; Mary J. Emond; Charles E. Alpers; David R. Gretch; Robert L. Carithers; James D. Perkins; Connie L. Davis

Glomerular abnormalities are frequent in patients undergoing liver transplantation; however, renal dysfunction following transplantation is mainly attributed to cyclosporine toxicity. Membranoproliferative glomerulonephritis (MPGN) is seen in patients infected with hepatitis C virus (HCV), the virus responsible for 30% of the end-stage liver disease leading to liver transplantation. To determine the incidence of renal abnormalities in liver transplant recipients and the association with HCV, we undertook a longitudinal study in HCV-positive (n=91) and HCV-negative (n=106) liver transplant recipients. Mean creatinine clearance before transplantation was 94 ml/min/1.73 m2 in HCV+ patients and 88 ml/min/1.73 m2 in HCV- patients. By 3 months after transplantation, the mean creatinine clearance decreased by approximately one third in both groups. A greater proportion of HCV+ patients excreted >2 g protein/day after transplantation (P=0.05) and had renal biopsies showing MPGN than did HCV- recipients (4/10 HCV+ patients vs. 0/7 HCV- patients; P=0.1). In the HCV+ group, proteinuria was not associated with recurrent HCV hepatitis, DQ matching, posttransplant diabetes, or hypertension. Treatment of HCV-related MPGN with interferon-alpha2b appeared to stabilize proteinuria and renal function but did not reverse renal dysfunction or cause liver allograft rejection. Over 3 years after transplantation, HCV+ patients had renal function similar to that of HCV- patients, but they had an increased risk of proteinuria and of MPGN that was only partially responsive to interferon.


Transplantation | 2000

Pretransplant systemic inflammation and acute rejection after renal transplantation.

Richard V. Perez; David Brown; Steven Katznelson; Hans-Georg Müller; Tammy Chang; Steven M. Rudich; John P. McVicar; George A. Kaysen

BACKGROUND There are presently no established pretransplant tests that consistently identify patients who may be at increased risk for acute rejection episodes after renal transplantation. We studied whether pretransplant serum levels of C-reactive protein (CRP), a marker for the presence of systemic inflammation, would predict the occurrence of acute rejection episodes after renal transplantation. METHODS Pretransplant serum was tested for CRP level in 97 consecutive renal transplant recipients. Time to acute rejection after transplantation was stratified by CRP level and compared using the Kaplan-Meier method. In addition, Cox regression multivariate analysis was performed to assess whether any pretransplant covariates could independently predict the subsequent occurrence of acute rejection episodes. RESULTS Pretransplant mean CRP levels were higher in patients who subsequently had a rejection episode versus those who had no rejection (22.2+/-2.9 vs. 11.7+/-1.8 microg/ml, respectively, P=0.003). Patients with CRP levels below the median had a significantly longer time to rejection compared with those with higher CRP levels (P=0.002). Similarly, patients within the lowest CRP quartile had longer times to rejection when compared with the highest quartile (P=0.006). Cox proportional hazards regression multivariate analysis identified CRP level as the only independent pretransplant risk factor for rejection (P=0.044). CONCLUSIONS Pretransplant systemic inflammation as manifested by elevated serum CRP level independently predicts the risk of acute rejection after renal transplantation and may be useful in stratifying patients at the time of transplantation according to immunological risk. Thus, assessment of pretransplant systemic inflammatory status may be helpful in prospective individualization of immunosuppression therapy after renal transplantation.
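
As a rough illustration of the analysis strategy described above (a minimal sketch with hypothetical column names and data, assuming the lifelines library; it is not the authors' code), time to rejection can be compared across CRP strata with the Kaplan-Meier method and modeled with Cox regression as follows:

```python
# Illustrative sketch of a CRP-stratified survival analysis; column names,
# covariates, and the input file are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

# One row per recipient: pretransplant CRP (microg/ml), days to first acute
# rejection or last follow-up, and an event indicator (1 = rejection, 0 = censored).
df = pd.read_csv("recipients.csv")

low = df[df["crp"] <= df["crp"].median()]
high = df[df["crp"] > df["crp"].median()]

# Kaplan-Meier curves for time to rejection, stratified at the median CRP.
kmf = KaplanMeierFitter()
kmf.fit(low["days_to_rejection"], event_observed=low["rejected"], label="CRP <= median")
ax = kmf.plot_survival_function()
kmf.fit(high["days_to_rejection"], event_observed=high["rejected"], label="CRP > median")
kmf.plot_survival_function(ax=ax)

# Log-rank comparison of the two strata.
lr = logrank_test(low["days_to_rejection"], high["days_to_rejection"],
                  event_observed_A=low["rejected"], event_observed_B=high["rejected"])
print("log-rank P =", lr.p_value)

# Multivariate Cox model: is CRP an independent pretransplant predictor of rejection?
cph = CoxPHFitter()
cph.fit(df[["days_to_rejection", "rejected", "crp", "recipient_age", "hla_mismatches"]],
        duration_col="days_to_rejection", event_col="rejected")
cph.print_summary()  # hazard ratios with confidence intervals and P values
```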


The Journal of Urology | 1995

Use of Ultrasound and Cystoscopically Guided Pancreatic Allograft Biopsies and Transabdominal Renal Allograft Biopsies: Safety and Efficacy in Kidney-Pancreas Transplant Recipients

Christian S. Kuhr; Connie L. Davis; Darlene Barr; John P. McVicar; James D. Perkins; Carlos E. Bachi; Charles E. Alpers; Christopher L. Marsh

The use of allograft biopsies to guide treatment after solid organ transplantation is a valuable tool in the detection and treatment of rejection. Prior development and use of the cystoscopically guided pancreatic allograft biopsy have allowed for more accurate and timely diagnosis of pancreatic allograft dysfunction, possibly contributing to our 1-year pancreas graft, renal allograft and patient survival rates of 87.1%, 88.5% and 96.8%, respectively. We reviewed our experience, examining efficacy and complication rates of pancreas and kidney biopsies in 31 cadaveric pancreas or combined kidney and pancreas transplants performed between June 1990 and February 1992 with at least 1 year of followup. There were 94 pancreas, 54 kidney and 53 duodenal mucosal biopsies in 29 evaluable patients. This biopsy technique uses a 24.5F side-viewing nephroscope to view the cystoduodenostomy, with the duodenum acting as a portal for biopsy needles into the pancreas. Pancreatic tissue is obtained with either an 18 gauge, 500 mm. Menghini aspiration/core needle or an 18 gauge, 500 mm. Roth core needle. Percutaneous renal allograft biopsies are performed independently or simultaneously with the pancreas biopsies using a 16 gauge spring loaded needle. Pancreas biopsies were prompted by clinical indications of rejection (decreased urinary amylase, increased serum amylase or increased serum creatinine) or by protocol (10, 21 and 40 days postoperatively). Among the biopsies 30% were required by protocol, of which 10 (36%) revealed abnormal pathological findings and 5 (18%) showed evidence of occult cellular rejection. Renal biopsies demonstrated rejection in 69% of the cases. Of simultaneous pancreas/kidney biopsies 33% revealed concomitant rejection. A total of 88 Menghini needles with 170 passes was used in 73 biopsy attempts, yielding 126 tissue cores with a 16% complication rate. A total of 41 Roth needles was used with 73 passes in 34 biopsy attempts, yielding 55 tissue cores with a complication rate of 21%. Complications included self-limited bleeding from the biopsy site in 13% of the cases, bleeding requiring clot evacuation and fulguration in 1% and asymptomatic hyperamylasemia in 12%. Renal biopsy complications included 1 arteriovenous fistula (2%). We conclude that ultrasound and cystoscopically guided pancreatic allograft biopsy and percutaneous renal allograft biopsies are safe and essential methods of obtaining tissue for histological diagnosis without serious sequelae. The Menghini and Roth needles in cystoscopically guided pancreatic allograft biopsy have similar yield and complication rates in obtaining pancreatic tissue, although they require different performance techniques. In some cases both needles are necessary and are complementary in obtaining adequate tissue.(ABSTRACT TRUNCATED AT 400 WORDS)


Transplantation | 2010

The transition from laparoscopic to retroperitoneoscopic live donor nephrectomy: a matched pair pilot study.

Christoph Troppmann; Michael F. Daily; John P. McVicar; Kathrin M. Troppmann; Richard V. Perez

Background. Retroperitoneoscopic live donor nephrectomy (RetroNeph) offers an intrinsic advantage over conventional transperitoneal laparoscopic nephrectomy (LapNeph) because of the potentially lower risk for early and late intraperitoneal donor complications. RetroNeph, however, is infrequently performed and has not been systematically and directly compared with LapNeph in nonselected donors. Methods. In November 2007, after 10 years of programmatic experience with transperitoneal LapNeph, we implemented RetroNeph at once for all live donor nephrectomies. Donor selection criteria, laparoscopic port positions, and hand-assistance mode were identical for RetroNeph and preceding LapNeph donors. We compared outcomes of retroperitoneoscopically completed cases with those of previous transperitoneal LapNeph cases that were pair matched for donor sex, body mass index, and donor kidney laterality. Results. Of the first 52 donor nephrectomies (48 left, 4 right) consecutively started with the intent to perform a RetroNeph (November 2007 to April 2009), 45 (87%) were completed retroperitoneoscopically, and seven (13%) were switched intraoperatively to transperitoneal LapNeph. We observed no conversions to open nephrectomy, donor blood transfusions, readmissions, or reoperations. Matched-pair analysis of the 45 RetroNeph versus 45 LapNeph cases showed no significant differences for warm ischemia time and other donor outcomes, delayed graft function rates, recipient creatinine at 1 week, and 1-year graft survival. Conclusions. Implementation of a RetroNeph program had no adverse impact on donor morbidity and quality of early graft function. Our pilot experience suggests that the RetroNeph learning curve is short. Given the potential advantages of an extraperitoneal approach for the donor, RetroNeph is an attractive alternative to LapNeph, particularly for surgeons with previous LapNeph experience.


Transplantation | 2004

Pretransplant recipient cytomegalovirus seropositivity and hemodialysis are associated with decreased renal allograft and patient survival

Jason T. Fitzgerald; Brian J. Gallay; Sarah E. Taranto; John P. McVicar; Christoph Troppmann; Xiaowu Chen; Matthew McIntosh; Richard V. Perez

Background. Pretransplant systemic inflammation has been associated with decreased renal allograft survival, and infectious agents such as cytomegalovirus (CMV) may play a role. We hypothesized that pretransplant CMV seropositivity is a risk factor for decreased patient and allograft survival after cadaveric renal transplantation and that other factors believed to modulate systemic inflammation, such as dialysis modality, might act synergistically with CMV to decrease patient and allograft survival. Methods. The United Network for Organ Sharing database was reviewed to identify all patients undergoing cadaveric renal transplantation in the United States from 1988 to 1997. Outcomes for CMV seropositive and seronegative recipients of organs from CMV seronegative donors were analyzed. Subgroup analysis was performed to identify any synergistic influence on outcome between CMV serostatus and known determinants of risk, including degree of human leukocyte antigen mismatch, pretransplant dialysis, and cold ischemia time. Results. Of 29,875 patients who underwent transplantation, 12,239 were CMV seronegative and 17,636 were CMV seropositive. Patient survival was decreased by pretransplant seropositivity (relative risk [RR] 1.11, P=0.001). In addition, this group demonstrated worse overall allograft survival (RR 1.05, P=0.029), although this adverse effect disappeared when patients who died with a functioning graft were censored. Decreased allograft survival was most pronounced in patients who were on hemodialysis before transplantation (RR 1.62, P=0.004). Conclusions. Pretransplant CMV seropositivity is associated with decreased patient survival. Pretransplant CMV seropositivity and hemodialysis have a synergistic adverse effect on graft survival, independent of patient mortality. Additional studies are required to define mechanisms by which pretransplant CMV infection and dialysis modality may contribute to decreased allograft survival.


American Journal of Transplantation | 2004

Impact of portal venous pancreas graft drainage on kidney graft outcome in simultaneous pancreas-kidney recipients reported to UNOS.

Christoph Troppmann; David W. Gjertson; J. Michael Cecka; John P. McVicar; Richard V. Perez

Clinical data on the potential immunologic impact of portal (PD) vs. systemic (SD) venous pancreas graft drainage on outcome remains controversial.

Collaboration


Top co-authors of John P. McVicar.

Darlene Barr

University of Washington

Kris V. Kowdley

Virginia Mason Medical Center
