B.F. Meyers
Washington University in St. Louis
Publication
Featured research published by B.F. Meyers.
Transplantation | 2007
Ankit Bharat; Kishore Narayanan; Tyler Street; Ryan C. Fields; Nancy Steward; Aviva Aloush; B.F. Meyers; Richard B. Schuessler; Elbert P. Trulock; G. Alexander Patterson; Thalachallour Mohanakumar
Background. Chronic human lung allograft rejection, represented by bronchiolitis obliterans syndrome (BOS), is the single most important factor that limits long-term survival following lung transplantation (LT). However, the pathogenesis of BOS remains unclear. We hypothesized that early posttransplant inflammation would promote the development of donor anti–human leukocyte antigen (HLA) alloimmunity and predispose to BOS. Methods. Serum levels of interleukin (IL)-1β, IL-2, IL-4, IL-5, IL-6, IL-7, IL-8, IL-10, IL-12, IL-13, IL-15, IL-17, Eotaxin, IP-10, MIG, MCP-1, MIP-1α, MIP-1β, RANTES, tumor necrosis factor (TNF)-α, interferon (IFN)-α, IFN-γ, granulocyte-macrophage colony-stimulating factor, IL-1Rα, and IL-2R were serially analyzed in 31 BOS+ and 31 matched BOS− patients using quantitative multiplex bead immunoassays. Donor-specific HLA class II cellular immunity was analyzed using enzyme-linked immunospot (ELISPOT) by testing recipient peripheral blood mononuclear cells against mismatched donor HLA-DR peptides. Anti-HLA class II antibodies were monitored using flow panel reactive antibodies. Results. There was early posttransplant elevation in basal serum levels of the proinflammatory chemokines IP-10 and MCP-1 and the Th1 cytokines IL-1β, IL-2, IL-12, and IL-15 in BOS+ patients, compared to BOS− and normal subjects. In addition, a threefold decline in IL-10 levels was found during BOS development. BOS+ patients revealed increased development of HLA class II alloantibodies and Th1-predominant donor-specific cellular immunity, with a high frequency of IFN-γ-producing and a low frequency of IL-5-producing T cells. Conclusion. Early posttransplant elevation of proinflammatory mediators is associated with alloimmunity and chronic human lung allograft rejection.
The Annals of Thoracic Surgery | 2013
Michael S. Kent; Rodney J. Landreneau; Sumithra J. Mandrekar; Shauna L. Hillman; Francis C. Nichols; David R. Jones; Sandra L. Starnes; A.D. Tan; Joe B. Putnam; B.F. Meyers; Benedict Daly; Hiran C. Fernando
BACKGROUND Patients with early-stage lung cancer and limited pulmonary reserve may not be appropriate candidates for lobectomy. In these situations, sublobar resection (wedge or segmentectomy) is generally performed. Many physicians believe that segmentectomy is superior because it allows for an improved parenchymal margin and nodal sampling. METHODS We performed an analysis using operative and pathology reports collected as part of planned data collection for American College of Surgeons Oncology Group (ACOSOG) Z4032. This was a prospective trial in which patients with clinical stage I lung cancer and limited pulmonary function were randomized to sublobar resection with or without brachytherapy. The operative approach (video-assisted thoracic surgery [VATS] vs thoracotomy), extent of resection, and degree of lymph node evaluation were at the discretion of the individual surgeon. The primary aim of this analysis was to compare the parenchymal margin achieved between segmentectomy and wedge resection. Secondary aims included the extent of nodal staging and whether the operative approach (VATS vs open) had an effect on margin status and nodal evaluation. RESULTS Among 210 patients, 135 (64%) underwent a VATS approach and 75 (36%) a thoracotomy. A segmentectomy was performed in 57 patients (27%) and a wedge resection in 153 patients (73%). There were no significant differences in the degree of nodal upstaging, stations sampled, or parenchymal margin obtained between VATS and thoracotomy. However, significant differences were observed between patients who underwent a segmentectomy and those who underwent a wedge resection with regard to parenchymal margin (1.5 cm vs 0.8 cm, p = 0.0001), nodal upstaging (9% vs 1%, p = 0.006), and nodal stations sampled (3 vs 1, p < 0.0001). Notably, 41% of patients treated by wedge resection had no nodes sampled at the time of operation, compared with 2% of those who underwent segmentectomy (p < 0.0001). CONCLUSIONS In ACOSOG Z4032, wedge resection, regardless of the approach, was associated with a smaller parenchymal margin, a lower yield of lymph nodes, and a lower rate of nodal upstaging compared with segmentectomy.
Annals of Oncology | 2014
A.C. Lockhart; Carolyn E. Reed; Paul A. Decker; B.F. Meyers; Mark K. Ferguson; A. R. Oeltjen; Joe B. Putnam; Stephen D. Cassivi; A. J. Montero; Tracey E. Schefter
BACKGROUND Preoperative chemoradiotherapy (CRT) improves outcomes in patients with locally advanced but resectable adenocarcinoma of the esophagus. ACOSOG Z4051 evaluated CRT with docetaxel, cisplatin, and panitumumab (DCP) in this patient group with a primary end point of a pathologic complete response (pCR) rate ≥35%. PATIENTS AND METHODS From 15 January 2009 to 22 July 2011, 70 patients with locally advanced but resectable distal esophageal adenocarcinoma were enrolled. Patients received docetaxel (40 mg/m²), cisplatin (40 mg/m²), and panitumumab (6 mg/kg) on weeks 1, 3, 5, 7, and 9, with RT (5040 cGy, 180 cGy/day × 28 days) beginning week 5. Resection was planned after completing CRT. pCR was defined as no viable residual tumor cells. Secondary objectives included near-pCR (≤10% viable cancer cells), toxicity, and overall and disease-free survival. Adverse events were graded using CTCAE Version 3.0. RESULTS Five of 70 patients were ineligible. Of 65 eligible patients (59 male; median age 61), 11 did not undergo surgery, leaving 54 assessable. The pCR rate was 33.3% and the near-pCR rate was 20.4%. Seventy-three percent of patients completed DCP (n = 70) and 92% completed RT. Toxicity of grade 4 or higher occurred in 48.5%, with lymphopenia (43%) the most common. Operative mortality was 3.7%. Adult respiratory distress syndrome was encountered in two patients (3.7%). At a median follow-up of 26.3 months, median overall survival was 19.4 months and 3-year overall survival was 38.6% (95% confidence interval 24.5% to 60.8%). CONCLUSIONS Neoadjuvant CRT with DCP is active (pCR + near-pCR = 53.7%), but toxicity is significant. Further evaluation of this regimen in an unselected population is not recommended. ClinicalTrials.gov identifier: NCT00757172.
Transplantation | 2006
Ryan C. Fields; Ankit Bharat; Nancy Steward; Aviva Aloush; B.F. Meyers; Elbert P. Trulock; William C. Chapman; G. Alexander Patterson; Thalachallour Mohanakumar
Background. The long-term function of lung transplants is limited by chronic rejection (bronchiolitis obliterans syndrome, BOS). Due to the lack of specific markers, BOS is diagnosed clinically. Because there is strong evidence that alloimmunity plays a significant role in the pathogenesis of BOS, we investigated whether soluble CD30 (sCD30), a T-cell activation marker, would correlate with BOS. Methods. Sera collected serially from BOS+ (n=20) and matched BOS− (n=20) lung transplant (LT) patients were analyzed for sCD30 by enzyme-linked immunosorbent assay. Pretransplant sera and sera from normal donors were also analyzed. Results. Pre-LT sCD30 levels were comparable to those of normal subjects. However, posttransplant there was a significant elevation in sCD30 levels during BOS development in all BOS+ patients compared to BOS− patients (mean 139.8±10.7 vs. 14.8±2.7 U/ml, P<0.001). sCD30 levels later declined in the BOS+ patients but remained elevated compared to BOS− patients (48.52±5.04 vs. 7.19±2.9 U/ml, P<0.0001). Conclusions. We conclude that sCD30 may represent a novel marker to monitor the development of BOS.
Journal of Heart and Lung Transplantation | 2010
Deepti Saini; Nataraju Angaswamy; Venkataswarup Tiriveedhi; Naohiko Fukami; Ramsey Hachem; Elbert P. Trulock; B.F. Meyers; Alexander Patterson; Thalachallour Mohanakumar
BACKGROUND This study aims to determine the role of antibodies to donor-mismatched human leukocyte antigen (HLA) developed during the post-transplant period in inducing defensins, and their synergistic role in the pathogenesis of chronic rejection, bronchiolitis obliterans syndrome (BOS), after human lung transplantation (LTx). METHODS Bronchoalveolar lavage (BAL) and serum from 21 BOS+ LTx patients were assayed for β-defensin and the human neutrophil peptides (HNP) 1-3 (enzyme-linked immunosorbent assay [ELISA]), and for anti-HLA antibodies (Luminex, Luminex Corp, Austin, TX). Human airway epithelial cells (AEC) were treated with anti-HLA antibodies, HNP-1/2, or both, and the levels of β-defensin were measured by ELISA. Using a mouse model of obliterative airway disease induced by anti-major histocompatibility complex (MHC) class I antibodies, we quantitatively and qualitatively determined neutrophil infiltration by myeloperoxidase (MPO) staining and activity by MPO assay, and defensin levels in the BAL. RESULTS In human LTx patients, higher defensin levels correlated with the presence of circulating anti-HLA antibodies (p < 0.05). AEC treated with anti-HLA antibodies or HNP-1/2 produced β-defensin, with synergistic effects in combination (612 ± 6 pg/ml for the combination vs 520 ± 23 pg/ml for anti-HLA antibody or 590 ± 10 pg/ml for HNP treatment alone; p < 0.05). Neutrophil numbers (6-fold) and activity (5.5-fold) were higher in the lungs of mice treated with anti-MHC antibodies vs control. A 2-fold increase in α-defensin and β-defensin levels was also present in BAL on Day 5 after anti-MHC administration. CONCLUSIONS Anti-HLA antibodies developed during the post-transplant period and α-defensins stimulated β-defensin production by epithelial cells, leading to increased cellular infiltration and inflammation. Chronic stimulation of epithelium by antibodies to MHC, and the resulting increased levels of defensins, induce growth factor production and epithelial proliferation, contributing to the development of chronic rejection after LTx.
Journal of Heart and Lung Transplantation | 2003
Stuart C. Sweet; M.T. de la Morena; P.M Schuler; G.A. Patterson; B.F. Meyers; D. Schuller; Charles B. Huddleston; Eric N. Mendeloff
Living donor lobar transplant (LDLT) has been used to address the ongoing shortage of cadaveric lungs and the unpredictable course of end-stage lung disease. Because LDLT involves putting two healthy patients at risk, we compared outcomes between cadaveric lung transplant and LDLT. From July 1994 to May 2002, 38 LDLTs were performed at this center. A cohort of cadaveric recipients, also transplanted here during that era and matched by age at transplant, sex, and underlying disease, was selected to serve as controls. LDLT recipients were less stable at the time of transplant: more patients (12/38) in the LDLT cohort were mechanically ventilated, compared to 2/38 in the controls. The average ischemic times were significantly shorter in the LDLT patients (1:32) compared to controls (4:46). One- and 3-year survival for the LDLT recipients was 60% and 48%, compared to 89% and 58% for the control cohort. The difference in Kaplan-Meier survival curves was statistically significant (P=0.03). There was no difference in the incidence of acute rejection between the LDLT and control cohorts. Post-transplant FEV1 and FVC were not significantly different. However, freedom from bronchiolitis obliterans syndrome (BOS) at 1 and 3 years was 92% and 85%, compared to 75% and 53% in the cadaveric cohort (P=0.03). Not surprisingly, the causes of death in the LDLT population reflected this difference. Only 2/24 (8%) of deaths in the LDLT cohort were due to BOS, compared to 8/18 (45%) of deaths in the control cohort. In contrast, 11/24 (46%) of deaths in the LDLT cohort were related to infection, compared to 2/18 (11%) in the control population. Based on this comparison, we conclude that LDLT can be performed successfully in pediatric patients. Although the decreased incidence of BOS makes us optimistic about the long-term survival of LDLT recipients, efforts focusing on minimizing infectious complications are necessary to improve overall survival.
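The survival comparison in this abstract uses Kaplan-Meier curves. As a minimal sketch of the underlying product-limit estimator (the event times and censoring flags below are illustrative, not the study's data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.

    times  : follow-up time for each subject
    events : 1 if the subject died at that time, 0 if censored
    Returns a list of (time, survival probability) at each death time.
    """
    at_risk = len(times)
    survival = 1.0
    curve = []
    # Walk through distinct follow-up times in order; censored subjects
    # leave the risk set without contributing a death.
    for t in sorted(set(times)):
        deaths = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        if deaths:
            survival *= 1 - deaths / at_risk
            curve.append((t, survival))
        at_risk -= sum(1 for ti in times if ti == t)
    return curve

# Illustrative cohort of 5 subjects: deaths at t=1, 2, 3; censoring at t=2, 4
curve = kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0])
```

The significance test the abstract reports (P=0.03) would come from a separate log-rank comparison of two such curves, which is not shown here.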
Journal of Heart and Lung Transplantation | 2002
K.C Stewart; Tracey J. Guthrie; G Richardson; B.F. Meyers; John P. Lynch; G.A. Patterson; Elbert P. Trulock
after transplantation; thus, many immunosuppressants are directed towards the expression of IL-2 mRNA and, consequently, the synthesis of the protein. A more direct approach is inhibition of the IL-2 receptor (IL-2R, CD25). Following the successful use of basiliximab (Simulect), an anti-CD25 chimeric monoclonal antibody, in renal transplantation, we adopted this approach and administered Simulect to 47 consecutive lung graft recipients. A group of 47 consecutive recipients who underwent surgery prior to the 47 Simulect patients served as controls. In the Simulect group (SIM), 40 patients (pts) received a bilateral (DLTx), 3 a single (SLTx), and 4 a heart-lung transplant (HLTx), whereas in the control group (Ctrl) 33 DLTx, 12 SLTx (p < 0.05), and 2 HLTx were performed. The mean recipient age was 41.8 yrs (13-64; 26 males, 21 females) in the SIM group and 42.7 yrs (15-65; 28 males, 19 females) in the Ctrl group. Indications for transplant in the SIM group were emphysema in 26%, IPF in 23%, CF in 15%, PPH in 9%, and others in 28% of cases, compared to 34%, 32%, 8%, 4%, and 21%, respectively, in the Ctrl group. The 30-day mortality was 4.3% in the SIM group and 14.9% in the Ctrl group (p < 0.01). The 90-day mortality (n = 42 pts/group) was 8.5% in the SIM and 23.4% in the Ctrl cohort (p < 0.001). In 10 pts (21%) in the SIM group and in 16 pts (35%) in the Ctrl group (p < 0.05), at least one bolus of methylprednisolone for the treatment of acute rejection episodes was administered within the first month following transplantation. In 2 cases in the Ctrl cohort, therapy-refractory rejection episodes were fatal. Based on this initial experience with IL-2 receptor inhibition in lung recipients, we conclude that this approach may indeed be effective in decreasing the incidence of acute lung graft rejection episodes and, therefore, may help to further improve outcomes following lung transplantation.
Journal of Heart and Lung Transplantation | 2003
Christine L. Lau; Tracey J. Guthrie; C. Chaparro; Denis Hadjiliadis; Thomas K. Waddell; Shaf Keshavjee; R.C Fields; Mark Yeatman; Scott M. Palmer; R.D. Davis; Joel D. Cooper; Elbert P. Trulock; G.A. Patterson; B.F. Meyers
of 100%, a Sp of 45%, and a PPV and NPV of 52% and 100%, respectively. If we then required that patients have at least 2 measurements (3 to 6 weeks apart) of eNO ≥15 ppb during the 3 months preceding the diagnosis of BOS, only 4/11 CR-negative patients fulfilled that criterion, resulting in an improvement of Sp (80%) and PPV (73%), however with some loss of Se (92%) and NPV (94%) due to one false-negative result. In conclusion: the accuracy of eNO measurements for the diagnosis of chronic rejection after lung transplantation is high when patients have 2 values of eNO ≥15 ppb with 3-6 weeks between the measurements.
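The accuracy figures above (Se, Sp, PPV, NPV) all derive from a standard 2×2 confusion table. A minimal sketch, using illustrative counts chosen only so they reproduce the reported first-criterion percentages (the study's actual table is not given in this abstract):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV from 2x2 table counts."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among diseased
        "specificity": tn / (tn + fp),  # true negatives among non-diseased
        "ppv": tp / (tp + fp),          # positive tests that are truly diseased
        "npv": tn / (tn + fn),          # negative tests that are truly disease-free
    }

# Illustrative counts: every diseased patient tests positive (fn = 0),
# giving Se = 100% and NPV = 100%, with Sp 45% and PPV ~52% as reported.
m = diagnostic_metrics(tp=12, fp=11, fn=0, tn=9)
```

This makes the trade-off in the abstract concrete: tightening the eNO criterion moves borderline positives into the negative column, raising Sp and PPV at the cost of Se and NPV once a true case is missed.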
Journal of Heart and Lung Transplantation | 2007
Ramsey Hachem; B.F. Meyers; Roger D. Yusen; G.A. Patterson; Elbert P. Trulock
Journal of Heart and Lung Transplantation | 2008
Ramsey Hachem; B.F. Meyers; Roger D. Yusen; A. Patterson; Elbert P. Trulock