John V. Conte
Johns Hopkins University School of Medicine
Publications
Featured research published by John V. Conte.
Journal of Heart and Lung Transplantation | 2011
Margaret M. Hannan; Shahid Husain; F. Mattner; Lara Danziger-Isakov; Richard J. Drew; G. Ralph Corey; Stephan Schueler; William L. Holman; Leo P. Lawler; Steve M. Gordon; Niall Mahon; John M. Herre; Kate Gould; Jose G. Montoya; Robert F. Padera; Robert L. Kormos; John V. Conte; Martha L. Mooney
In 2009, the International Society for Heart and Lung Transplantation (ISHLT) recognized the importance of infection-related morbidity and mortality in patients using ventricular assist devices (VADs) and the growing need for a consensus-based expert opinion to provide standard definitions of infections in these patients. The aim of these standard definitions is to improve clinical-investigator communication, allowing meaningful comparison in practice and outcomes between different centers and different VAD devices. In 2010, a core group of experts, including infectious diseases specialists, cardiologists, pathologists, radiologists, and cardiothoracic surgeons, formed an ISHLT Infectious Diseases Working Group to develop agreed criteria for definitions of infections in VAD patients. These definitions have been created by adapting and expanding on existing standardized definitions, which are based on the pathophysiology of equivalent infectious processes in prosthetic devices, such as cardiac prosthetic valve infections, intravascular catheter-related infections, and prosthetic joint infections. These definitions have been divided into 3 sections: VAD-specific infections, VAD-related infections, and non-VAD infections. Owing to the constant shortage of donor organs, new allocation systems, and improved medical therapies for congestive cardiac failure, the overwhelming trend in cardiac transplantation has been toward listing principally the most critically ill patients, that is, those requiring inpatient inotropic therapy or mechanical circulatory support (MCS). The ventricular assist device (VAD) has an expanding role in the management of these patients, both as a bridge to transplantation and as a destination therapy (ie, alternative to transplantation). According to United Network for Organ Sharing (UNOS) registry data, 9,000 transplant candidates have undergone MCS since 1999, comprising 33% of all listed patients and 75% of all listed inpatients.1
American Journal of Transplantation | 2005
E. R. Rodriguez; Diane V. Skojec; Carmela D. Tan; Andrea A. Zachary; Edward K. Kasper; John V. Conte; William M. Baldwin
Antibody-mediated rejection (AMR) in human heart transplantation is an immunopathologic process in which injury to the graft results in part from activation of complement, and it responds poorly to conventional therapy. We evaluated, by immunofluorescence (IF), 665 consecutive endomyocardial biopsies from 165 patients for deposits of immunoglobulins and complement. Diffuse IF deposits in a linear capillary pattern greater than 2+ were considered significant. Clinical evidence of graft dysfunction was correlated with complement deposits. IF of 2+ or higher was positive for IgG, 66%; IgM, 12%; IgA, 0.6%; C1q, 1.8%; C4d, 9%; and C3d, 10%. In 3% of patients, concomitant C4d and C3d correlated with graft dysfunction or heart failure. In these 5 patients, AMR occurred 56–163 months after transplantation, and they responded well to therapy for AMR but not to treatment with steroids. Systematic evaluation of endomyocardial biopsies is not improved by the use of antibodies for immunoglobulins or C1q. Concomitant use of C4d and C3d, when correlated with clinical parameters of graft function, is very useful for diagnosing AMR. AMR in heart transplant patients can occur many months or years after transplant.
Journal of Heart and Lung Transplantation | 2001
Malcolm V. Brock; Marvin C Borja; Lawrence Ferber; Jonathan B. Orens; Roberto A Anzcek; Jerry A. Krishnan; Steven C Yang; John V. Conte
BACKGROUND: Because acute rejection is associated with inferior outcomes in lung transplantation, we have routinely employed OKT3, anti-thymocyte globulin (ATG), or daclizumab as adjuncts to reduce rejection.

METHODS: We performed a 4-year prospective, controlled clinical trial of these 3 therapies to determine differences in post-operative infection, rejection, survival, and bronchiolitis obliterans syndrome (BOS). Eighty-seven consecutive lung transplant patients received OKT3 (n = 30), ATG (n = 34), or daclizumab (n = 23) as induction agents. The groups had similar demographics and immunosuppression protocols, differing only in the induction agents used.

RESULTS: No differences were observed in immediate post-operative outcomes such as length of hospitalization, ICU stay, or time on ventilators. Twelve months post-transplant, OKT3 had more infections per patient than the other agents, a difference that became significant only 2 months post-operatively (p = 0.009). The most common infection was bacterial, and OKT3 had more bacterial infections than any other agent. Daclizumab had more patients remain infection-free in the first year (p = 0.02), with no fungal infections and a low rate of viral infections. No patient receiving daclizumab developed drug-specific side effects. Only those patients with episodes of acute rejection developed BOS. There were no significant differences in freedom from acute rejection or BOS between the groups. The 2-year survival for the entire cohort was 68%, with no differences observed in patient survival.

CONCLUSIONS: This study again reveals the importance of acute rejection in the subsequent development of BOS. Although daclizumab offers a low risk of post-transplant infection and drug-specific side effects, no drug is superior in delaying rejection or BOS or in prolonging long-term survival.
Journal of Heart and Lung Transplantation | 2009
Christian A. Merlo; Eric S. Weiss; Jonathan B. Orens; Marvin C Borja; Marie Diener-West; John V. Conte; Ashish S. Shah
BACKGROUND: The Lung Allocation Score (LAS) dramatically changed organ allocation in lung transplantation. The impact of this change on patient outcomes is unknown. The purpose of this study was to examine early mortality after lung transplantation under the LAS system.

METHODS: All patients undergoing first-time lung transplantation from May 1, 2005 through April 30, 2008 were included in the study. The cohort was divided into quintiles by LAS. A high-risk group (LAS >46) comprised the highest quintile, Quintile 5, and a low-risk group (LAS ≤46) included the lower quintiles, Quintiles 1 through 4. A time-to-event analysis was performed for risk of death after transplantation using Kaplan-Meier survival and Cox proportional hazards models.

RESULTS: There were 4,346 patients who underwent lung transplantation during the study period. Patients in the high-risk group (LAS >46) were more likely to have idiopathic pulmonary fibrosis (IPF; 52.9% vs 23.8%, p < 0.001) and diabetes (25.8% vs 16.8%, p < 0.001) and to require mechanical ventilatory support (15.4% vs 2.2%, p < 0.001) at the time of transplant compared with patients in the low-risk group. One-year survival using the Kaplan-Meier product limit estimator was significantly worse in the high-risk group (75% vs 83%, p < 0.001 by log-rank test). Patients in the high-risk group also had an increased risk of death (hazard ratio 1.46, 95% confidence interval 1.24 to 1.73) compared with the low-risk group.

CONCLUSIONS: Overall 1-year survival under the new LAS system appears similar to that in historic reports. However, risk of death was significantly increased among patients with LAS >46.
Journal of Heart and Lung Transplantation | 2007
Eric S. Weiss; Lois U. Nwakanma; Stuart B. Russell; John V. Conte; Ashish S. Shah
BACKGROUND: Despite 40 years of heart transplantation, the optimal atrial anastomotic technique remains unclear. The United Network for Organ Sharing (UNOS) database provides a unique and novel opportunity to address this question by examining survival in a large cohort of patients undergoing orthotopic heart transplantation (OHT). We hypothesized that, when the issue is examined on a large scale, no difference in survival would exist between techniques.

METHODS: We retrospectively reviewed first-time adult OHT in the UNOS database to identify 14,418 patients undergoing OHT between 1999 and 2005. Primary stratification was between those who underwent bicaval vs biatrial techniques. Baseline demographic and clinical factors were also recorded. The primary end point was mortality from all causes during the study period. Secondary outcomes included length of hospital stay (LOS) and need for permanent pacemaker placement (PP). Post-transplant survival was compared between groups using a Cox proportional hazards regression model.

RESULTS: Of the 11,931 patients who met inclusion criteria between 1999 and 2005, 5,207 (44%) underwent the bicaval anastomotic technique. The bicaval and biatrial groups were well matched for gender, donor age, ischemic time, pulmonary vascular resistance, transpulmonary gradient, cardiac index, body mass index, and pre-operative creatinine. Technique was not associated with survival during the study period (hazard ratio 1.06, p = 0.31). On multivariate analysis, age, gender, donor age, and ischemic time were independent predictors of mortality. The bicaval technique was associated with less need for post-operative PP (2.0% vs 5.3%, p < 0.001) and shorter LOS (19 vs 21 days, p < 0.001).

CONCLUSIONS: This study is the single largest series examining bicaval vs biatrial anastomotic techniques for OHT. We found no difference in survival between the two groups, although the bicaval technique was associated with shorter LOS and less need for pacemaker placement. Both techniques lead to equivalent survival in OHT.
Catheterization and Cardiovascular Interventions | 2006
Andrew D. Atiemo; John V. Conte; Alan W. Heldman
A number of techniques have been proposed for circulatory support in patients with severe right heart failure. We report on a patient with right ventricular (RV) infarction complicated by cardiogenic shock, who was resuscitated by the novel use of a percutaneous right ventricular assist device. The moribund patient had striking hemodynamic and clinical improvement after placement of a right atrial to pulmonary artery assist circuit. RV decompression for 3 days was accompanied by recovery of contractile function, and the patient was successfully weaned from the device. Future trials are needed to assess the effectiveness of the percutaneous ventricular assist device in patients with RV failure.
American Journal of Cardiology | 2002
Patricia P. Chang; Marc Sussman; John V. Conte; Maura A. Grega; Steven P. Schulman; Gary Gerstenblith; Nae Yuh Wang; Anne Capriotti; James L. Weiss
It is not known whether the extent and time course of myocardial dysfunction and recovery differ between patients undergoing off-pump coronary artery bypass surgery (CABG) and on-pump CABG. We conducted a nonrandomized pilot study to assess myocardial stunning and the time course of functional recovery in patients undergoing off- and on-pump CABG. Ventricular function was assessed using myocardial contrast echocardiography because contrast-enhanced images provide better endocardial border definition and patients who have undergone cardiac surgery generally have limited acoustic windows after operation.

Adult patients who were eligible for both off- and on-pump CABG were recruited for this study. The type of CABG was determined before study recruitment, and 2 surgeons performed all operations. Patients were excluded if they were considered ineligible for off-pump CABG (e.g., hemodynamic instability, target distal vessels <1.5 mm, vessels inadequately visualized, or heavily calcified vessels), or if they were suited only for off-pump CABG (e.g., severe lung, renal, or aortic disease). Patients were also excluded if they had undergone prior CABG, had hemodynamic instability, had associated severe cardiac or medical conditions, could not return for follow-up 3 to 4 weeks after surgery, or had prestudy echocardiographic images that were technically unsatisfactory for detailed assessment of left ventricular (LV) function. The first 12 qualifying patients were enrolled in the study: 6 underwent off- and 6 on-pump CABG. Both groups of patients received the same, standardized, intraoperative and perioperative care. In patients who had undergone conventional CABG, myocardial preservation during cardiopulmonary bypass (uniformly blood cardioplegia), topical cold drip, and aortic cross clamping were standardized. ST-segment changes were monitored with electrocardiographic recordings after operation.
The outcomes for this analysis were LV function measured by ejection fraction (EF) and assessed using myocardial contrast echocardiography, and biochemical markers of procedural ischemic damage measured with cardiac enzyme markers including creatinine kinase (CK), the MB isoform of CK (CK-MB), and troponin I. Cardiac enzymes were measured before, and at 6, 12, and 24 hours after operation. Patients underwent a total of 4 contrast-enhanced echocardiographic studies: before, 1 day after, 3 days after, and 1 month after operation. Because of the fresh sternal surgical wound, only apical windows (4- and 2-chamber views) were available for echocardiographic evaluation. After baseline precontrast echocardiographic images were acquired with standardized parameters, Optison contrast (3 ml) (Mallinckrodt, Inc., Hazelwood, Missouri) was infused intravenously at a rate of 80 ml/hour, and later adjusted as needed to obtain optimal myocardial opacification. Two observers, blinded to patient identification, the stage of the study, and each other's interpretation, read each echocardiographic study to assess global LV function and to score 5 anatomic regions (anterior, apex, lateral, inferior, and septum) with respect to regional LV function before and after contrast enhancement, using the semiquantitative methods recommended by the American Society of Echocardiography. Primary image analysis was limited to the contrast-enhanced images. Final scoring of images was the average of the EF estimates from the 2 observers. In the event the discrepancy between absolute EF estimates was >10%, a final estimate was obtained from a consensus reading by the blinded observers. Between-group differences in EF and cardiac enzymes were compared using analysis of variance for continuous variables with a normal distribution, and Wilcoxon's rank-sum test for those with a non-normal distribution. The changes over time in EF and enzymes were compared between groups using repeated-measures regression.
All scores from the 2 observers for echocardiographic image analyses were tested for interobserver variability using intraclass correlation tests. Changes in baseline-to-peak enzyme level were compared with changes in preoperative-to-postoperative EF in each CABG group, and changes in regional wall motion were compared among wall segments, with correlation tests (Spearman's ρ). Alpha levels of 0.05 based on a 2-tailed test were used to define statistical significance. The median age was 73 years in the group who had off-pump CABG and 58 years in the on-pump CABG group (Wilcoxon p = 0.03) (Table 1). Off-pump

From the Departments of Medicine and Surgery, The Johns Hopkins University School of Medicine, Baltimore, Maryland. This study was supported by Grant M01-RR00052 from the National Institutes of Health/National Center for Research Resources, Bethesda, Maryland. Dr. Chang's address is: Division of Cardiology, Carnegie 568, The Johns Hopkins Hospital, 600 North Wolfe Street, Baltimore, Maryland 21287. E-mail: [email protected]. Manuscript received October 19, 2001; revised manuscript received and accepted January 24, 2002.
The Annals of Thoracic Surgery | 2015
Daijiro Hori; Masahiro Ono; Thomas Rappold; John V. Conte; Ashish S. Shah; Duke E. Cameron; Hideo Adachi; Allen D. Everett; Charles W. Hogue
BACKGROUND: Individualizing blood pressure targets could improve organ perfusion compared with current practices. In this study we assessed whether hypotension defined by cerebral autoregulation monitoring, vs standard definitions, was associated with elevated plasma levels of the brain-specific injury biomarker glial fibrillary acidic protein (GFAP).

METHODS: Plasma GFAP levels were measured in 121 patients undergoing cardiac operations after anesthesia induction, at the conclusion of the operation, and on postoperative day 1. Cerebral autoregulation was monitored during the operation with the cerebral oximetry index, which correlates low-frequency changes in mean arterial pressure (MAP) and regional cerebral oxygen saturation. Blood pressure was recorded every 15 minutes in the intensive care unit. Hypotension was defined based on autoregulation data as an MAP below the optimal MAP (the MAP at the lowest cerebral oximetry index) and based on standard definitions (systolic blood pressure decrement >20% or >30% from baseline, or <100 mm Hg, or both).

RESULTS: MAP (mean ± standard deviation) in the intensive care unit was 74 ± 7.3 mm Hg; optimal MAP was 78 ± 12.8 mm Hg (p = 0.008). The incidence of hypotension varied from 22% to 37% based on standard definitions but occurred in 54% of patients based on the cerebral oximetry index (p < 0.001). There was no relationship between standard definitions of hypotension and plasma GFAP levels, but MAP below optimal was positively related to postoperative day 1 GFAP levels (coefficient, 1.77; 95% confidence interval, 1.27 to 2.48; p = 0.001) after adjusting for GFAP levels at the conclusion of the operation and low cardiac output syndrome.

CONCLUSIONS: Individualizing blood pressure management using cerebral autoregulation monitoring may better ensure brain perfusion than current practice.
The Annals of Thoracic Surgery | 2012
Timothy J. George; Claude A. Beaty; Gregory A. Ewald; Stuart D. Russell; Ashish S. Shah; John V. Conte; Glenn J. Whitman; Scott C. Silvestry
BACKGROUND: Although several studies have examined factors affecting survival after orthotopic heart transplantation (OHT), few have evaluated the impact of reoperative sternotomy. We undertook this study to examine the incidence and impact of repeat sternotomies on OHT outcomes.

METHODS: We conducted a retrospective review of all adult OHT from 2 institutions. Primary stratification was by the number of prior sternotomies. The primary outcome was survival. Secondary outcomes included blood product utilization and commonly encountered postoperative complications. Multivariable Cox proportional hazards regression models examined mortality, while linear regression models examined blood utilization.

RESULTS: From January 1995 to October 2011, 631 OHT were performed. Of these, 25 (4.0%) were redo OHT and 182 (28.8%) were bridged to transplant with a ventricular assist device; 356 (56.4%) had undergone at least 1 prior sternotomy. On unadjusted analysis, reoperative sternotomy was associated with decreased 90-day (98.5% vs 90.2%, p<0.001), 1-year (93.1% vs 79.6%, p<0.001), and 5-year (80.4% vs 70.1%, p=0.002) survival. This difference persisted on multivariable analysis at 90 days (hazard ratio [HR] 2.99, p=0.01), 1 year (HR 2.98, p=0.002), and 5 years (HR 1.62, p=0.049). The impact of an increasing number of prior sternotomies was negligible. On multivariable analysis, an increasing number of prior sternotomies was associated with increased intraoperative blood product utilization. Increasing blood utilization was associated with decreased 90-day, 1-year, and 5-year survival.

CONCLUSIONS: Reoperative sternotomy is associated with increased mortality and blood utilization after OHT. Patients with more than 1 prior sternotomy do not experience additional increased mortality. Carefully selected patients with multiple prior sternotomies have decreased but acceptable outcomes.
The Journal of Thoracic and Cardiovascular Surgery | 2016
J. Trent Magruder; Allen Young; Joshua C. Grimm; John V. Conte; Ashish S. Shah; Kaushik Mandal; Christopher M. Sciortino; Kenton J. Zehr; Duke E. Cameron; Joel Price
BACKGROUND: Despite evidence that bilateral internal thoracic arteries (ITAs) improve long-term survival after coronary artery bypass grafting (CABG), uptake of this technique remains low. We directly compared bilateral ITA graft configurations and examined long-term outcomes.

METHODS: We reviewed 762 patients who underwent CABG using bilateral ITA grafts at our institution between 1997 and 2014. The outcomes were mortality and a composite revascularization end point defined as need for percutaneous coronary intervention or repeat CABG. Adjusted subgroup analyses were performed using propensity score-adjusted Cox proportional hazards modeling.

RESULTS: The cohort was divided into 4 groups: in situ left ITA (LITA) anastomosed to the left anterior descending artery (LAD) with in situ right ITA (RITA) anastomosed to the left coronary circulation (239 patients); in situ LITA-LAD with in situ RITA-right coronary circulation (239 patients); in situ RITA-LAD with in situ LITA-left coronary circulation (185 patients); and in situ LITA-LAD with a free RITA as a composite graft with inflow from the LITA or a saphenous vein graft (99 patients). Over a median follow-up of 1,128 days, there were 47 deaths, 58 late percutaneous coronary interventions, and 7 repeat CABG procedures. Unadjusted Kaplan-Meier analysis revealed a difference in need for repeat revascularization among the 4 groups (log-rank P = .049). However, after statistical adjustment, graft configuration was not an independent predictor of repeat revascularization or death.

CONCLUSIONS: Bilateral ITA graft configuration has no independent effect on need for repeat revascularization or long-term survival. Therefore, the simplest technique, determined by individual patient characteristics, should be selected.