Markus Kamler
University of Duisburg-Essen
Publications
Featured research published by Markus Kamler.
The Annals of Thoracic Surgery | 2008
Heinz Jakob; Konstantinos Tsagakis; Paschalis Tossios; Parwis Massoudy; Matthias Thielmann; Thomas Buck; Holger Eggebrecht; Markus Kamler
BACKGROUND To possibly prevent late complications after classic type A aortic dissection repair, the radical concept of ascending/arch replacement with simultaneous antegrade descending stent grafting using a hybrid prosthesis was applied and compared with conventional repair leaving the distal false lumen untreated. METHODS Between January 2001 and October 2007, of 71 consecutive patients with acute type A aortic dissection (AAAD), 45 had DeBakey type I dissection and underwent emergency surgery within 24 hours after onset of symptoms. These patients were separated into group 1 (n = 23) undergoing conventional surgery and group 2 (n = 22) undergoing combined repair with antegrade stent grafting. RESULTS Patients were comparable for baseline characteristics, but more group 2 patients had severely compromised hemodynamics (p = 0.05) and cerebral malperfusion at arrival (p < 0.01). Intraoperative and postoperative characteristics were similar, with an overall hospital mortality of 16% (5 [22%] versus 2 [9%], group 1 versus group 2; p = 0.22). At a mean follow-up time of 48 months for group 1 versus 23 months for group 2 (p < 0.01), late mortality did not differ between groups (p = 0.38) and was mainly related to additional surgical procedures and persisting neurologic sequelae and not to the aortic pathology. Persisting distal false lumen patency was observed in 89% of group 1 versus 10% of group 2 patients (p < 0.01). CONCLUSIONS This hybrid approach to patients with type I acute aortic dissection is technically feasible without increasing the operative risk and offers the chance of persistent occlusion of the distal false lumen.
The Annals of Thoracic Surgery | 2009
Matthias Thielmann; Daniel Wendt; Holger Eggebrecht; Philipp Kahlert; Parwis Massoudy; Markus Kamler; Raimund Erbel; Heinz Jakob; Stefan Sack
BACKGROUND We sought to determine whether transcatheter aortic valve implantation is a reasonable treatment option in patients with a very or extremely high risk for conventional aortic valve replacement, presenting with a logistic EuroSCORE greater than 30% or a Society of Thoracic Surgeons score greater than 15%. METHODS Between May 2005 and November 2008, 39 of 85 transcatheter aortic valve implantation patients with a very high risk for aortic valve replacement underwent either transfemoral (n = 15) or transapical (n = 24) transcatheter aortic valve implantation with a mean estimated logistic EuroSCORE of 44.2% +/- 12.6% (mean +/- standard deviation) and a Society of Thoracic Surgeons score of 17.9% +/- 6.1%. Transcatheter aortic valve implantation was performed in a hybrid operative theater using the Cribier-Edwards or Edwards SAPIEN prosthesis. RESULTS Valve implantation was successful in 97% of the patients. Operative mortality was 2.6%, and mortality at 30 days was 17.9%. After valve implantation, hemodynamic improvement was assessed by decreased mean pressure gradient (p < 0.001) and increased aortic valve area (p < 0.001), accompanied by improved New York Heart Association functional status (p < 0.01). Actuarial survival was 74.4% at 3 months, 74.4% at 6 months, and 64.1% at 12 months of follow-up. Echocardiography revealed aortic regurgitation in 58% of the patients during hospital stay, 43% at 6 months of follow-up, and 40% at 12 months of follow-up, but no structural valve deterioration could be observed during the complete follow-up period. CONCLUSIONS Transcatheter aortic valve implantation in patients with severe aortic stenosis and a very high risk for aortic valve replacement is feasible and may be a reasonable treatment option in these patients.
Circulation | 2006
Matthias Thielmann; Parwis Massoudy; Markus Neuhäuser; Konstantinos Tsagakis; Günter Marggraf; Markus Kamler; Klaus Mann; Raimund Erbel; Heinz Jakob
Background— Cardiac troponin I (cTnI) is a highly sensitive and specific biomarker that has been shown to predict patient outcome when measured pre- and postoperatively in elective coronary artery bypass grafting (CABG). Whether preoperatively elevated cTnI levels similarly predict the outcome in patients undergoing emergency CABG with acute myocardial infarction (AMI) is currently unknown. Methods and Results— A possible correlation between preoperative cTnI and in-hospital mortality and major adverse cardiac events (MACE) was investigated in 57 patients with ST-elevation AMI (STEMI) in group 1 and 197 patients with non-ST-elevation AMI (NSTEMI) in group 2, who were operated on within 24 hours after onset of symptoms. The primary study end point was all-cause in-hospital mortality. Secondary end points were low cardiac output syndrome (LCOS) and hospital course. cTnI levels on admission were higher in group 1 compared with group 2 (7.1±1.8 versus 1.4±1.8 ng/mL; P<0.001). Overall in-hospital mortality was higher in group 1 compared with group 2 (14.3% versus 4.1%; odds ratio [OR], 3.9, 95% confidence interval [CI], 1.3 to 12.3; P<0.01). LCOS occurred in 16/57 (28.1%) and 18/197 (9.1%) patients, respectively (OR, 3.9, 95% CI, 1.7 to 8.8; P<0.001). Postoperative ventilation time, intensive care unit stay, and hospital stay were significantly longer in group 1 versus group 2. Multivariate logistic regression analyses revealed preoperative cTnI as the strongest independent predictor for in-hospital mortality (P<0.001) and MACE (P<0.001) in all AMI patients, regardless of whether ST-elevation was included as an additional risk factor. Conclusions— Preoperative cTnI measurement before emergency CABG appears to be a powerful and independent determinant of in-hospital mortality and MACE in acute STEMI and NSTEMI.
Circulation | 2006
Matthias Thielmann; Rainer Leyh; Parwis Massoudy; Markus Neuhäuser; I. Aleksic; Markus Kamler; Ulf Herold; Jarowit Piotrowski; Heinz Jakob
Background— Whether previous percutaneous coronary intervention (PCI) increases perioperative risk during coronary artery bypass grafting (CABG) remains a matter of debate. We sought to determine the impact of previous PCI on patient outcome after elective CABG. Methods and Results— Between January 2000 and January 2005, 2626 consecutive patients undergoing first-time isolated elective CABG as the primary revascularization procedure (group 1) were evaluated for in-hospital mortality and major adverse cardiac events (MACEs) and were compared with 360 patients after single PCI (group 2) and with 289 patients after multiple PCI sessions (group 3) before elective CABG. Unadjusted univariate and risk-adjusted multivariate logistic-regression analysis revealed previous multiple PCIs to be strongly associated with in-hospital mortality (odds ratio [OR], 2.24; 95% confidence interval [CI], 1.52 to 3.21; P<0.001) and MACEs (OR, 2.28; 95% CI, 1.38 to 3.59; P<0.001). To control for selection bias, propensity-score matching based on 13 patient characteristics and preoperative risk factors was performed separately, comparing group 1 versus group 2 and group 1 versus group 3. After propensity matching, conditional logistic-regression analysis confirmed previous multiple PCIs to be strongly associated with in-hospital mortality (OR, 3.01; 95% CI, 1.51 to 5.98; P<0.0017) and MACEs (OR, 2.31; 95% CI, 1.45 to 3.67; P<0.0004). Conclusions— In patients with a history of multiple PCI sessions, the perioperative risk of in-hospital mortality and MACEs during subsequent elective CABG is increased.
European Journal of Cardio-Thoracic Surgery | 2008
Daniel Wendt; Matthias Thielmann; Thomas Buck; Rolf-Alexander Jánosi; Torsten Bossert; Nikolaus Pizanis; Markus Kamler; Heinz Jakob
BACKGROUND Aortic valve replacement (AVR) with extracorporeal circulation (ECC) is currently the treatment of choice for symptomatic aortic stenosis. However, patients with multiple high-risk comorbid conditions may benefit from the reduced ECC time, and thus reduced myocardial ischemia, afforded by sutureless AVR. We describe the initial experience and 1-year results of our first 3F-Enable AVR implants. METHODS Between 09/05 and 12/05, six patients (age 74+/-1.8 years; three females) with symptomatic aortic stenosis (NYHA III) underwent AVR with an equine pericardial and nitinol-stented sutureless prosthesis. For additional safety, up to three stay sutures were placed. Echocardiography was performed preoperatively, intraoperatively, and at 6- and 12-month follow-up. Clinical data, adverse events and patient outcome were recorded prospectively. RESULTS Prosthesis sizes were 27 mm (n=3), 25 mm (n=1), 23 mm (n=1) and 21 mm (n=1). ECC time was 87+/-32 min; aortic clamp time was 56+/-24 min. Prosthesis deployment time was 148 +/- 173 s. There were no intraoperative deaths or complications. At 12-month follow-up, mean pressure gradients (MPG) were 6.8+/-3.5 mmHg and aortic valve area (AVA) was 2.2 +/- 0.5 cm(2). One patient underwent successful redo AVR after 8 months due to severe paravalvular leakage (PVL), and one patient died due to lung cancer 10 months after surgery. At 12-month follow-up, four of six patients were alive and asymptomatic (NYHA I) with the 3F-Enable aortic valve prosthesis; however, one patient showed mild paravalvular leakage. CONCLUSIONS These first 1-year follow-up data suggest the feasibility of this new concept of sutureless aortic valve implantation. However, severe aortic insufficiency at 8 months and paravalvular leakage at 1-year follow-up should prompt further procedural and device enhancements.
Chest | 2005
Matthias Thielmann; Parwis Massoudy; Markus Neuhäuser; Stephan Knipp; Markus Kamler; Jarowit Piotrowski; Klaus Mann; Heinz Jakob
STUDY OBJECTIVES Elevated levels of cardiac troponin I (cTnI) have been associated with adverse short-term and long-term outcomes in acute coronary syndrome (ACS) patients and in patients who underwent coronary artery bypass grafting (CABG); however, the prognostic implications of preoperative cTnI determination have not been investigated to date. DESIGN AND SETTING Retrospective study in a department of cardiothoracic surgery of a university hospital. PATIENTS AND METHODS A possible correlation between preoperative cTnI levels and major adverse cardiac events (MACE) and in-hospital mortality in CABG patients with non-ST-segment elevation ACS (NSTE-ACS) was investigated. cTnI was determined in 1,978 of 3,124 consecutive CABG patients. Among these, 1,592 patients had preoperative cTnI levels < 0.1 ng/mL and therefore served as control subjects (group 1), 265 patients had NSTE-ACS with cTnI levels from 0.11 to 1.5 ng/mL (group 2), and 121 patients had NSTE-ACS with cTnI levels > 1.5 ng/mL (group 3). cTnI levels, clinical data, MACE, and in-hospital mortality were recorded prospectively. Logistic regression and receiver operating characteristic analyses were applied to determine prognostic cutoff values of cTnI. RESULTS Perioperative myocardial infarction was found in 5.8% of the patients in group 1, 8.3% of the patients in group 2 (odds ratio [OR], 1.5; 95% confidence interval [CI], 0.9 to 2.5), and 18.2% of the patients in group 3 (OR, 3.6; 95% CI, 2.1 to 6.2; p < 0.0001, Cochran-Armitage trend test). Low cardiac output syndrome occurred in 1.5% of patients in group 1, 4.2% of patients in group 2 (OR, 2.8; 95% CI, 1.3 to 6.1), and 10.9% of patients in group 3 (OR, 6.5; 95% CI, 2.9 to 14.4; p < 0.0001). In-hospital mortality was 1.5% in group 1, 3.0% in group 2 (OR, 2.0; 95% CI, 0.8 to 4.8), but 6.6% in group 3 (OR, 4.6; 95% CI, 1.9 to 11.1; p < 0.0001). Univariate and multivariate logistic regression analyses identified cTnI as the strongest preoperative predictor for MACE and in-hospital mortality, respectively. CONCLUSIONS Preoperative cTnI measurement before CABG appears to be a powerful and independent determinant of short-term surgical risk in patients with NSTE-ACS.
European Journal of Cardio-Thoracic Surgery | 2010
Nikolaus Pizanis; Jens Heckmann; Konstantinos Tsagakis; Paschalis Tossios; Parwis Massoudy; Daniel Wendt; Heinz Jakob; Markus Kamler
OBJECTIVES The scarcity of donor lungs has led to more generous acceptance of organs under extended-donor criteria. However, long-term effects have to be monitored to redefine present practice. In this study, we investigated the impact of donor age of 55 years or older in lung transplantation. METHODS In this retrospective study, 186 consecutive double-lung transplantation procedures from January 2000 to December 2008 were evaluated. A total of 19 recipients received lungs from donors aged 55 years or older (range 55-69 years) (group A) and 167 received lungs from younger donors (range 8-54 years) (group B). In-hospital mortality, intensive care unit (ICU) stay, rejection episodes, lung function and survival up to 5 years were evaluated. RESULTS In-hospital mortality was similar in both groups (group A: 10.5%; group B: 13.7%). Postoperative ICU stay was 19+/-33 days versus 17+/-34 days (A vs B). Differences in rejection episodes, postoperative lung function up to 5 years, and overall cumulative 5-year survival (group A: 52.4%; group B: 50.9%) did not reach statistical significance. However, a trend toward increased bronchiolitis obliterans syndrome (BOS) prevalence and reduced lung function was noted in group A. Causes of death did not differ between the groups. CONCLUSIONS Donor age > or =55 years does not compromise immediate and long-term results after lung transplantation, although long-term observation of patients receiving such an organ suggests earlier lung dysfunction. Given the rising need for organs, lungs from donors aged 55 years or older have to be considered for transplantation. However, acceptance should be based on donor lung evaluation and individual recipient needs. Long-term outcomes beyond 5 years need to be further investigated.
Journal of the American College of Cardiology | 2015
Masahide Harada; Artavazd Tadevosyan; Xiao-Yan Qi; Jiening Xiao; Tao Liu; Niels Voigt; Matthias Karck; Markus Kamler; Itsuo Kodama; Toyoaki Murohara; Dobromir Dobrev; Stanley Nattel
BACKGROUND Atrial fibrillation (AF) is associated with metabolic stress, which activates adenosine monophosphate-regulated protein kinase (AMPK). OBJECTIVES This study sought to examine the AMPK response to AF and associated metabolic stress, along with consequences for atrial cardiomyocyte Ca(2+) handling. METHODS Calcium ion (Ca(2+)) transients (CaTs) and cell shortening (CS) were measured in dog and human atrial cardiomyocytes. AMPK phosphorylation and AMPK association with Ca(2+)-handling proteins were evaluated by immunoblotting and immunoprecipitation. RESULTS CaT amplitude and CS decreased after 4 min of glycolysis inhibition (GI) but returned to baseline at 8 min, suggesting cellular adaptation to metabolic stress, potentially due to AMPK activation. GI increased AMPK-activating phosphorylation, and an AMPK inhibitor, compound C (CompC), abolished the adaptation of CaT and CS to GI. The AMPK activator 5-aminoimidazole-4-carboxamide ribonucleotide (AICAR) increased CaT amplitude and CS, reversing the CompC-induced decreases in CaT and CS. CompC decreased L-type calcium channel current (ICa,L), along with ICa,L-triggered CaT amplitude and sarcoplasmic reticulum (SR) Ca(2+) content under voltage clamp conditions in dog cells and suppressed CaT and ICa,L in human cardiomyocytes. Small interfering ribonucleic acid-based AMPK knockdown decreased CaT amplitude in neonatal rat cardiomyocytes. L-type Ca(2+) channel α subunits coimmunoprecipitated with AMPKα. Atrial AMPK-activating phosphorylation was enhanced by 1 week of electrically maintained AF in dogs; fractional AMPK phosphorylation was increased in patients with paroxysmal AF and reduced in patients with longstanding persistent AF. CONCLUSIONS AMPK is activated by metabolic stress and AF, and helps maintain atrial ICa,L, Ca(2+) handling, and cell contractility. AMPK contributes to the atrial compensatory response to AF-related metabolic stress; AF-related metabolic responses may be an interesting new therapeutic target.
European Journal of Cardio-Thoracic Surgery | 2011
Kevin Pilarczyk; Brigitte R. Osswald; Nikolaus Pizanis; Konstantinos Tsagakis; Parwis Massoudy; Jens Heckmann; Heinz Jakob; Markus Kamler
OBJECTIVES Shortage of donors is one of the major limitations in lung transplantation (LuTX), and an aggressive expansion of criteria for donor selection has been proposed. This study evaluates the outcome of recipients of pulmonary grafts from resuscitated donors compared with recipients of grafts from non-resuscitated donors. METHODS We retrospectively analyzed the donor and recipient charts of all double LuTX performed at our institution between 2000 and 2008 with regard to the performance of donor cardiopulmonary resuscitation (CPR). RESULTS Out of 186 eligible transplants, 22 patients (11.8%) received lungs from donors who had suffered cardiac arrest (CA) and subsequent CPR. Mean duration of CPR was 15.2 ± 11.3 min. Terminal laboratory profiles of CPR donors and non-CPR donors were similar, as were ventilation time, paO(2)/FiO(2) ratio before organ harvesting, and chest X-ray findings. CPR-donor status did not affect the following indices of graft function: length of postoperative ventilation, paO(2)/FiO(2) ratio up to 48 h, and lung function up to 60 months. Length of intensive care and hospital stay, need for inotropic support, and 30-day mortality were not significantly different between transplantation of CPR-donor and non-CPR-donor lungs. One- and 3-year survival rates were comparable as well, at 84.4% and 66.3% for CPR donors versus 88.5% and 69.8% for non-CPR donors. CONCLUSIONS This study indicates that transplantation of lungs from resuscitated donors may not affect outcome after LuTX. Therefore, a donor history of CA should not automatically preclude LuTX.
European Journal of Cardio-Thoracic Surgery | 2011
Konstantinos Tsagakis; Paschalis Tossios; Markus Kamler; Jaroslav Benedik; Dorgam Natour; Holger Eggebrecht; Jarowit Piotrowski; Heinz Jakob
OBJECTIVE The DeBakey classification was used to discriminate the extent of acute aortic dissection (AD) and was correlated with long-term outcome and re-intervention rate. A slight modification of the type II subgroup definition was applied by incorporating the aortic arch when the dissection process was fully resectable. METHODS Between January 2001 and March 2010, 118 patients (64% male, mean age 59 years) underwent surgery for acute AD; 74 were operated on for type I and 44 for type II AD. Complete resection of all entry sites was performed, including antegrade stent grafting for proximal descending lesions. RESULTS Patients were comparable with respect to demographics and preoperative hemodynamic status. Isolated ascending replacement, hemiarch replacement, or total arch replacement was performed in 7%, 26%, and 67% of type I patients versus 27%, 37%, and 36% of type II patients, respectively. Additional descending stent grafting was performed in 33/74 (45%) type I patients. In-hospital mortality was 14% overall: 16% (12/74) in type I versus 9% (4/44) in type II, p=0.405. After 5 years, the estimated survival rate was 63% in type I versus 80% in type II, p=0.135. In type II, no distal aortic re-intervention was required. In type I, freedom from distal re-intervention was 82% in patients with additional stent grafting versus 53% in patients without, p=0.022. CONCLUSIONS The slightly modified DeBakey classification accurately reflects late outcome and the probability of aortic re-intervention. Thus, in type II patients, the aorta appears to be healed, with no later re-operation or re-intervention required in this series.