Stanislao Morgera
Charité
Publications
Featured research published by Stanislao Morgera.
Clinical Journal of The American Society of Nephrology | 2007
Sean M. Bagshaw; Shigehiko Uchino; Rinaldo Bellomo; Hiroshi Morimatsu; Stanislao Morgera; Miet Schetz; Ian Tan; Catherine S. C. Bouman; Etienne Macedo; Noel Gibney; Ashita Tolwani; Heleen M. Oudemans-van Straaten; Claudio Ronco; John A. Kellum
Sepsis is the most common cause of acute kidney injury (AKI) in critical illness, but there is limited information on septic AKI. A prospective, observational study of critically ill patients with septic and nonseptic AKI was performed from September 2000 to December 2001 at 54 hospitals in 23 countries. A total of 1753 patients were enrolled. Sepsis was considered the cause in 833 (47.5%); the predominant sources of sepsis were chest and abdominal (54.3%). Septic AKI was associated with greater aberrations in hemodynamics and laboratory parameters, greater severity of illness, and higher need for mechanical ventilation and vasoactive therapy. There was no difference in enrollment kidney function or in the proportion who received renal replacement therapy (RRT; 72 versus 71%; P = 0.83). Oliguria was more common in septic AKI (67 versus 57%; P < 0.001). Septic AKI had a higher in-hospital case-fatality rate compared with nonseptic AKI (70.2 versus 51.8%; P < 0.001). After adjustment for covariates, septic AKI remained associated with higher odds for death (1.48; 95% confidence interval 1.17 to 1.89; P = 0.001). Median (IQR) duration of hospital stay for survivors (37 [19 to 59] versus 21 [12 to 42] d; P < 0.0001) was longer for septic AKI. There was a trend to lower serum creatinine (106 [73 to 158] versus 121 [88 to 184] μmol/L; P = 0.01) and RRT dependence (9 versus 14%; P = 0.052) at hospital discharge for septic AKI. Patients with septic AKI were sicker and had a higher burden of illness and greater abnormalities in acute physiology. Patients with septic AKI had an increased risk for death and longer duration of hospitalization yet showed trends toward greater renal recovery and independence from RRT.
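As a rough illustration of the kind of effect estimate quoted above, the sketch below reconstructs approximate 2×2 counts from the reported group sizes and case-fatality rates and computes a crude odds ratio with a Woolf-type 95% confidence interval. This is only an unadjusted sanity check; the odds ratio of 1.48 reported in the abstract comes from a covariate-adjusted model that is not reproduced here.

```python
import math

# Counts reconstructed (approximately) from the abstract:
# 833 septic AKI patients with 70.2% in-hospital mortality,
# 1753 - 833 = 920 nonseptic AKI patients with 51.8% mortality.
septic_deaths = round(833 * 0.702)       # ~585
septic_survivors = 833 - septic_deaths
nonseptic_deaths = round(920 * 0.518)    # ~477
nonseptic_survivors = 920 - nonseptic_deaths

# Crude odds ratio and Woolf (log-scale) 95% confidence interval.
odds_ratio = (septic_deaths / septic_survivors) / (nonseptic_deaths / nonseptic_survivors)
se_log_or = math.sqrt(1 / septic_deaths + 1 / septic_survivors +
                      1 / nonseptic_deaths + 1 / nonseptic_survivors)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

# Crude OR is roughly 2.2, larger than the covariate-adjusted OR of 1.48
# reported in the study, which comes from a multivariable model.
print(f"crude OR {odds_ratio:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```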
Critical Care Medicine | 2004
Shigehiko Uchino; Gordon S. Doig; Rinaldo Bellomo; Hiroshi Morimatsu; Stanislao Morgera; Miet Schetz; Ian Tan; Catherine S. C. Bouman; Etienne Macedo; Noel Gibney; Ashita Tolwani; Claudio Ronco; John A. Kellum
Objective: According to recent research, diuretics may increase mortality in acute renal failure patients. The administration of diuretics in such patients has been discouraged. Our objective was to determine the impact of diuretics on the mortality rate of critically ill patients with acute renal failure. Design: Prospective, multiple-center, multinational epidemiologic study. Setting: Intensive care units from 54 centers and 23 countries. Patients: Patients were 1,743 consecutive patients who either were treated with renal replacement therapy or fulfilled predefined criteria for acute renal failure. Interventions: Three distinct multivariate models were developed to assess the relationship between diuretic use and subsequent mortality: a) a propensity score adjusted multivariate model containing terms previously identified to be important predictors of outcome; b) a new propensity score adjusted multivariate model; and c) a multivariate model developed using standard methods, compensating for collinearity. Measurements and Main Results: Approximately 70% of patients were treated with diuretics at study inclusion. Mean age was 68 years and mean Simplified Acute Physiology Score II was 47. Severe sepsis/septic shock (43.8%), major surgery (39.1%), low cardiac output (29.7%), and hypovolemia (28.2%) were the most common conditions associated with the development of acute renal failure. Furosemide was the most common diuretic used (98.3%). Combination therapy was used in only 98 patients. In all three models, diuretic use was not associated with a significantly increased risk of mortality. Conclusions: Diuretics are commonly prescribed in critically ill patients with acute renal failure, and their use is not associated with higher mortality. There is full equipoise for a randomized controlled trial of diuretics in critically ill patients with renal dysfunction.
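The "propensity score adjusted multivariate model" mentioned above follows a generic two-step pattern. The sketch below shows that pattern on purely synthetic data; the covariates, variable names, and model specification are hypothetical and are not the study's actual analysis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1743  # cohort size from the abstract; the data below are synthetic

# Hypothetical covariates: age, SAPS II score, oliguria indicator.
X = np.column_stack([
    rng.normal(68, 12, n),        # age (years)
    rng.normal(47, 15, n),        # SAPS II
    rng.integers(0, 2, n),        # oliguria (0/1)
])
diuretic = rng.integers(0, 2, n)  # treatment indicator (synthetic)
died = rng.integers(0, 2, n)      # outcome indicator (synthetic)

# Step 1: propensity score = P(diuretic use | covariates).
ps_model = LogisticRegression(max_iter=1000).fit(X, diuretic)
propensity = ps_model.predict_proba(X)[:, 1]

# Step 2: outcome model for mortality, adjusted for the propensity score.
X_outcome = np.column_stack([diuretic, propensity])
outcome_model = LogisticRegression(max_iter=1000).fit(X_outcome, died)
print("PS-adjusted log-odds for diuretic use:", outcome_model.coef_[0][0])
```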
Intensive Care Medicine | 1997
Peter Heering; Stanislao Morgera; F. J. Schmitz; G. Schmitz; R. Willers; H. P. Schultheiss; Bodo E. Strauer; Bernd Grabensee
Objectives: To determine whether continuous venovenous hemofiltration leads to extraction of tumor necrosis factor alpha (TNFα) and cytokines from the circulation of critically ill patients with sepsis and acute renal failure and to quantitate the clearance and the removal rate of these cytokines and their effect on serum cytokine concentrations. Design: Prospective, controlled study in patients with continuous venovenous hemofiltration (24 l/24 h) using a polysulphone membrane in patients with acute renal failure. Patients: 33 ventilated patients with acute renal failure of septic (n = 18) and cardiovascular origin (n = 15) were studied. Interventions: Hemodynamic monitoring and collection of blood and ultrafiltrate samples before and during the first 72 h of continuous hemofiltration. Measurements and main results: Cardiovascular hemodynamics (Swan-Ganz catheter), Acute Physiology and Chronic Health Evaluation II score, creatinine, electrolytes, and blood urea nitrogen were recorded daily. Cytokines (TNFα, TNFα-RII, interleukin (IL) 1β, IL1RA, IL2, IL2R, IL6, IL6R, IL8, IL10) were measured in prefilter blood and in ultrafiltrate immediately preceding and 12, 24, 48, and 72 h after initiating continuous venovenous hemofiltration (CVVH). Septic patients showed elevated cardiovascular values for cardiac output (7.2 ± 2.1 l/min), cardiac index (4.2 ± 1.3 l/min per m²), and stroke volume (67 ± 23 ml) and reduced values for systemic vascular resistance (540 ± 299 dyn·s·cm⁻⁵). All hemodynamic values normalized within the first 24 h after initiating CVVH treatment. TNFα was 1833 ± 1217 pg/ml in septic patients and 42.9 ± 6.3 pg/ml in nonseptic patients (p < 0.05) prior to CVVH. TNFα was detected in ultrafiltrate but did not decrease in blood during treatment with CVVH. There was no difference in IL 1β between septic (3.8 ± 1.9 pg/ml) and nonseptic patients (1.7 ± 0.5 pg/ml). No significant elimination of cytokines was achieved in the present study by CVVH treatment. Conclusions: These findings demonstrate that CVVH can remove TNFα and certain cytokines from the circulation of critically ill patients. Cardiovascular hemodynamics seemed to improve in septic patients after induction of hemofiltration treatment, although there was no evidence that extracorporeal removal of cytokines achieved a reduction in blood levels. The study indicates that low volume continuous hemofiltration with polysulphone membranes in patients with acute renal failure is not able to induce significant removal of cytokines.
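For reference, the clearance and removal-rate quantities described above follow the standard definitions for convective solute removal during hemofiltration. The sketch below implements those textbook formulas; the concentrations in the example are illustrative assumptions, with only the 24 l/24 h filtration rate taken from the study design.

```python
def sieving_coefficient(c_ultrafiltrate, c_plasma):
    """Fraction of the solute crossing the membrane (C_uf / C_plasma)."""
    return c_ultrafiltrate / c_plasma

def convective_clearance_ml_min(c_ultrafiltrate, c_plasma, uf_rate_l_per_day):
    """Clearance (mL/min) = sieving coefficient x ultrafiltration rate."""
    uf_ml_min = uf_rate_l_per_day * 1000 / (24 * 60)
    return sieving_coefficient(c_ultrafiltrate, c_plasma) * uf_ml_min

def removal_rate_ng_h(c_ultrafiltrate_pg_ml, uf_rate_l_per_day):
    """Mass removed per hour (ng/h) = C_uf x hourly ultrafiltrate volume."""
    uf_ml_h = uf_rate_l_per_day * 1000 / 24
    return c_ultrafiltrate_pg_ml * uf_ml_h / 1000  # pg -> ng

# Illustrative numbers only: TNFalpha 1800 pg/mL in plasma, 300 pg/mL in
# ultrafiltrate, 24 L/24 h hemofiltration as in the study design.
print(convective_clearance_ml_min(300, 1800, 24))  # ~2.8 mL/min
print(removal_rate_ng_h(300, 24))                  # ~300 ng/h
```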
Journal of the American College of Cardiology | 2000
Stephan B. Felix; Alexander Staudt; Wolf V. Dörffel; Verena Stangl; Kurt Merkel; Manfred Pohl; Wolf D Döcke; Stanislao Morgera; Hans H. Neumayer; Klaus D. Wernecke; Gerd Wallukat; Karl Stangl; Gert Baumann
OBJECTIVES The objective of our study was to assess the hemodynamic effects of immunoadsorption (IA) and subsequent immunoglobulin G (IgG) substitution in comparison with the effects of conventional medical treatment in patients with dilated cardiomyopathy (DCM). BACKGROUND Various circulating cardiac autoantibodies have been detected among patients suffering from DCM. These antibodies are extractable by IA. METHODS Patients with DCM (n = 18, New York Heart Association III-IV, left ventricular ejection fraction <30%) who were on stable medication participated in the study. Hemodynamic measurements were performed using a Swan-Ganz thermodilution catheter. The patients were randomly assigned either to the treatment group with IA and subsequent IgG substitution (IA/IgG group, n = 9) or to the control group without IA/IgG (n = 9). In the IA/IgG group, the patients were initially treated with one IA session daily on three consecutive days. After the final IA session, 0.5 g/kg of polyclonal IgG was substituted. At one-month intervals, IA was then repeated for three further courses with one IA session daily on two consecutive days, until the third month. RESULTS After the first IA course and IgG substitution, cardiac index (CI) increased from 2.1 (±0.1) to 2.8 (±0.1) L/min/m² (p < 0.01) and stroke volume index (SVI) increased from 27.8 (±2.3) to 36.2 (±2.5) ml/m² (p < 0.01). Systemic vascular resistance (SVR) decreased from 1,428 (±74) to 997 (±55) dyn·s·cm⁻⁵ (p < 0.01). The improvement in CI, SVI and SVR persisted after three months. In contrast, hemodynamics did not change throughout the three months in the control group. CONCLUSIONS Immunoadsorption and subsequent IgG substitution improves cardiovascular function in DCM.
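The hemodynamic indices reported above (cardiac index, stroke volume index, systemic vascular resistance) are derived from thermodilution measurements using standard formulas. The sketch below applies those textbook formulas to illustrative inputs; none of the numbers are patient data from the trial.

```python
def cardiac_index(cardiac_output_l_min, bsa_m2):
    """CI (L/min/m^2) = cardiac output / body surface area."""
    return cardiac_output_l_min / bsa_m2

def stroke_volume_index(cardiac_output_l_min, heart_rate, bsa_m2):
    """SVI (mL/m^2) = stroke volume / BSA, with SV = CO / HR."""
    stroke_volume_ml = cardiac_output_l_min * 1000 / heart_rate
    return stroke_volume_ml / bsa_m2

def systemic_vascular_resistance(map_mmhg, cvp_mmhg, cardiac_output_l_min):
    """SVR (dyn.s.cm^-5) = 80 x (MAP - CVP) / CO."""
    return 80 * (map_mmhg - cvp_mmhg) / cardiac_output_l_min

# Illustrative inputs: CO 4.0 L/min, HR 80/min, BSA 1.9 m^2, MAP 85 mmHg, CVP 8 mmHg.
print(cardiac_index(4.0, 1.9))                   # ~2.1 L/min/m^2
print(stroke_volume_index(4.0, 80, 1.9))         # ~26 mL/m^2
print(systemic_vascular_resistance(85, 8, 4.0))  # ~1540 dyn.s.cm^-5
```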
Critical Care | 2005
Christoph Langenberg; Rinaldo Bellomo; Clive N. May; Li Wan; Moritoki Egi; Stanislao Morgera
Introduction: To assess changes in renal blood flow (RBF) in human and experimental sepsis, and to identify determinants of RBF. Method: Using specific search terms we systematically interrogated two electronic reference libraries to identify experimental and human studies of sepsis and septic acute renal failure in which RBF was measured. In the retrieved studies, we assessed the influence of various factors on RBF during sepsis using statistical methods. Results: We found no human studies in which RBF was measured with suitably accurate direct methods. Where it was measured in humans with sepsis, however, RBF was increased compared with normal. Of the 159 animal studies identified, 99 reported decreased RBF and 60 reported unchanged or increased RBF. The size of animal, technique of measurement, duration of measurement, method of induction of sepsis, and fluid administration had no effect on RBF. In contrast, on univariate analysis, state of consciousness of animals (P = 0.005), recovery after surgery (P < 0.001), haemodynamic pattern (hypodynamic or hyperdynamic state; P < 0.001) and cardiac output (P < 0.001) influenced RBF. However, multivariate analysis showed that only cardiac output remained an independent determinant of RBF (P < 0.001). Conclusion: The impact of sepsis on RBF in humans is unknown. In experimental sepsis, RBF was reported to be decreased in two-thirds of studies (62%) and unchanged or increased in one-third (38%). On univariate analysis, several factors not directly related to sepsis appear to influence RBF. However, multivariate analysis suggests that cardiac output has a dominant effect on RBF during sepsis, such that, in the presence of a decreased cardiac output, RBF is typically decreased, whereas in the presence of a preserved or increased cardiac output RBF is typically maintained or increased.
Nephrology Dialysis Transplantation | 2009
Sean M. Bagshaw; Shigehiko Uchino; Dinna N. Cruz; Rinaldo Bellomo; Hiroshi Morimatsu; Stanislao Morgera; Miet Schetz; Ian Tan; Catherine S. C. Bouman; Etienne Macedo; Noel Gibney; Ashita Tolwani; Heleen M. Oudemans-van Straaten; Claudio Ronco; John A. Kellum
BACKGROUND The RIFLE classification scheme for acute kidney injury (AKI) is based on relative changes in serum creatinine (SCr) and on urine output. The SCr criteria, therefore, require a pre-morbid baseline value. When unknown, current recommendations are to estimate a baseline SCr by the MDRD equation. However, the MDRD approach assumes a glomerular filtration rate of approximately 75 mL/min/1.73 m². This method has not been validated. METHODS Data from the Beginning and Ending Supportive Therapy for the Kidney (BEST Kidney) study, a prospective observational study from 54 ICUs in 23 countries of critically ill patients with severe AKI, were analysed. The RIFLE class was determined by using observed (o) pre-morbid and estimated (e) baseline SCr values. Agreement was evaluated by correlation coefficients and Bland-Altman plots. Sensitivity analysis by chronic kidney disease (CKD) status was performed. RESULTS Seventy-six percent of patients (n = 1327) had a pre-morbid baseline SCr, and 1314 had complete data for evaluation. Forty-six percent had CKD. The median (IQR) values were 97 micromol/L (79-150) for oSCr and 88 micromol/L (71-97) for eSCr. The oSCr and eSCr determined at ICU admission and at study enrolment showed only a modest correlation (r = 0.49, r = 0.39). At ICU admission and study enrolment, eSCr misclassified 18.8% and 11.7% of patients as having AKI compared with oSCr. Exclusion of CKD patients improved the correlation between oSCr and eSCr at ICU admission and study enrolment (r = 0.90, r = 0.84), resulting in 6.6% and 4.0% being misclassified, respectively. CONCLUSIONS While limited, estimating baseline SCr by the MDRD equation when the pre-morbid SCr is unavailable appears to perform reasonably well for determining RIFLE categories, but only when pre-morbid GFR was near normal. In patients with suspected CKD, however, the use of MDRD to estimate baseline SCr overestimates the incidence of AKI and likely should not be used. Improved methods to estimate baseline SCr are needed.
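The baseline-estimation approach evaluated above inverts the four-variable MDRD equation under an assumed GFR of 75 mL/min/1.73 m². The sketch below shows that inversion. The constant 186 corresponds to the original (non-IDMS-traceable) form of the equation; whether the study used 186 or the re-expressed 175 is not stated in the abstract, so treat that constant as an assumption.

```python
def estimate_baseline_scr_umol_l(age_years, female, black, assumed_gfr=75.0):
    """
    Back-calculate a baseline serum creatinine (micromol/L) from the
    four-variable MDRD equation, assuming a pre-morbid GFR (default
    75 mL/min/1.73 m^2, as recommended when the true baseline is unknown).

    MDRD: GFR = 186 * SCr_mg_dl**-1.154 * age**-0.203
                * (0.742 if female) * (1.210 if black)
    Solving for SCr: SCr_mg_dl = (186 * k / GFR)**(1 / 1.154), where k
    collects the age, sex, and race terms.
    """
    k = age_years ** -0.203
    if female:
        k *= 0.742
    if black:
        k *= 1.210
    scr_mg_dl = (186.0 * k / assumed_gfr) ** (1.0 / 1.154)
    return scr_mg_dl * 88.4  # mg/dL -> micromol/L

# Example: 65-year-old non-black man with no known baseline creatinine.
print(round(estimate_baseline_scr_umol_l(65, female=False, black=False)))  # ~93 micromol/L
```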
Critical Care Medicine | 2006
Stanislao Morgera; Michael Haase; Thomas Kuss; Ortrud Vargas-Hein; Heidrun Zuckermann-Becker; Christoph Melzer; Hanno Krieg; Brigitte Wegner; Rinaldo Bellomo; Hans-H. Neumayer
Objective: High cutoff hemofilters are characterized by an increased effective pore size designed to facilitate the elimination of inflammatory mediators in sepsis. Clinical data on this new renal replacement modality are lacking. Design: Prospective, randomized clinical trial. Setting: University hospital, intensive care units. Patients: Thirty patients with sepsis-induced acute renal failure. Intervention: Patients were allocated to high cutoff (n = 20) or conventional (n = 10) hemofiltration in a 2:1 ratio. Median renal replacement dose was 31 mL/kg/hr. For high cutoff hemofiltration, a high-flux hemofilter with an in vivo cutoff point of approximately 60 kilodaltons was used. Conventional hemofiltration was performed with a standard high-flux hemofilter (PF11S). The impacts of high cutoff hemofiltration on the need for norepinephrine and on plasma levels and clearance rates for interleukin (IL)-6 and IL-1 receptor antagonist (IL-1ra) were analyzed. Absolute values, but also adjusted values (expressed as proportion of baseline), were analyzed. The observation period was restricted to 48 hrs. Main Results: Apart from higher antithrombin III levels at entry into the study, main clinical and laboratory parameters were comparable between both groups. The median norepinephrine dose at entry into the study was 0.30 μg/kg/min in the high cutoff group and 0.21 μg/kg/min in the conventional hemofiltration group (p = .448). Only the high cutoff group showed a significant decline (p = .0002) in “adjusted” norepinephrine dose over time. Clearance rates for IL-6 and IL-1ra were significantly higher in the high cutoff hemofiltration group (p < .0001), which translated into a significant decline of the corresponding plasma levels (p = .0465 for IL-6; p = .0293 for IL-1ra). Conclusion: In this pilot study, high cutoff hemofiltration has been shown to exert a beneficial effect on the need for norepinephrine in septic patients with acute renal failure. In addition, we demonstrate that high cutoff hemofiltration is superior to conventional hemofiltration in the elimination of IL-6 and IL-1ra from the circulating blood of septic patients.
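Two simple calculations sit behind the analysis described above: the delivered renal replacement dose in mL/kg/hr and the "adjusted" values expressed as a proportion of baseline. The sketch below illustrates both with hypothetical numbers; only the 31 mL/kg/hr median dose is taken from the abstract, while the weight, effluent flow, and norepinephrine doses are assumptions.

```python
def replacement_dose_ml_kg_hr(effluent_flow_ml_hr, weight_kg):
    """Hemofiltration dose as effluent flow normalised to body weight."""
    return effluent_flow_ml_hr / weight_kg

def proportion_of_baseline(values):
    """Express a series of measurements as a fraction of its first value."""
    baseline = values[0]
    return [v / baseline for v in values]

# Illustrative only: ~2,300 mL/hr effluent in a ~75 kg patient gives roughly
# the 31 mL/kg/hr median dose reported in the study.
print(round(replacement_dose_ml_kg_hr(2300, 75)))   # ~31 mL/kg/hr

# Hypothetical norepinephrine doses (ug/kg/min) at 0, 24, and 48 hrs.
print(proportion_of_baseline([0.30, 0.22, 0.15]))   # approx. [1.0, 0.73, 0.5]
```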
Critical Care Medicine | 2007
Michael Haase; Anja Haase-Fielitz; Sean M. Bagshaw; Michael C. Reade; Stanislao Morgera; Siven Seevenayagam; George Matalanis; Brian F. Buxton; Laurie Doolan; Rinaldo Bellomo
Objective: To assess the effect of high-dose N-acetylcysteine on renal function in cardiac surgery patients at higher risk of postoperative renal failure. Design: Multiblind, placebo-controlled, randomized, phase II clinical trial. Setting: Operating rooms and intensive care units of two tertiary referral hospitals. Patients: A total of 60 cardiac surgery patients at higher risk of postoperative renal failure. Interventions: Patients were allocated to either 24 hrs of high-dose N-acetylcysteine infusion (300 mg/kg body weight in 5% glucose, 1.7 L) or placebo (5% glucose, 1.7 L). Measurements and Main Results: The primary outcome measure was the absolute change in serum creatinine from baseline to peak value within the first five postoperative days. Secondary outcomes included the relative change in serum creatinine, peak serum creatinine level, serum cystatin C, and urinary output. Further outcomes were the need for renal replacement therapy, length of ventilation, and length of stay in the intensive care unit and hospital. Randomization was successful and patients were well balanced for preoperative and intraoperative characteristics. There was no significant attenuation in the increase in serum creatinine from baseline to peak when comparing N-acetylcysteine with placebo (64.5 ± 91.2 and 38.0 ± 42.4 μmol/L, respectively; p = .15). Also, there was no attenuation in the increase in serum cystatin C from baseline to peak for N-acetylcysteine compared with placebo (0.45 ± 0.43 and 0.30 ± 0.33 mg/L, respectively; p = .40). Likewise, there was no evidence for differences in any other clinical outcome. Conclusions: In this phase II, randomized, controlled trial, high-dose N-acetylcysteine was no more effective than placebo in attenuating cardiopulmonary bypass–related acute renal failure in high-risk cardiac surgery patients.
Critical Care Medicine | 2009
Shigehiko Uchino; Rinaldo Bellomo; Hiroshi Morimatsu; Stanislao Morgera; Miet Schetz; Ian Tan; Catherine S. C. Bouman; Etienne Macedo; Noel Gibney; Ashita Tolwani; Heleen M. Oudemans-van Straaten; Claudio Ronco; John A. Kellum
Objectives: To describe current practice for the discontinuation of continuous renal replacement therapy in a multinational setting and to identify variables associated with successful discontinuation. The approach to discontinuing continuous renal replacement therapy may affect patient outcomes. However, there is a lack of information on how and under what conditions continuous renal replacement therapy is discontinued. Design: Post hoc analysis of a prospective observational study. Setting: Fifty-four intensive care units in 23 countries. Patients: Five hundred twenty-nine patients (52.6%) who survived initial therapy among 1006 patients treated with continuous renal replacement therapy. Interventions: None. Measurements and Main Results: Three hundred thirteen patients were removed successfully from continuous renal replacement therapy and did not require any renal replacement therapy for at least 7 days; they were classified as the “success” group, and the remaining 216 patients were classified as the “repeat-RRT” (renal replacement therapy) group. Patients in the “success” group had lower hospital mortality (28.5% vs. 42.7%, p < .0001) compared with patients in the “repeat-RRT” group. They also had lower creatinine and urea concentrations and a higher urine output at the time of stopping continuous renal replacement therapy. Multivariate logistic regression analysis for successful discontinuation of continuous renal replacement therapy identified urine output (during the 24 hrs before stopping continuous renal replacement therapy: odds ratio, 1.078 per 100 mL/day increase) and creatinine (odds ratio, 0.996 per μmol/L increase) as significant predictors of successful cessation. The area under the receiver operating characteristic curve to predict successful discontinuation of continuous renal replacement therapy was 0.808 for urine output and 0.635 for creatinine. The predictive ability of urine output was negatively affected by the use of diuretics (area under the receiver operating characteristic curve, 0.671 with diuretics and 0.845 without diuretics). Conclusions: We report on the current practice of discontinuing continuous renal replacement therapy in a multinational setting. Urine output at the time of initial cessation of continuous renal replacement therapy was the most important predictor of successful discontinuation, especially if occurring without the administration of diuretics.
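The receiver operating characteristic analysis described above can be reproduced in outline with standard tools. The sketch below computes an AUC for urine output as a predictor of successful discontinuation on synthetic data; the cohort, distributions, and values are invented for illustration and are not the BEST Kidney dataset.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

# Synthetic cohort: 1 = successful discontinuation, 0 = repeat RRT.
success = rng.integers(0, 2, 500)

# Synthetic 24-hr urine output (mL/day), made higher on average in the
# success group to mimic the direction of the reported association.
urine_output = np.where(success == 1,
                        rng.normal(1200, 500, 500),
                        rng.normal(500, 300, 500)).clip(min=0)

# Area under the ROC curve for urine output as a predictor of success;
# for comparison, the study reported 0.808 overall and 0.845 in patients
# not receiving diuretics.
print(round(roc_auc_score(success, urine_output), 3))
```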
Transplantation | 2004
Markus Giessing; Stefan Reuter; Bernd Schönberger; Serdar Deger; Ingolf Tuerk; Ingrid Hirte; Klemens Budde; Lutz Fritsche; Stanislao Morgera; H.-H. Neumayer; Stefan A. Loening
Background. Most studies evaluating the impact of kidney donation on donors’ quality of life (QOL) have limitations such as small cohort size, unmatched references, use of nonstandardized and nonvalidated questionnaires, or low response rates. Methods. We performed a study on donors’ QOL that was designed to avoid these limitations. All available living renal donors in our department in the last 18 years were included in the study. QOL was assessed with two validated, standardized questionnaires (Short Form-36, Giessen Subjective Complaints List [Giessener Beschwerdebogen]-24) and compared with gender- and age-matched references. In addition, specific questions relating to kidney donation were asked. Results. The response rate (89.8%) is one of the highest reported for studies on QOL of living kidney donors. Most donors had an equal or better QOL than the healthy population. Donors’ willingness to donate again (93.4%) or recommend living-donor kidney transplantation (92.4%) was high, irrespective of complications. A small number of donors experienced financial drawbacks or occupational disadvantages. Donors aged 31 to 40 years were found to be at risk of QOL deterioration after organ donation. Donor and recipient complications had a significant impact on donors’ QOL. One third of the donors found that the psychologic care preceding and after kidney donation was insufficient. Conclusions. Our findings support the practice of living-donor kidney transplantation as a good means to meet the persisting organ shortage. Further effort must be put into minimizing donor and recipient complications. The specific demands of younger donors should be further elucidated. In addition to medical follow-up, living kidney donors should also be offered lifelong psychologic counseling.