
Publication


Featured research published by Giancarlo M. Liumbruno.


Blood Transfusion | 2010

Red blood cell storage: the story so far

Angelo D'Alessandro; Giancarlo M. Liumbruno; Giuliano Grazzini; Lello Zolla

Red blood cells are still the most widely transfused blood component worldwide and their story is intimately entwined with the history of transfusion medicine and with the changes in the collection and storage of blood1,2. At present, the most widely used protocol for the storage of red blood cells (for up to 42 days) is the collection of blood into anticoagulant solutions (typically citrate-phosphate-dextrose); red cell concentrates are prepared by the removal of plasma and, in some cases, also by leukoreduction. The product is stored at 4 ± 2 °C in a slightly hypertonic additive solution, generally SAGM (saline, adenine, glucose, mannitol; 376 mOsm/L)1. Despite this, a definitive protocol that reconciles long-term storage on the one hand with the safety and efficacy of transfusion therapy on the other is still the subject of intense debate and discussion. In fact, although the organisation of the blood system, through the achievement of self-sufficiency, currently enables ordinary requests of the transfusion ‘market’ to be met, local reserves can sometimes fall to a minimum in the case of a calamity, disaster or emerging infection3, or in particular periods of the year. There is still an underlying concern about the real need to store blood components for as long as possible, in order to gradually lengthen the interval between donation and transfusion, and about how far this elastic time span can be prolonged without definitively compromising the quality of the product and, in the final analysis, the recipients’ health2. Indeed, although the transfusion establishment initially pursued both objectives (product quality and prolongation of the storage period), recent retrospective studies (whose results are, therefore, weakened by all the statistical limitations of this type of analysis)5–8 have indicated that the two aims are apparently irreconcilable.
These studies seem to suggest that the quality (in terms of safety and efficiency) of red blood cells decreases in proportion to the length of the storage period. Furthermore, there is extremely convincing molecular evidence9,10 which, together with the results of clinical studies11–33, appears to confirm the preliminary conclusions regarding the likely poorer quality of red blood cells stored for a long time. However, the statistical validity and methodological rigour, in terms of evidence-based medicine, of the clinical studies have recently been challenged, highlighting the need for prospective, double-blind, randomized studies similar to the one carried out by Walsh et al.34 in 2004, which led the authors to conclude that “the data did not support the hypothesis that transfusing red blood cells stored for a long time has detrimental effects on tissue oxygenation in critically ill, anaemic, euvolaemic patients without active bleeding”. The international scientific community now seems much more convinced of the need for prospective studies, and such studies, on large cohorts of subjects, are currently underway35,36. The key point of the problem is probably the lack of universally accepted standard criteria that closely reflect the dramatic molecular changes that occur during prolonged storage of red blood cells and that, simply put, would enable ‘good’ blood to be distinguished from ‘no longer sufficiently good’ blood. The current standard requirements for licensing new additive solutions in the USA, which are also suggested in the recommendations of the Council of Europe37, are essentially based on two parameters: the level of haemolysis (below the threshold of 0.8% at the end of the storage period, following the introduction of the “95/95” rule38) and a survival rate of the transfused cells of more than 75% at 24 hours after transfusion.
This latter parameter can be assessed by measuring the half-life of red blood cells labelled with chromium-51 (51Cr) prior to transfusion. These parameters are, however, fairly general and easily affected by the considerable biological variability between donors, since it is known that blood from some donors resists storage better than that from others39. Haemolysis is an easier parameter to monitor: typically, between 0.2 and 0.4% of red blood cells stored in standard additive solutions are haemolysed after 5–6 weeks of storage, while pre-storage leukoreduction halves the incidence of this phenomenon40. These widely accepted and well-established parameters do not, however, reflect the profound molecular changes that affect red blood cells during their storage. A brief list of the elements of the so-called “red blood cell storage lesion” includes10: morphological changes; slowed metabolism, with a decrease in the concentration of adenosine triphosphate (ATP); acidosis, with a decrease in the concentration of 2,3-diphosphoglycerate (2,3-DPG); loss of function (usually transient) of cation pumps, with consequent loss of intracellular potassium and accumulation of sodium within the cytoplasm; oxidative damage, with changes to the structure of band 3 (ref. 41) and lipid peroxidation; and apoptotic changes, with racemisation of membrane phospholipids and loss of parts of the membrane through vesiculation9. Some of these changes occur within the first few hours of storage, for example, the decrease in pH or the increases in potassium and lactate; others, however, take days or weeks10.
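The two release parameters discussed above (end-of-storage haemolysis below 0.8% and 24-hour post-transfusion red-cell survival above 75%) can be expressed as a simple pass/fail check. The sketch below is illustrative only, not a validated quality-control tool, and the function name and thresholds-as-defaults are choices made for this example:

```python
def meets_storage_criteria(haemolysis_pct: float, recovery_24h_pct: float) -> bool:
    """Check a red-cell unit against the two parameters discussed above:
    end-of-storage haemolysis below 0.8% and 24-hour post-transfusion
    survival above 75%. Illustrative sketch only, not a regulatory test."""
    return haemolysis_pct < 0.8 and recovery_24h_pct > 75.0


# A typical unit after 5-6 weeks in a standard additive solution
# (0.2-0.4% haemolysis, per the text) passes; one at 0.9% does not.
print(meets_storage_criteria(0.3, 80.0))  # True
print(meets_storage_criteria(0.9, 80.0))  # False
```

Note that, per the 95/95 rule mentioned in the text, the real requirement is statistical (95% of units with 95% confidence), not a per-unit cut-off as in this simplification.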
Together, these events risk compromising the safety and efficacy of long-stored red blood cells by reducing their capacity to carry and release oxygen, promoting the release of potentially toxic intermediates (for example, free haemoglobin can act as a source of reactive oxygen species) and negatively influencing physiological rheology (through the increased capacity of the red blood cells to adhere to the endothelium42,43 or through their enhanced thrombogenic44 or pro-inflammatory45 potential). These observations at a molecular level were supported by the results of a series of clinical studies (albeit retrospective and not randomised), which appeared to show a relationship between the duration of storage and a proportional increase in adverse events in the transfused patients. The available data are, however, preliminary, and statistically more reliable studies that conform more closely to the gold-standard criteria of evidence-based medicine are considered necessary by many4 and are, indeed, underway5.


Anaesthesia | 2017

International consensus statement on the peri-operative management of anaemia and iron deficiency

Manuel Muñoz; A. G. Acheson; M. Auerbach; M Besser; O Habler; Henrik Kehlet; Giancarlo M. Liumbruno; Sigismond Lasocki; Patrick Meybohm; R. Rao Baikady; Toby Richards; Aryeh Shander; C So-Osman; Donat R. Spahn; Andrea Klein

Despite current recommendations on the management of pre‐operative anaemia, there is no pragmatic guidance for the diagnosis and management of anaemia and iron deficiency in surgical patients. A number of experienced researchers and clinicians took part in an expert workshop and developed the following consensus statement. After presentation of our own research data and local policies and procedures, appropriate relevant literature was reviewed and discussed. We developed a series of best‐practice and evidence‐based statements to advise on patient care with respect to anaemia and iron deficiency in the peri‐operative period. These statements include: a diagnostic approach for anaemia and iron deficiency in surgical patients; identification of patients appropriate for treatment; and advice on practical management and follow‐up. We urge anaesthetists and peri‐operative physicians to embrace these recommendations, and hospital administrators to enable implementation of these concepts by allocating adequate resources.


Transfusion and Apheresis Science | 2008

World apheresis registry 2003-2007 data

Bernd Stegmayr; Jan Pták; Björn Wikström; G. Berlin; C. G. Axelsson; A. Griskevicius; Paolo Emilio Centoni; Giancarlo M. Liumbruno; Pietra Molfettini; J. Audzijoniene; K. Mokvist; B. Nilsson Sojka; Rut Norda; Folke Knutson; W. Ramlow; M. Blaha; Volker Witt; M. Evergren; J. Tomaz

OBJECTIVES: Seventy-five centers from many countries have applied for a login code to the WAA apheresis registry. Fifteen centers from 7 countries actively entered data at the internet site from 2003 to 2007. We report on the registry data collected so far.

METHODS: This is a web-based registry; a link is available from the WAA homepage (www.worldapheresis.org). So far, data from 2,013 patients (12,448 procedures) have been included. A median of 6 treatments per patient were performed (range 1-140). Mean age was 51 years (range 1-94 years; 45% women). Seven percent of the patients were ≤21 years and 4% were ≤16 years.

RESULTS: The purpose of the apheresis procedure was therapeutic in 67% and retrieval of blood components in 33%. The main indications were neurological and hematological diseases, lipid apheresis and stem cell collection (autologous, and some allogeneic). Blood access was through peripheral vessels (71%), a central dialysis catheter in the jugular (6.5%) or subclavian (6.7%) veins, the femoral vein (8%) or an AV fistula (4%). ACD was used for anticoagulation in 73% of the procedures. Albumin was mainly used as replacement fluid. Adverse events (AEs) were registered in 5.7% of the procedures and were graded as mild (2.5%), moderate (2.7%) or severe (0.5%). No deaths occurred due to treatment. The procedures were interrupted in 2.6% of cases. The most frequent AEs were blood access problems (29%), tingling around the mouth (20%), hypotension (18%) and urticaria (9%). There were significant differences between the centers regarding mild and moderate AEs. Data indicate that centers using continuous infusion of calcium had fewer AEs.

CONCLUSION: There was a limited number of severe AEs. Centers use various standard procedures for apheresis; by learning from the experience of others, treatment quality will improve further. In the near future, an update of the registry will enable more extensive evaluation of the data.


International Journal of Stroke | 2011

The practical management of intracerebral hemorrhage associated with oral anticoagulant therapy.

Luca Masotti; Daniel Agustin Godoy; Daniela Rafanelli; Giancarlo M. Liumbruno; Nicholas Koumpouros; Giancarlo Landini; Alessandro Pampana; Roberto Cappelli; Daniela Poli; Domenico Prisco

Oral anticoagulant-associated intracerebral hemorrhage is increasing in incidence and is the most feared complication of therapy with vitamin K1 antagonists. Anticoagulant-associated intracerebral hemorrhage carries a high risk of ongoing bleeding, death, or disability. The most important aspect of its clinical management is the urgent reversal of coagulopathy, reducing the international normalized ratio as quickly as possible to values ≤1·4, preferably ≤1·2, together with life support and surgical therapy, when indicated. Protocols for anticoagulant-associated intracerebral hemorrhage emphasize the immediate discontinuation of anticoagulant medication, the immediate intravenous administration of vitamin K1 (mean dose: 10–20 mg), and the use of prothrombin complex concentrates (variable doses, calculated on the basis of the estimated circulating functional prothrombin complex), fresh-frozen plasma (15–30 ml/kg) or recombinant activated factor VII (15–120 μg/kg). Because of issues of cost and availability, there is limited randomized evidence comparing the different reversal strategies and supporting a specific treatment regimen. In this paper, we emphasize the growing importance of anticoagulant-associated intracerebral hemorrhage and describe options for acute coagulopathy reversal in this setting. Additionally, emphasis is placed on understanding current consensus-based guidelines for coagulopathy reversal and the challenges of determining the best evidence for these treatments. On the basis of the available knowledge, failure to adhere appropriately to current consensus-based guidelines for coagulopathy reversal may expose the physician to medico-legal implications.
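The fresh-frozen plasma range quoted above (15–30 ml/kg) is simple weight-based arithmetic; the sketch below only illustrates that calculation and is in no way clinical guidance. The function name and its use as a standalone helper are assumptions for the example:

```python
def ffp_dose_range_ml(weight_kg: float) -> tuple[float, float]:
    """Return the (low, high) fresh-frozen plasma volume in ml implied by
    the 15-30 ml/kg range quoted in the text. Arithmetic illustration
    only; not clinical guidance."""
    return weight_kg * 15.0, weight_kg * 30.0


# For a 70 kg patient the quoted range works out to 1,050-2,100 ml:
print(ffp_dose_range_ml(70))  # (1050.0, 2100.0)
```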


Clinical Chemistry and Laboratory Medicine | 2013

ABO blood group: old dogma, new perspectives.

Massimo Franchini; Giancarlo M. Liumbruno

Human blood group antigens are glycoproteins and glycolipids expressed on the surface of red blood cells and of a variety of human tissues, including the epithelium, sensory neurons, platelets and the vascular endothelium. Accumulating evidence indicates that ABO blood type is implicated in the development of a number of human diseases, including cardiovascular and neoplastic disorders. In this review, besides its physiological role in immunohematology and transfusion medicine, we summarize the current knowledge on the association between the ABO blood group and the risk of developing thrombotic events and cancers.


Journal of Proteomics | 2008

Transfusion medicine in the era of proteomics

Giancarlo M. Liumbruno; Gian Maria D'Amici; Giuliano Grazzini; Lello Zolla

Blood components (BCs) are highly complex mixtures of plasma proteins and cells. At present, quality control of BCs and blood derivatives (BDs) is mainly focused on standardized quantitative assessment, providing relatively limited information about the products. Unfortunately, during the production, inactivation and storage processes there is a risk of changes in their integrity, especially at the protein level, which could negatively affect transfusion outcomes. It is therefore a major challenge to identify significant alterations of these products, and, in this context, proteomics can play a potentially relevant role in transfusion medicine (TM) by assessing the protein composition of blood-derived therapeutics and, in particular, by identifying modified proteins. It can provide comprehensive information about the changes occurring during the processing and storage of BCs and BDs, and can be applied to assess or improve them, potentially enabling a global assessment of processing, inactivation and storage methods, as well as of possible contaminants and neoantigens that may influence the immunogenic capacity of blood-derived therapeutics. Thus, proteomics could become a relevant part of the quality-control process used to verify the identity, purity, safety and potency of various blood therapeutics. A more detailed understanding of the proteins found in blood and blood products, and the identification of their interactions, may also yield important information for the design of new small-molecule therapeutics and for future improvements in TM. Proteomics, together with genomics, will presumably have an impact in the near future on disease diagnosis and prognosis, as well as on further advances in the production, pathogen-inactivation and storage processes of blood-based therapeutics.


Vox Sanguinis | 2010

The role of antenatal immunoprophylaxis in the prevention of maternal-foetal anti-Rh(D) alloimmunisation.

Giancarlo M. Liumbruno; D'Alessandro A; Rea F; Piccinini; Liviana Catalano; Gabriele Calizzani; Simonetta Pupella; Giuliano Grazzini

The first description of haemolytic disease of the newborn (HDN) can be traced back to 1609 and was made by a French midwife, Louise Bourgeois, who, from 1600, worked at the royal court of King Henry IV and Queen Marie de Medicis1–4. In the treatise that Bourgeois wrote in 1609 she described the birth of twins3: the first had hydrops and died immediately, while the second, initially in a better condition, rapidly became jaundiced and, after having developed neurological symptoms (kernicterus), died 3 days after being born. Hydrops foetalis and kernicterus were correctly interpreted as two aspects of the same pathology only in 19325, when Diamond described foetal erythroblastosis secondary to severe haemolysis, although the cause was still unknown. A few years later, in 1938, Ruth Darrow correctly identified the (antibody-related) pathogenesis of HDN6, although erroneously attributing to foetal haemoglobin the role of the culprit antigen, which was suggested to have induced a maternal antibody response after crossing the placenta. The true pathogenesis of the disease was definitively clarified in 1940 with the discovery of the Rhesus (Rh) blood group system by Landsteiner and Wiener7 and with the subsequent identification, in 1941, by Levine8, of the Rh(D) antigen. This antigen was identified as the cause of immunisation in Rh(D)-negative mothers following transplacental passage of foetal D-positive red blood cells. The subsequent passage of maternal anti-D immunoglobulin G (IgG) across the placenta into the foetal circulation was recognised as the final event able to cause the spectrum of clinical events that characterise HDN.
It did not take long before the risk of immunisation could be quantified1,3: i) 16% in the case of a Rh(D)-negative mother and a Rh(D)-positive, ABO-compatible foetus; ii) 2% in the case of a Rh(D)-negative mother and a Rh(D)-positive, ABO-incompatible foetus (about 20% of the cases); iii) an overall risk of immunisation of 13.2%. Before 1945, about 50% of all foetuses with HDN died of kernicterus or hydrops foetalis. Subsequently, thanks to progress in treatment, mortality in industrialised countries decreased to 2–3%; this mortality rate was then reduced a further 100-fold with the introduction of anti-D immunoprophylaxis to prevent maternal-foetal anti-Rh(D) alloimmunisation9. At the beginning of the 1960s, Stern demonstrated experimentally that the administration of anti-D IgG could prevent sensitisation to the Rh(D) antigen10; in the same period, other studies clarified the mechanism of Rh iso-immunisation in pregnancy and introduced the clinical practice of passive immunisation with anti-D IgG to protect Rh(D)-negative women from sensitisation against Rh(D)-positive red blood cells11–14. The successes obtained in studies of Rh(D)-negative male volunteers formed the experimental basis for clinical trials in pregnant Rh(D)-negative women15; these trials demonstrated that post-partum immunoprophylaxis decreased the incidence of post-pregnancy anti-Rh(D) immunisation from 12–13% to 1–2%15,16. Subsequently, in 1977, it was shown that 1.8% of Rh(D)-negative women, despite post-natal prophylaxis, continued to develop anti-D antibodies because of small transplacental haemorrhages during pregnancy17,18. One year later, a Canadian study by Bowman et al. showed, in 1,357 Rh(D)-negative primigravidae, that the incidence of Rh(D) alloimmunisation could be reduced to 0.1% by prophylaxis with antenatal anti-D IgG, in addition to post-partum prophylaxis19.
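The overall 13.2% figure quoted above is consistent with weighting the two scenarios by the stated frequency of ABO incompatibility (about 20% of cases); this reconstruction of the arithmetic is an assumption, as the weighting is not spelled out in the source:

```latex
P(\text{immunisation}) \approx 0.80 \times 16\% + 0.20 \times 2\% = 12.8\% + 0.4\% = 13.2\%
```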
There is currently sufficient evidence demonstrating that antenatal anti-D prophylaxis also reduces the risk of Rh(D) immunisation in the next pregnancy to below the level of 0.4%. Forty years after Zipursky and Israels first proposed the use of anti-D IgG to reduce the incidence of Rh alloimmunisation in pregnancy14, immunoprophylaxis has drastically reduced the cases of Rh-induced HDN; nevertheless, this pathology continues to be relevant in 0.4 of 1,000 births (0.04%)20, for various reasons21: i) the possible occurrence of anti-D immunisation during the pregnancy (which occurs in about 1% of Rh(D)-negative women carrying a Rh(D)-positive foetus22); ii) the lack of efficacy of immunoprophylaxis because of the administration of an insufficient dose of anti-D IgG that is not congruent with the volume of the foetal-maternal haemorrhage; iii) immunoprophylaxis not administered; iv) possible errors in typing the pregnant or puerperal woman or the neonate; v) possible errors in transfusion therapy in women of child-bearing age.


Blood Transfusion | 2015

Human Parvovirus B19 and blood product safety: a tale of twenty years of improvements

Giuseppe Marano; Stefania Vaglio; Simonetta Pupella; Giuseppina Facco; Gabriele Calizzani; Fabio Candura; Giancarlo M. Liumbruno; Giuliano Grazzini

The establishment of systems to ensure a safe and sufficient supply of blood and blood products for all patients requiring transfusion is a core issue of every blood programme. A spectrum of infectious agents can be transmitted through transfusion of infected blood donated by apparently healthy and asymptomatic blood donors. Recent emerging infectious-disease threats include West Nile virus1,2, chikungunya3, babesia4, dengue5, hepatitis E virus6, and variant Creutzfeldt-Jakob disease7. Parvovirus B19 (B19V), long known to be the causative agent of erythema infectiosum (fifth disease), is not a newly emerging agent. However, it deserves discussion because it may be present in blood and in plasma products, can circulate at extraordinarily high titres, can infect recipients, and, in some cases, can cause severe disease8. Its potentially severe pathological effects have become more apparent in the past decade with the widespread use of (pooled) plasma-derived medicinal products and are the main reason for the uneasy relationship between transfusion medicine specialists and B19V9. The aim of this review is to analyse the role played by this virus in compromising safety in transfusion medicine and the progressive measures adopted to reduce the risks associated with it.


Blood Transfusion | 2015

Hepatitis E: an old infection with new implications.

Giuseppe Marano; Stefania Vaglio; Simonetta Pupella; Giuseppina Facco; Maria Bianchi; Gabriele Calizzani; Fabio Candura; Liviana Catalano; Blandina Farina; Monica Lanzoni; Vanessa Piccinini; Giancarlo M. Liumbruno; Giuliano Grazzini

The availability of safe blood and blood products is an important public health issue. Improvements in donor screening and testing, pathogen inactivation1 and removal methods, the use of serological tests with greater diagnostic efficacy and the introduction of nucleic acid testing (NAT) have resulted in a substantial drop in transfusion-transmitted infections over the last two decades2. Nonetheless, blood supplies remain vulnerable to emerging and re-emerging infections. In recent years, numerous infectious agents found worldwide have been identified or reconsidered as potential threats to blood supplies3–5. Hepatitis E virus (HEV) has long been considered an enterically transmitted virus causing self-limiting acute viral hepatitis. The disease is endemic in many developing countries, but in recent years an increasing number of autochthonous and sporadic HEV infections have been described in developed countries6. This virus usually causes an acute self-limiting hepatitis, but in some cases fulminant hepatic failure resulting in morbidity and mortality may occur, especially in at-risk groups such as the elderly, pregnant women and patients with pre-existing liver disease or those who are immunocompromised. Furthermore, recent seroprevalence studies are calling into question the assumption that HEV circulates at low levels in developed countries7. This narrative review aims to provide a comprehensive view of HEV and its possible “role” in transfusion medicine.


Blood Transfusion | 2016

Recommendations for the implementation of a Patient Blood Management programme. Application to elective major orthopaedic surgery in adults

Stefania Vaglio; Domenico Prisco; Gianni Biancofiore; Daniela Rafanelli; Paola Antonioli; Michele Lisanti; Lorenzo Andreani; Leonardo Basso; Claudio Velati; Giuliano Grazzini; Giancarlo M. Liumbruno

Patient Blood Management (PBM) is a holistic approach to the management of blood as a resource for each single patient; it is a multimodal strategy that is implemented through the use of a set of techniques that can be applied in individual cases. Indeed, the overall outcome resulting from the implementation of PBM cannot be fully appreciated and explained simply by summing the effects of the single strategies and techniques used, since these can only produce the expected optimal outcome if used in combination1. PBM is, therefore, a patient-centred, multiprofessional, multidisciplinary and multimodal approach to the optimal management of anaemia and haemostasis (also during surgery), to limiting allogeneic transfusion needs in the peri-operative period, and to appropriate use of blood components and, when relevant, plasma-derived medicinal products2. The concept of PBM is not centred on a specific pathology or procedure, nor on a specific discipline or sector of medicine, but is aimed at managing a resource, “the patient’s blood”, shifting attention from the blood component to the patient who, therefore, acquires a central and pre-eminent role3,4. PBM combines the dual purposes of improving the outcomes of patients and reducing costs, being based on the patient rather than on allogeneic blood as the resource. For this reason, PBM goes beyond the concept of appropriate use of blood components and plasma-derived medicinal products, since its purpose is to avoid or significantly reduce their use, managing, in good time, all the modifiable risk factors that can lead to a transfusion being required5. These aims can be achieved through the so-called “three pillars of PBM” (Table I)5, which are crucial for making the paradigmatic shift that characterises the innovative, patient-centred approach: (i) optimising the patient’s erythropoiesis; (ii) minimising bleeding; and (iii) optimising and exploiting an individual’s physiological reserve to tolerate anaemia5.
Each of these three key points is a strategic response to a clinical circumstance that can cause adverse outcomes and necessitate allogeneic transfusion therapy, namely anaemia, blood loss and hypoxia, respectively.

Table I. The three pillars of Patient Blood Management (modified from Hofmann A et al.5).

PBM is, therefore, intended to guarantee all patients a series of personalised programmes, based on surgical requirements and the characteristics of the patients themselves, with the dual purposes of using allogeneic transfusion support appropriately and reducing the need for this resource. For this reason, PBM requires multidisciplinary and multimodal strategies to systematically identify, evaluate and manage anaemia (boosting, if necessary, individual physiological reserves) and to avoid or minimise blood losses. Specific national standards also seem necessary. In the USA, for example, PBM is the object of attention from the Association for the Advancement of Blood & Biotherapies (formerly known as the American Association of Blood Banks - AABB), which recently published the first edition of “Standards for a Patient Blood Management Program” precisely with the aim of supplying healthcare structures with a solid basis for standardising the procedures and activities involved in implementing and/or optimising a PBM programme. The Society for the Advancement of Blood Management (SABM), also in the USA, has published a second edition of “Administrative and Clinical Standards for Patient Blood Management Programs”6, and the Joint Commission has published seven parameters for measuring the performance of healthcare structures in the field of PBM7.

Collaboration


Dive into Giancarlo M. Liumbruno's collaboration.

Top Co-Authors

Giuliano Grazzini, Istituto Superiore di Sanità

Stefania Vaglio, Sapienza University of Rome

Simonetta Pupella, Istituto Superiore di Sanità

Giuseppe Marano, Istituto Superiore di Sanità

Gabriele Calizzani, Istituto Superiore di Sanità

J. Tomaz, University of Coimbra

Carlo Mengoli, Istituto Superiore di Sanità

Giuseppina Facco, Istituto Superiore di Sanità