
Publication


Featured research published by Costanza Rovida.


Nature | 2009

Chemical regulators have overreached.

Thomas Hartung; Costanza Rovida

The costs — both in animal lives and euros — of the European REACH legislation on chemical testing are escalating. Thomas Hartung and Costanza Rovida argue for a suspension of certain toxicity tests.


ALTEX-Alternatives to Animal Experimentation | 2012

A roadmap for the development of alternative (non-animal) methods for systemic toxicity testing - t4 report

David A. Basketter; Harvey J. Clewell; Ian Kimber; Annamaria Rossi; Bas J. Blaauboer; Robert Burrier; Mardas Daneshian; Chantra Eskes; Alan M. Goldberg; Nina Hasiwa; Sebastian Hoffmann; Joanna Jaworska; Thomas B. Knudsen; Robert Landsiedel; Marcel Leist; Paul A. Locke; Gavin Maxwell; James M. McKim; Emily McVey; Gladys Ouédraogo; Grace Patlewicz; Olavi Pelkonen; Erwin Ludo Roggen; Costanza Rovida; Irmela Ruhdel; Michael Schwarz; Andreas Schepky; Greet Schoeters; Nigel Skinner; Kerstin Trentz

Systemic toxicity testing forms the cornerstone for the safety evaluation of substances. Pressures to move from traditional animal models to novel technologies arise from various concerns, including: the need to evaluate large numbers of previously untested chemicals and new products (such as nanoparticles or cell therapies), the limited predictivity of traditional tests for human health effects, duration and costs of current approaches, and animal welfare considerations. The latter holds especially true in the context of the scheduled 2013 marketing ban on cosmetic ingredients tested for systemic toxicity. Based on a major analysis of the status of alternative methods (Adler et al., 2011) and its independent review (Hartung et al., 2011), the present report proposes a roadmap for how to overcome the acknowledged scientific gaps for the full replacement of systemic toxicity testing using animals. Five whitepapers were commissioned addressing toxicokinetics, skin sensitization, repeated-dose toxicity, carcinogenicity, and reproductive toxicity testing. An expert workshop of 35 participants from Europe and the US discussed and refined these whitepapers, which were subsequently compiled to form the present report. By prioritizing the many options to move the field forward, the expert group hopes to advance regulatory science.


ALTEX-Alternatives to Animal Experimentation | 2016

Analysis of publically available skin sensitization data from REACH registrations 2008-2014

Thomas Luechtefeld; Alexandra Maertens; Daniel P. Russo; Costanza Rovida; Hao Zhu; Thomas Hartung

The public data on skin sensitization from REACH registrations already included 19,111 studies on skin sensitization in December 2014, making it the largest repository of such data so far (1,470 substances with mouse LLNA, 2,787 with GPMT, 762 with both in vivo and in vitro and 139 with only in vitro data); 21% were classified as sensitizers. The extracted skin sensitization data were analyzed to identify relationships in skin sensitization guidelines, visualize structural relationships of sensitizers, and build models to predict sensitization. A chemical with molecular weight > 500 Da is generally considered non-sensitizing owing to low bioavailability, yet 49 sensitizing chemicals with a molecular weight > 500 Da were found. A chemical similarity map was produced using PubChem's 2D Tanimoto similarity metric and a Gephi force-layout visualization. Nine clusters of chemicals were identified by Blondel's module recognition algorithm, revealing wide module-dependent variation. Approximately 31% of mapped chemicals are Michael acceptors, but this alone does not imply skin sensitization. A simple sensitization model using molecular weight and five ToxTree structural alerts showed a balanced accuracy of 65.8% (specificity 80.4%, sensitivity 51.4%), demonstrating that structural alerts have information value. A simple variant of k-nearest neighbors outperformed the ToxTree approach even at a 75% similarity threshold (82% balanced accuracy at a 0.95 threshold); at higher thresholds the balanced accuracy increased, while lower similarity thresholds decreased sensitivity faster than specificity. This analysis scopes the landscape of chemical skin sensitization, demonstrating the value of large public datasets for health hazard prediction.
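The two metrics this abstract leans on, Tanimoto similarity over 2D fingerprints and balanced accuracy, can be sketched in a few lines. This is an illustrative sketch, not the paper's pipeline; the fingerprint bit sets and confusion-matrix counts below are hypothetical stand-ins for PubChem fingerprints and real model results.

```python
# Sketch (hypothetical data): Tanimoto similarity over binary fingerprint
# bit sets, and the balanced-accuracy metric reported in the abstract.

def tanimoto(fp_a: set, fp_b: set) -> float:
    """Tanimoto (Jaccard) similarity of two fingerprint bit sets."""
    if not fp_a and not fp_b:
        return 1.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)

def balanced_accuracy(tp: int, fn: int, tn: int, fp: int) -> float:
    """Mean of sensitivity (true-positive rate) and specificity (true-negative rate)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return (sensitivity + specificity) / 2

# Hypothetical fingerprints of two structurally similar chemicals.
a = {1, 4, 7, 9, 12}
b = {1, 4, 7, 9, 15}
print(tanimoto(a, b))  # 4 shared bits / 6 total bits

# A hypothetical confusion matrix: 51% sensitivity, 80% specificity.
print(balanced_accuracy(tp=51, fn=49, tn=80, fp=20))  # (0.51 + 0.80) / 2
```

Balanced accuracy is the natural choice here because the dataset is imbalanced (only 21% sensitizers), so plain accuracy would reward always predicting "non-sensitizer".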


ALTEX-Alternatives to Animal Experimentation | 2016

Analysis of Draize eye irritation testing and its prediction by mining publicly available 2008-2014 REACH data

Thomas Luechtefeld; Alexandra Maertens; Daniel P. Russo; Costanza Rovida; Hao Zhu; Thomas Hartung

Public data from ECHA online dossiers on 9,801 substances, encompassing 326,749 experimental key studies and additional information on classification and labeling, were made computable. Eye irritation hazard, for which the rabbit Draize eye test still represents the reference method, was analyzed. Dossiers contained 9,782 Draize eye studies on 3,420 unique substances, indicating frequent retesting of substances. This allowed assessment of the test's reproducibility based on all substances tested more than once. There was a 10% chance of a non-irritant evaluation after a prior severe-irritant result according to UN GHS classification criteria. The most reproducible outcomes were negative (94% reproducible) and severe eye irritant (73% reproducible). To evaluate whether other GHS categorizations predict eye irritation, we built a dataset of 5,629 substances (1,931 "irritant" and 3,698 "non-irritant"). The two best decision trees built from up to three other GHS classifications resulted in balanced accuracies of 68% and 73%, i.e., in the rank order of the Draize rabbit eye test itself, but both use inhalation toxicity data ("May cause respiratory irritation"), which is not typically available. Next, a dataset of 929 substances with at least one Draize study was mapped to PubChem to compute chemical similarity using 2D conformational fingerprints and Tanimoto similarity. Using a minimum similarity of 0.7 and simple classification by the closest chemical neighbor resulted in balanced accuracies ranging from 73% over 737 substances to 100% at a threshold of 0.975 over 41 substances. This represents strong support for read-across and (Q)SAR approaches in this area.
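The closest-chemical-neighbor classification described above amounts to threshold-gated 1-nearest-neighbor read-across: a query substance inherits the label of its most similar neighbor, but only when that similarity clears a minimum threshold. A minimal sketch, assuming hypothetical fingerprints and labels rather than the paper's data:

```python
# Sketch of threshold-gated nearest-neighbor read-across (hypothetical data).

def tanimoto(a: frozenset, b: frozenset) -> float:
    """Tanimoto similarity of two fingerprint bit sets."""
    union = a | b
    return len(a & b) / len(union) if union else 1.0

def read_across(query, reference, threshold=0.7):
    """Label of the most similar reference substance, or None below the threshold."""
    best_label, best_sim = None, 0.0
    for fingerprint, label in reference:
        sim = tanimoto(query, fingerprint)
        if sim > best_sim:
            best_label, best_sim = label, sim
    return best_label if best_sim >= threshold else None

# Hypothetical reference set of classified substances.
reference = [
    (frozenset({1, 2, 3, 4}), "irritant"),
    (frozenset({5, 6, 7, 8}), "non-irritant"),
]
print(read_across(frozenset({1, 2, 3, 4, 9}), reference))  # close to the first entry
print(read_across(frozenset({10, 11}), reference))         # no neighbor above 0.7
```

Raising the threshold trades coverage for accuracy, which matches the abstract's observation: 73% balanced accuracy over 737 substances at 0.7, but 100% over only 41 substances at 0.975.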


ALTEX-Alternatives to Animal Experimentation | 2016

Global analysis of publicly available safety data for 9,801 substances registered under REACH from 2008-2014

Thomas Luechtefeld; Alexandra Maertens; Daniel P. Russo; Costanza Rovida; Hao Zhu; Thomas Hartung

The European Chemicals Agency (ECHA) warehouses the largest public dataset of in vivo and in vitro toxicity tests. In December 2014 these data were converted into a structured, machine-readable and searchable database using natural language processing. It contains data for 9,801 unique substances, 3,609 unique study descriptions and 816,048 study documents, allowing toxicological data to be explored on a scale far larger than previously possible. Substance similarity analysis was used to determine clustering of substances for hazards by mapping to PubChem. Similarity was measured using PubChem 2D conformational substructure fingerprints, which were compared via the Tanimoto metric. Following k-core filtration, the Blondel et al. (2008) module recognition algorithm was used to identify chemical modules, showing clusters of substances in use within the chemical universe. The Globally Harmonized System of Classification and Labelling provides a valuable information source for hazard analysis. The most prevalent hazards are H317 "May cause an allergic skin reaction" (20% positive substances) and H318 "Causes serious eye damage" (17%). The prevalences obtained here for all hazards are key for the design of integrated testing strategies. The data also allowed estimation of animal use. The database covers about 20% of substances in the high-throughput biological assay database Tox21 (1,737 substances) and has a 917-substance overlap with the Comparative Toxicogenomics Database (~7% of CTD). The biological data available in these datasets, combined with ECHA in vivo endpoints, have enormous modeling potential. A case is made that REACH should systematically open regulatory data for research purposes.


ALTEX-Alternatives to Animal Experimentation | 2016

Analysis of public oral toxicity data from REACH registrations 2008-2014

Thomas Luechtefeld; Alexandra Maertens; Daniel P. Russo; Costanza Rovida; Hao Zhu; Thomas Hartung

The European Chemicals Agency (ECHA) made available a total of 13,832 oral toxicity studies for 8,568 substances up to December 2014. 75% of studies followed the retired OECD Test Guideline 401 (11% TG 420, 11% TG 423 and 1.5% TG 425). Concordance across guidelines, evaluated by comparing LD50 values ≥ 2,000 or < 2,000 mg/kg bodyweight from chemicals tested multiple times under different guidelines, was at least 75%, and more than 90% for repeats under the same guideline. In 2009, Bulgheroni et al. created a simple model for predicting acute oral toxicity using no observed adverse effect levels (NOAEL) from 28-day repeated dose toxicity studies in rats; this was reproduced here for 1,625 substances. In 2014, Taylor et al. suggested that the 90-day repeated dose oral toxicity test adds no value given a 28-day study with a low NOAEL, subject to some constraints. We confirm that the 28-day NOAEL is predictive (albeit imperfectly) of 90-day NOAELs; however, the suggested constraints did not affect predictivity. 1,059 substances with acute oral toxicity data (268 positives, 791 negatives, all Klimisch score 1) were used for modeling: the Chemical Development Kit was used to generate 27 molecular descriptors, and a similarity-informed multilayer perceptron achieved 71% sensitivity and 72% specificity. Additionally, the k-nearest neighbors (KNN) algorithm indicated that similarity-based approaches alone may be poor predictors of acute oral toxicity, but can be used to inform the multilayer perceptron model, where similarity was the feature with the highest information value.
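The concordance analysis described above, binarising LD50 at the 2,000 mg/kg cut-off and checking how often repeated tests of the same substance agree, can be sketched as follows. The substances and LD50 values are hypothetical, standing in for the REACH repeat-test data.

```python
# Sketch (hypothetical data): pairwise concordance of repeated acute oral
# toxicity studies, binarised at the 2,000 mg/kg LD50 cut-off.
from itertools import combinations

CUTOFF = 2000  # mg/kg bodyweight

def toxic_class(ld50: float) -> bool:
    """True if classified toxic (LD50 below the 2,000 mg/kg cut-off)."""
    return ld50 < CUTOFF

def concordance(results_by_substance: dict) -> float:
    """Fraction of pairwise repeats agreeing on the binary classification."""
    agree = total = 0
    for ld50s in results_by_substance.values():
        for a, b in combinations(ld50s, 2):
            total += 1
            agree += toxic_class(a) == toxic_class(b)
    return agree / total

# Hypothetical repeated studies (LD50 in mg/kg) for three substances.
data = {
    "substance A": [300, 450],          # both toxic: pair agrees
    "substance B": [2500, 1900],        # straddles the cut-off: pair disagrees
    "substance C": [5000, 3200, 4100],  # all non-toxic: 3 agreeing pairs
}
print(concordance(data))  # 4 of 5 pairs agree
```

Substance B illustrates why concordance is imperfect even within one guideline: two LD50 estimates can be numerically close yet fall on opposite sides of the regulatory cut-off.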


ALTEX-Alternatives to Animal Experimentation | 2014

Consensus Report on the Future of Animal-Free Systemic Toxicity Testing

Marcel Leist; Nina Hasiwa; Costanza Rovida; Mardas Daneshian; David A. Basketter; Ian Kimber; Harvey J. Clewell; Tilman Gocht; Alan M. Goldberg; Francois Busquet; Anna Rossi; Michael Schwarz; Martin L. Stephens; Rob Taalman; Thomas B. Knudsen; James M. McKim; Georgina Harris; David Pamies; Thomas Hartung

Since March 2013, animal use for cosmetics testing for the European market has been banned. This requires a renewed view of risk assessment in this field. However, in other fields as well, traditional animal experimentation does not always satisfy requirements in safety testing, as the need for human-relevant information is ever increasing. A general strategy for animal-free test approaches was outlined by the US National Research Council's vision document Toxicity Testing in the 21st Century in 2007. It is now possible to provide a more defined roadmap on how to implement this vision for the four principal areas of systemic toxicity evaluation: repeat-dose organ toxicity, carcinogenicity, reproductive toxicity and allergy induction (skin sensitization), as well as for the evaluation of toxicant metabolism (toxicokinetics) (Fig. 1). CAAT-Europe assembled experts from Europe, America and Asia to design a scientific roadmap for future risk assessment approaches; the outcome was then further discussed and refined in two consensus meetings with over 200 stakeholders. The key recommendations include: focusing on improving existing methods rather than favoring de novo design; combining hazard testing with toxicokinetics predictions; developing integrated test strategies; incorporating new high-content endpoints into classical assays; evolving test validation procedures; promoting collaboration and data-sharing across industrial sectors; integrating new disciplines, such as systems biology and high-throughput screening; and involving regulators early in the test development process. A focus on data quality, combined with increased attention to the scientific background of a test method, will be an important driver. Information from each test system should be mapped along adverse outcome pathways. Finally, quantitative information on all factors and key events will be fed into systems biology models that allow a probabilistic risk assessment with flexible adaptation to exposure scenarios and individual risk factors.


ALTEX-Alternatives to Animal Experimentation | 2015

Toxicity testing in the 21st century beyond environmental chemicals

Costanza Rovida; Shoji Asakura; Mardas Daneshian; Hana Hofman-Huether; Marcel Leist; Leo Meunier; David M. Reif; Anna Rossi; Markus Schmutz; Jean Pierre Valentin; Joanne Zurlo; Thomas Hartung

After the publication of the report Toxicity Testing in the 21st Century – A Vision and a Strategy, many initiatives started to foster a major paradigm shift in toxicity testing – from apical endpoints in animal-based tests to mechanistic endpoints through delineation of pathways of toxicity (PoT) in human cell-based systems. The US EPA has funded an important project to develop new high-throughput technologies based on human cell-based in vitro systems. These methods are currently being incorporated into the chemical risk assessment process. In the pharmaceutical industry, the efficacy and toxicity of new drugs are evaluated during preclinical investigations that include drug metabolism, pharmacokinetics, pharmacodynamics and safety toxicology studies. The results of these studies are analyzed and extrapolated to predict efficacy and potential adverse effects in humans. However, given the high failure rate of drugs during the clinical phases, a more predictive approach to assessing both the efficacy and the adverse effects of drugs is urgently needed. The food industry faces the challenge of assessing novel foods and food ingredients for the general population, for which animal safety testing is often of limited relevance for extrapolation. The question is whether the paradigm shift proposed by the Tox21c report for chemicals may also provide a useful tool to improve risk assessment for drugs and food ingredients.


ALTEX-Alternatives to Animal Experimentation | 2013

Advanced tests for skin and respiratory sensitization assessment.

Costanza Rovida; Stefan F. Martin; Manon Vivier; Hans Ulrich Weltzien; Erwin Ludo Roggen

Sens-it-iv was an FP6 Integrated Project that finished in March 2011 after 66 months of activity and €12 million in funding. Its ultimate goal was the development of a set of in vitro methods for assessing the skin and respiratory sensitization potential of chemicals and proteins, developed to the point of entering the pre-validation phase. At the end of the project it can be concluded that this goal was largely accomplished. Several advanced methods were evaluated extensively, and for some of them a detailed Standard Operating Procedure (SOP) was established. Other, less advanced methods also contributed to our understanding of the mechanisms driving sensitization. The present contribution, prepared with the support of CAAT-Europe, is a short summary of what was discussed during the three-day end congress of the Sens-it-iv project in Brussels. It presents a list of methods that are ready for skin sensitization hazard assessment. Potency evaluation and the possibility of distinguishing skin from respiratory sensitizers are also well advanced.


ALTEX-Alternatives to Animal Experimentation | 2013

A Roadmap for Hazard Monitoring and Risk Assessment of Marine Biotoxins on the Basis of Chemical and Biological Test Systems

Mardas Daneshian; Luis M. Botana; Marie Yasmine Dechraoui Bottein; Gemma Buckland; Mònica Campàs; Ngaire Dennison; Robert W. Dickey; Jorge Diogène; Valérie Fessard; Thomas Hartung; Andrew R. Humpage; Marcel Leist; Jordi Molgó; Michael A. Quilliam; Costanza Rovida; Benjamin A. Suarez-Isla; Aurelia Tubaro; Kristina Wagner; Otmar Zoller; Daniel R. Dietrich

Aquatic food accounts for over 40% of global animal food products, and potential contamination with toxins of algal origin (marine biotoxins) poses a health threat to consumers. The gold standards for assessing toxins in aquatic food have traditionally been in vivo methods, i.e., the mouse and rat bioassays. Besides ethical concerns, there is a need for more reliable test methods because of low inter-species comparability, high intra-species variability, the high number of false positive and negative results, and questionable extrapolation of quantitative risk to humans. For this reason, a transatlantic group of experts in the field of marine biotoxins was convened from academia and regulatory safety authorities to discuss future approaches to marine biotoxin testing. In this report they provide background on the toxin classes, their chemical characterization, epidemiology, risk assessment and management, and their assumed modes of action. Most importantly, physiological functional assays such as in vitro bioassays, as well as analytical techniques, e.g., liquid chromatography coupled to mass spectrometry (LC-MS), are reviewed as substitutes for the rodent bioassay. This forms the basis for recommendations on methodologies for hazard monitoring and risk assessment, establishment of causality in human intoxication cases, a roadmap for research and development of human-relevant functional assays, and new approaches for a consumer-directed safety concept.

Collaboration


An overview of Costanza Rovida's collaborations.

Top Co-Authors

Thomas Hartung (Johns Hopkins University)

Ian Kimber (University of Manchester)