Publications
Featured research published by Enrico Mombelli.
Environmental Science & Technology | 2014
Alexandre R.R. Péry; James Devillers; Céline Brochot; Enrico Mombelli; Olivier Palluel; Benjamin Piccini; François Brion; Rémy Beaudouin
Zebrafish (Danio rerio) is a widely used model for toxicological studies, in particular investigations of endocrine disruption. The development and regulatory use of in vivo and in vitro tests based on this species can be enhanced by toxicokinetic modeling. For this reason, we propose a physiologically based toxicokinetic (PBTK) model for zebrafish describing the uptake and disposition of organic chemicals. The model is based on literature data on zebrafish, other cyprinids and other fish families, on new experimental physiological information (volumes, lipid and water contents) obtained from zebrafish, and on chemical-specific parameters predicted by generic models. The relevance of the available models predicting the latter parameters was evaluated with respect to gill uptake and partition coefficients in zebrafish. This evaluation benefited from the fact that the influence of confounding factors such as body weight and temperature on ventilation rate was included in our model. The predictions yielded by our PBTK model for six chemicals (65 data points) were compared to available toxicokinetic data for zebrafish, and 88% of them were within a factor of 5 of the corresponding experimental values. Sensitivity analysis highlighted that the 1-octanol/water partition coefficient, the metabolism rate, and all the parameters that enable the prediction of assimilation efficiency and partitioning of chemicals need to be determined precisely to allow effective toxicokinetic modeling.
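As an illustration of the kind of toxicokinetic bookkeeping such a model performs, here is a minimal one-compartment sketch in Python. It is not the paper's multi-compartment zebrafish model: the uptake rate constant, the BCF-logKow relationship and the exposure concentration are invented placeholders.

```python
# Minimal one-compartment toxicokinetic sketch (illustrative only, not
# the paper's full PBTK model). Parameter values and the BCF ~ logKow
# relationship below are assumptions for demonstration.
import numpy as np
from scipy.integrate import solve_ivp

log_kow = 4.0                        # hypothetical 1-octanol/water partition coefficient
bcf = 10 ** (0.85 * log_kow - 0.70)  # illustrative log-linear bioconcentration model
k_uptake = 200.0                     # L/kg/day, assumed gill uptake rate constant
k_elim = k_uptake / bcf              # elimination consistent with BCF at steady state
c_water = 0.01                       # mg/L, assumed constant exposure concentration

def toxicokinetics(t, c_fish):
    # Uptake from water minus first-order elimination.
    return k_uptake * c_water - k_elim * c_fish

sol = solve_ivp(toxicokinetics, (0.0, 30.0), [0.0], t_eval=np.linspace(0, 30, 7))
for t, c in zip(sol.t, sol.y[0]):
    print(f"day {t:4.1f}: {c:8.3f} mg/kg")
```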
Science of The Total Environment | 2011
Cleo Tebby; Enrico Mombelli; Pascal Pandard; Alexandre R.R. Péry
The European regulation on chemicals (REACh) places emphasis on the reduction of systematic toxicity testing, thus fostering the development of alternative methods. Consequently, we analysed acute toxicity data gathered by the Japanese Ministry of Environment for three species belonging to three different trophic levels (i.e., Pseudokirchneriella subcapitata 72-hour EC50, Daphnia magna 48-hour EC50 and Oryzias latipes 96-hour LC50). This paper investigates the relationships between chemical structure and both the toxicity of the chemicals and the cross-species differences in sensitivity. The physicochemical properties of the chemicals were represented by the categories they belonged to in several widely used categorisation schemes implemented by the freely available OECD (Q)SAR Toolbox, and by quantitative molecular descriptors computed with the DRAGON software. The outputs of these software products were analysed and compared in terms of quality of prediction and biological interpretation. Amongst the categorisations implemented by the OECD Toolbox, those focussing on bioaccumulation or biotransformation appeared to be the most informative for environmental prediction on the whole set of chemicals, in particular because the predicted biotransformation half-life is strongly dependent on hydrophobicity. In predicting toxicity towards each species, simple linear regression on logP performed better than PLS regression of toxicity on a very large set of molecular descriptors. However, predictions based on the interspecies correlations performed better than the QSAR predictions. The results of the cross-species comparisons encourage the use of test strategies that focus on reducing the number of tests on fish.
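The two prediction routes compared in the paper can be sketched in a few lines. The toy data below are invented, not the Japanese Ministry of Environment dataset; the sketch only shows how a logP-based regression and an interspecies correlation would be fitted and compared.

```python
# Sketch of the two prediction routes: a simple regression of fish
# toxicity on logP versus an interspecies correlation with Daphnia.
# All numbers are made-up toy data.
import numpy as np

log_p       = np.array([1.2, 2.0, 2.8, 3.5, 4.1, 4.9])    # hypothetical logP
log_ec50_dm = np.array([1.8, 1.2, 0.7, 0.1, -0.4, -1.1])  # Daphnia 48-h EC50 (log10)
log_lc50_ol = np.array([2.0, 1.3, 0.9, 0.2, -0.3, -1.0])  # fish 96-h LC50 (log10)

# Route 1: QSAR-style regression of fish toxicity on hydrophobicity.
slope_p, intercept_p = np.polyfit(log_p, log_lc50_ol, 1)

# Route 2: interspecies correlation, fish toxicity from Daphnia toxicity.
slope_i, intercept_i = np.polyfit(log_ec50_dm, log_lc50_ol, 1)

for name, s, b, x in [("logP QSAR", slope_p, intercept_p, log_p),
                      ("interspecies", slope_i, intercept_i, log_ec50_dm)]:
    resid = log_lc50_ol - (s * x + b)
    print(f"{name:13s} RMSE = {np.sqrt(np.mean(resid**2)):.3f}")
```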
Science of The Total Environment | 2013
Alexandre R.R. Péry; Gerrit Schüürmann; Philippe Ciffroy; Michael Faust; Thomas Backhaus; Lothar Aicher; Enrico Mombelli; Cleo Tebby; Mark T. D. Cronin; Sylvie Tissot; Sandrine Andres; Jean-Marc Brignon; Lynn J. Frewer; S. Georgiou; Konstadinos Mattas; Jean-Christophe Vergnaud; Willie J.G.M. Peijnenburg; Ettore Capri; Alexandru Vasile Marchis; Martin F. Wilks
For more than a decade, the integration of human and environmental risk assessment (RA) has been an attractive vision. At the same time, existing European regulations of chemical substances such as REACH (EC Regulation No. 1907/2006), the Plant Protection Products Regulation (EC Regulation 1107/2009) and the Biocide Regulation (EC Regulation 528/2012) continue to require sector-specific RAs, each of which has its own information requirements regarding exposure and hazard data and uses different methodologies for the ultimate risk quantification. In response to this gap between the vision of integration and current scientific and regulatory practice, the present paper outlines five medium-term opportunities for integrating human and environmental RA, followed by detailed discussions of the associated major components and their state of the art. Current hazard assessment approaches are analyzed in terms of data availability and quality, covering non-test tools, the integrated testing strategy (ITS) approach, the adverse outcome pathway (AOP) concept, methods for assessing uncertainty, and the issue of explicitly treating mixture toxicity. With respect to exposure, opportunities for integrating exposure assessment are discussed, taking into account the uncertainty, standardization and validation of exposure modeling as well as the availability of exposure data. A further focus is on ways to complement RA with a socio-economic assessment (SEA) in order to better inform risk management options. In this way, the present analysis, developed as part of the EU FP7 project HEROIC, may contribute to paving the way for integrating, where useful and possible, human and environmental RA in a manner suitable for its coupling with SEA.
Science of The Total Environment | 2011
Adina Henegar; Enrico Mombelli; Pascal Pandard; Alexandre R.R. Péry
Since REACh applies throughout the EU, special emphasis has been put on the reduction of systematic ecotoxicity testing. In this context, it is important to extract as much information as possible from existing ecotoxicity databases in order to propose alternative methods aimed at replacing and reducing experimental testing. Consequently, we analyzed a database of new chemicals registered in France and Europe over the last twenty years that reports aquatic ecotoxicity data for three trophic levels (i.e., algae 72-h EC50, Daphnia 48-h EC50 and fish 96-h LC50). To ensure the relevance of the comparison between these three experimental tests, we performed a stringent data selection based on the pertinence and quality of the available ecotoxicological information. At the end of this selection, fewer than 5% of the initial number of chemicals were retained for subsequent analysis. This analysis showed that fish was the least sensitive trophic level, whereas Daphnia had the highest sensitivity. Moreover, an analysis of the relative sensitivity of the trophic levels made it possible to establish that correction factors of 50 and 10, respectively, would be necessary if only one or two test values were available. From a physicochemical point of view, two significant correlations were characterized relating the sensitivity of the aforementioned trophic levels to the chemical structure of the retained substances: algae displayed a higher sensitivity towards chemicals containing acid fragments, whereas fish presented a higher sensitivity towards chemicals containing aromatic ether fragments. Overall, our work suggests that statistical analysis of historical data, combined with data yielded by the REACh regulation, should permit the derivation of robust safety factors, testing strategies and mathematical models. These alternative methods, in turn, could allow a replacement and reduction of ecotoxicological testing.
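A correction factor of the kind mentioned above can in principle be derived as a protective percentile of the distribution of ratios between the available endpoint and the most sensitive one. The sketch below uses randomly generated toy values, not the French/European new-chemicals database analysed in the paper.

```python
# Toy illustration of deriving a correction factor from the relative
# sensitivity of trophic levels: compute, per chemical, how far the
# single available endpoint sits above the most sensitive one, then
# take a protective percentile. Data are invented.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical log10 EC50/LC50 values for the same 200 chemicals.
algae   = rng.normal(0.5, 0.8, 200)
daphnia = rng.normal(0.2, 0.8, 200)
fish    = rng.normal(0.8, 0.8, 200)

most_sensitive = np.minimum.reduce([algae, daphnia, fish])
# If only the fish test were available, how much higher can its value
# be than the most sensitive endpoint?
ratio = 10 ** (fish - most_sensitive)
print(f"95th percentile ratio (one test available): {np.percentile(ratio, 95):.0f}")
```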
Regulatory Toxicology and Pharmacology | 2010
Alexandre R.R. Péry; Sophie Desmots; Enrico Mombelli
The design of toxicological testing strategies aimed at identifying the toxic effects of chemicals without (or with minimal) recourse to animal experimentation is an important issue for toxicological regulations and for industrial decision-making. This article describes an original approach that enables the design of substance-tailored testing strategies with a specified performance in terms of false-positive and false-negative rates. The outcome of toxicological testing is simulated differently than in previously published articles on the topic: toxicological outcomes are simulated not only as a function of the performance of toxicological tests but also as a function of the physico-chemical properties of the chemicals. The required inputs for our approach are QSAR predictions for the LOAELs of the toxicological effect of interest and statistical distributions describing the relationship between in vivo LOAEL values and results from in vitro tests. Our methodology correctly predicts the performance of testing strategies designed to analyze the teratogenic effects of two chemicals: di(2-ethylhexyl)phthalate and indomethacin. The proposed decision-support methodology can be adapted to any toxicological context as long as a statistical comparison between in vitro and in vivo results is possible and QSAR models for the toxicological effect of interest can be developed.
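The performance of such a strategy can be estimated by Monte Carlo simulation. The sketch below is a simplified stand-in for the paper's approach: the QSAR-centred LOAEL distribution, the in vitro/in vivo relationship and the decision threshold are all assumed for illustration, whereas the paper's distributions were derived from experimental data.

```python
# Monte Carlo sketch of estimating a testing strategy's false-positive
# and false-negative rates. All distributions and thresholds below are
# invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

qsar_pred = 1.0                           # log10 mg/kg/day, assumed QSAR LOAEL
true_loael = rng.normal(qsar_pred, 0.5, n)  # simulated "true" in vivo LOAELs

# In vitro result modelled as a noisy linear function of the in vivo LOAEL.
in_vitro = 0.8 * true_loael + rng.normal(0.0, 0.4, n)

threshold = 1.0                           # classification cut-off (assumed)
toxic   = true_loael < threshold          # in vivo "truth"
flagged = in_vitro < 0.8 * threshold      # in vitro decision rule

fp = np.mean(flagged & ~toxic) / np.mean(~toxic)
fn = np.mean(~flagged & toxic) / np.mean(toxic)
print(f"false-positive rate: {fp:.2%}, false-negative rate: {fn:.2%}")
```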
Molecular Informatics | 2012
Cleo Tebby; Enrico Mombelli
The assessment of uncertainty attached to individual predictions is now a priority for sound decision‐making in risk assessment. QSAR predictive uncertainty is affected by a variety of factors related to the quality of the training set data, the adopted statistical models, and the distance between the query chemical and the training set. We developed a method to quantify uncertainty associated with individual linear QSAR predictions that integrates both model and experimental error uncertainty and that defines an applicability domain based on the density of training set data. Our method is based on chemical spaces defined by latent variables identified by Partial Least Squares (PLS) regressions. The method provides a kernel regression estimate of the activity of interest as well as a measure of predictive uncertainty based on a mathematical estimation of the domain of applicability and on local propagation of uncertainty associated with training set data.
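A minimal sketch of Nadaraya-Watson kernel regression with a density-based applicability score is given below, assuming the chemicals have already been projected onto PLS latent variables. The Gaussian kernel, the bandwidth, the toy scores and the local uncertainty estimate are illustrative simplifications of the published method.

```python
# Nadaraya-Watson kernel regression in a latent-variable space, with a
# crude density score as an applicability-domain proxy. A simplified
# stand-in for the paper's method; data and bandwidth are assumptions.
import numpy as np

def nw_predict(x_query, x_train, y_train, bandwidth=1.0):
    """Kernel estimate of the activity plus a local uncertainty measure."""
    d2 = np.sum((x_train - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))   # Gaussian kernel weights
    density = w.sum()                          # training-data density at query
    y_hat = np.dot(w, y_train) / density
    # Weighted residual spread as a local uncertainty estimate.
    sigma = np.sqrt(np.dot(w, (y_train - y_hat) ** 2) / density)
    return y_hat, sigma, density

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 2))                   # toy PLS latent-variable scores
y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.2, 50)

y_hat, sigma, dens = nw_predict(np.array([0.3, -0.2]), X, y)
print(f"prediction {y_hat:.2f} ± {sigma:.2f} (density score {dens:.1f})")
```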
Environnement Risques & Sante | 2017
Elias Zgheib; C. Béchaux; Amélie Crépet; Enrico Mombelli; Frédéric Y. Bois
Toxicology is changing its experimental approaches from animal testing to less expensive, more ethical and more relevant methods. Since the beginning of this century, various regulations and research programs on both sides of the Atlantic have pushed and contributed to this change. Modern toxicology relies on two main components: in vitro testing and in silico analyses. Toxicology has also entered a world of "big data" production, switching from a low-throughput to a high-throughput mode of screening. Complementary to the assessment of toxicological impact, a large effort has also been made to evaluate human exposure to chemicals: new human and field surveys, analytical measurements, computational capacities, and the use of mathematical modeling have opened up new possibilities for exposure assessment. Accounting for several sources and routes of exposure, estimating combined exposure to mixtures, integrating exposure variability, and simulating long-term exposure are new challenges on the way to being solved. In addition, biomonitoring data, internal exposure biomarkers, and toxicokinetics are all adding to the list of tools and techniques helping to link the pieces of the still incomplete puzzle of high-throughput risk assessment. Yet high-throughput applications in toxicology have been criticized for their inadequate representation of biological interactions at the organism level, for the experimental noise they suffer from, for the complexity of in vitro to in vivo extrapolation, and for their as yet undefined validation protocols. We offer here a brief panorama of these developments.
Environnement Risques & Sante | 2014
Enrico Mombelli; Alexandre R.R. Péry; Isabelle Fabre; Fanny Boislève; Groupe de travail recherche Francopa; Marc Pallardy
Predictive methods based on structural analogy are a recognized alternative to animal testing in (eco)toxicology. Among these methods, "read-across" predictions are particularly attractive because they can be justified case by case on the basis of expert judgment, without recourse to QSAR (quantitative structure-activity relationship) models, whose interpretation requires specific expertise in molecular modeling and statistics. Moreover, read-across predictions can be made from a small number of chemicals whose toxicological profile has been determined experimentally. Consequently, this approach can be applied to toxicological endpoints for which no QSAR model is available, but only a few substances considered similar to the one of interest. However, read-across predictions are not governed by validation principles such as those that ensure the correct application and interpretation of predictions produced by QSAR models. In light of these observations, the research working group of the French platform FRANCOPA, dedicated to the development, validation and dissemination of alternative methods to animal testing, conducted an exercise to assess the relevance of read-across approaches for predicting the skin sensitization potential of chemicals. The results of the exercise show that, thanks to mechanistic knowledge and empirical similarity criteria, it is possible to select relevant chemicals that support prediction by structural analogy.
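Read-across of this kind can be partly automated, for instance by ranking experimentally tested analogues by fingerprint similarity. The sketch below, which requires RDKit, uses invented SMILES, labels and a similarity cut-off; it is a hypothetical illustration, not the FRANCOPA exercise.

```python
# Hypothetical read-across sketch: predict the skin-sensitisation class
# of a query chemical from its most similar tested analogue. SMILES,
# labels and the similarity threshold are invented placeholders.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

analogues = {                         # SMILES -> sensitiser? (made up)
    "CC(=O)OC1=CC=CC=C1": False,
    "O=C1OC(=O)C=C1": True,
    "CCOC(=O)C=C": True,
}
query = Chem.MolFromSmiles("C=CC(=O)OC")   # hypothetical query structure

def fingerprint(mol):
    return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

fp_q = fingerprint(query)
scored = sorted(
    ((DataStructs.TanimotoSimilarity(fp_q, fingerprint(Chem.MolFromSmiles(s))), lab)
     for s, lab in analogues.items()),
    reverse=True,
)
sim, label = scored[0]
if sim >= 0.35:                       # similarity cut-off: an expert choice
    print(f"read-across prediction: sensitiser={label} (similarity {sim:.2f})")
else:
    print("no sufficiently similar analogue; expert review needed")
```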
Molecular Informatics | 2013
Cleo Tebby; Enrico Mombelli
Quantitative Structure‐Activity Relationship (QSAR) models are increasingly used in hazard and risk assessment. Even when models with linear relationships between activity and a small number of descriptors are built and validated with regard to predictivity and statistical assumptions, similar structures can exhibit large differences in activity, known as similarity paradoxes or activity cliffs. In order to reduce the impact that similarity paradoxes can have on predictions, we have devised a statistical method based on Nadaraya‐Watson kernel regression. In our method, activity cliffs filter out the contributions of neighbouring chemicals, especially along the cliff axis. The method decreases density‐based certainty in particular for chemicals with strong prediction errors, and the implementation of Structure‐Activity Landscape Index (SALI) curves shows that it improves the prediction of activity-cliff ranks. We also provide useful indications on the density‐based applicability domain and on the reliability of individual predictions.
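The SALI values underlying such curves are straightforward to compute: SALI(i, j) = |A_i − A_j| / (1 − sim(i, j)), so structurally similar pairs with very different activities score highest. The similarities and activities in the sketch below are toy values, not the paper's datasets.

```python
# Structure-Activity Landscape Index (SALI) sketch for spotting activity
# cliffs. Activities and pairwise similarities are invented toy values.
import numpy as np

activity = np.array([5.1, 5.3, 7.9, 5.0])        # e.g. pIC50 values, invented
sim = np.array([[1.00, 0.90, 0.85, 0.40],
                [0.90, 1.00, 0.88, 0.35],
                [0.85, 0.88, 1.00, 0.30],
                [0.40, 0.35, 0.30, 1.00]])

pairs = []
n = len(activity)
for i in range(n):
    for j in range(i + 1, n):
        # Small epsilon guards against division by zero for identical pairs.
        sali = abs(activity[i] - activity[j]) / (1.0 - sim[i, j] + 1e-9)
        pairs.append((sali, i, j))

# The largest SALI values flag similar pairs with very different activity.
for sali, i, j in sorted(pairs, reverse=True)[:3]:
    print(f"pair ({i},{j}): SALI = {sali:.1f}")
```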
Bulletin of Environmental Contamination and Toxicology | 2011
Enrico Mombelli; Alexandre R.R. Péry
Chronic toxicity data for Daphnia magna are an information requirement under regulations on chemical safety. This paper proposes a linear model for the prediction of chemically induced effects on the reproductive output of D. magna. The model is based on data retrieved from the Japanese Ministry of Environment database and predicts chronic effects as a function of acute toxicity data. The proposed model was able to predict chronic toxicities for chemicals not included in the training set. Our results suggest that experiments involving chronic exposure to chemicals could be reduced thanks to the proposed model.
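A model of this form can be sketched as a simple log-log regression. The coefficients below are fitted to invented toy data, not to the Japanese Ministry of Environment database used in the paper.

```python
# Sketch of an acute-to-chronic extrapolation model of the form
# log10(chronic EC) = a + b * log10(acute EC50). Toy data only.
import numpy as np

log_acute   = np.array([0.5, 1.0, 1.6, 2.1, 2.7])    # 48-h EC50, invented
log_chronic = np.array([-0.6, -0.1, 0.4, 1.0, 1.5])  # 21-d reproduction EC, invented

b, a = np.polyfit(log_acute, log_chronic, 1)          # slope, intercept
new_acute = 1.8                                       # hypothetical new chemical
print(f"predicted log10 chronic EC: {a + b * new_acute:.2f}")
```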
Collaboration
Agence française de sécurité sanitaire des produits de santé