
Publication


Featured research published by Mounir Bouhifd.


ALTEX-Alternatives to Animal Experimentation | 2016

Toward good read-across practice (GRAP) guidance

Nicholas Ball; Mark T. D. Cronin; Jie Shen; Karen Blackburn; Ewan D. Booth; Mounir Bouhifd; Elizabeth L.R. Donley; Laura A. Egnash; Charles Hastings; D.R. Juberg; Andre Kleensang; Nicole Kleinstreuer; E.D. Kroese; A.C. Lee; Thomas Luechtefeld; Alexandra Maertens; S. Marty; Jorge M. Naciff; Jessica A. Palmer; David Pamies; M. Penman; Andrea-Nicole Richarz; Daniel P. Russo; Sharon B. Stuard; G. Patlewicz; B. van Ravenzwaay; Shengde Wu; Hao Zhu; Thomas Hartung

Grouping of substances and utilizing read-across of data within those groups represents an important data-gap-filling technique for chemical safety assessments. Categories/analogue groups are typically developed based on structural similarity and, increasingly often, also on mechanistic (biological) similarity. While read-across can play a key role in complying with legislation such as the European REACH regulation, the lack of consensus regarding the extent and type of evidence necessary to support it often hampers its successful application and acceptance by regulatory authorities. Despite a potentially broad user community, expertise is still concentrated across a handful of organizations and individuals. In order to facilitate the effective use of read-across, this document presents the state of the art, summarizes insights learned from reviewing ECHA's published decisions regarding the relative successes/pitfalls surrounding read-across under REACH, and compiles the relevant activities and guidance documents. Special emphasis is given to the available existing tools and approaches, an analysis of ECHA's published final decisions associated with all levels of compliance checks and testing proposals, the consideration and expression of uncertainty, the use of biological support data, and the impact of the ECHA Read-Across Assessment Framework (RAAF) published in 2015.
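
The grouping step described above rests on quantifying structural similarity between a data-poor target and data-rich analogues. The snippet below is a minimal sketch of that idea using RDKit Morgan fingerprints and Tanimoto similarity; the SMILES strings and the similarity cutoff are illustrative assumptions, not values taken from the guidance.

```python
# Minimal sketch of structure-based grouping for read-across (illustration only).
# Requires RDKit; the SMILES strings and the cutoff are arbitrary assumptions.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

def fingerprint(smiles):
    """Morgan (ECFP4-like) bit fingerprint for a SMILES string."""
    mol = Chem.MolFromSmiles(smiles)
    return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

target = "CCCO"  # hypothetical data-poor target substance (n-propanol)
candidates = {
    "n-butanol":  "CCCCO",
    "n-pentanol": "CCCCCO",
    "benzene":    "c1ccccc1",
}

target_fp = fingerprint(target)
similarities = {
    name: DataStructs.TanimotoSimilarity(target_fp, fingerprint(smi))
    for name, smi in candidates.items()
}

# Candidate analogue group: substances above an arbitrary similarity cutoff.
CUTOFF = 0.4
analogues = [name for name, sim in sorted(similarities.items(),
                                          key=lambda kv: kv[1], reverse=True)
             if sim >= CUTOFF]
print("Tanimoto similarities:", {n: round(s, 2) for n, s in similarities.items()})
print("Analogue group (cutoff", CUTOFF, "):", analogues)
```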


ALTEX-Alternatives to Animal Experimentation | 2016

Supporting read-across using biological data

Hao Zhu; Mounir Bouhifd; Elizabeth L.R. Donley; Laura A. Egnash; Nicole Kleinstreuer; E. Dinant Kroese; Zhichao Liu; Thomas Luechtefeld; Jessica A. Palmer; David Pamies; Jie Shen; Volker Strauss; Shengde Wu; Thomas Hartung

Read-across, i.e. filling toxicological data gaps by relating to similar chemicals for which test data are available, is usually done based on chemical similarity. Besides structure and physico-chemical properties, however, biological similarity based on biological data adds extra strength to this process. In the context of developing Good Read-Across Practice guidance, a number of case studies were evaluated to demonstrate the use of biological data to enrich read-across. In the simplest case, chemically similar substances also show similar test results in relevant in vitro assays; this is a well-established method for the read-across of e.g. genotoxicity assays. Larger datasets of biological and toxicological properties of hundreds and thousands of substances are becoming increasingly available, enabling big data approaches in read-across studies. Several case studies using various big data sources are described in this paper. An example is given for the US EPA's ToxCast dataset, allowing read-across for high-quality uterotrophic assays for estrogenic endocrine disruption. Similarly, an example for REACH registration data enhancing read-across for acute toxicity studies is given. A different approach is taken using omics data to establish biological similarity: examples are given for stem cell models in vitro and short-term repeated-dose studies in rats in vivo to support read-across and category formation. These preliminary biological data-driven read-across studies highlight the road to the new generation of read-across approaches that can be applied in chemical safety assessment.
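
To make the combination of chemical and biological similarity concrete, here is a minimal sketch in which each analogue is described by a vector of in vitro assay responses plus a precomputed structural similarity to the target, and the missing endpoint is read across as a similarity-weighted average. All substance names, numbers and the 0.5 acceptance threshold are made-up assumptions for illustration, not data from the case studies.

```python
# Sketch of combining chemical and biological similarity for read-across.
# All values are invented; real applications would use e.g. ToxCast assay profiles.
import numpy as np

# Each analogue: normalized responses in a small panel of in vitro assays.
bio_profiles = {
    "analogue_A": np.array([0.9, 0.1, 0.8, 0.7]),
    "analogue_B": np.array([0.8, 0.2, 0.9, 0.6]),
    "analogue_C": np.array([0.1, 0.9, 0.2, 0.1]),
}
known_endpoint = {"analogue_A": 120.0, "analogue_B": 100.0, "analogue_C": 15.0}  # made-up endpoint values
chem_similarity = {"analogue_A": 0.85, "analogue_B": 0.80, "analogue_C": 0.75}   # e.g. Tanimoto to the target

target_profile = np.array([0.85, 0.15, 0.85, 0.65])  # hypothetical target substance

def cosine(u, v):
    """Cosine similarity between two assay-response vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Weight each analogue by the product of chemical and biological similarity,
# then read across the endpoint as a weighted average over accepted analogues.
weights, values = [], []
for name, profile in bio_profiles.items():
    w = chem_similarity[name] * cosine(target_profile, profile)
    if w >= 0.5:                      # arbitrary acceptance threshold
        weights.append(w)
        values.append(known_endpoint[name])

prediction = float(np.average(values, weights=weights))
print(f"Read-across estimate for the target endpoint: {prediction:.1f}")
```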


Journal of Applied Toxicology | 2013

Review: Toxicometabolomics

Mounir Bouhifd; Thomas Hartung; Helena T. Hogberg; Andre Kleensang; Liang Zhao

Metabolomics use in toxicology is rapidly increasing, particularly owing to advances in mass spectroscopy, which is widely used in the life sciences for phenotyping disease states. Toxicology has the advantage of having the disease agent, the toxicant, available for experimental induction of metabolomic changes monitored over time and dose. A prominent use of metabolomics is the identification of signatures of toxicity – patterns of metabolite changes predictive of a hazard manifestation. Increasingly, such signatures indicative of a certain hazard manifestation are identified, suggesting that certain modes of action result in specific derangements of the metabolism. This might enable the deduction of underlying pathways of toxicity, which, in their entirety, form the Human Toxome, a key concept for implementing the vision of Toxicity Testing for the 21st century. This review summarizes the current state of metabolomics technologies and principles, their uses in various areas of toxicology, and gives a thorough overview of metabolomics bioinformatics, pathway identification and quality assurance. In addition, it lays out the prospects for further metabolomics applications, including in a regulatory context.
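
The "signature of toxicity" notion above can be illustrated with a small numerical sketch: derive the most strongly changed metabolites from treated-versus-control profiles, then score new samples by how well their deviation from control matches that signature. The data below are randomly simulated placeholders, and the 20-metabolite signature size is an arbitrary assumption.

```python
# Sketch of deriving a "signature of toxicity" from metabolomics data and scoring new samples.
# The data are random placeholders; real studies would use measured metabolite intensities.
import numpy as np

rng = np.random.default_rng(0)
n_metabolites = 200

# Simulated log-intensities: 6 control and 6 treated samples; 20 metabolites respond to the toxicant.
control = rng.normal(0.0, 1.0, size=(6, n_metabolites))
treated = rng.normal(0.0, 1.0, size=(6, n_metabolites))
treated[:, :20] += 2.0   # induced metabolite changes

# Signature = mean treated-minus-control difference for the most strongly changed metabolites.
effect = treated.mean(axis=0) - control.mean(axis=0)
signature_idx = np.argsort(np.abs(effect))[-20:]
signature = effect[signature_idx]

def signature_score(sample):
    """Correlation of a sample's deviation from the control mean with the signature."""
    deviation = sample - control.mean(axis=0)
    return float(np.corrcoef(deviation[signature_idx], signature)[0, 1])

new_exposed = rng.normal(0.0, 1.0, n_metabolites)
new_exposed[:20] += 2.0
new_untreated = rng.normal(0.0, 1.0, n_metabolites)
print("exposed sample score:  ", round(signature_score(new_exposed), 2))
print("untreated sample score:", round(signature_score(new_untreated), 2))
```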


ALTEX-Alternatives to Animal Experimentation | 2015

The Human Toxome Project

Mounir Bouhifd; Melvin E. Andersen; Christina Baghdikian; Kim Boekelheide; Kevin M. Crofton; Albert J. Fornace; Andre Kleensang; Heng-Hong Li; Carolina B. Livi; Alexandra Maertens; Patrick D. McMullen; Michael Rosenberg; Russell S. Thomas; Marguerite M. Vantangoli; James D. Yager; Liang Zhao; Thomas Hartung

The Human Toxome Project, funded as an NIH Transformative Research grant 2011-2016, is focused on developing the concepts and the means for deducing, validating and sharing molecular pathways of toxicity (PoT). Using the test case of estrogenic endocrine disruption, the responses of MCF-7 human breast cancer cells are being phenotyped by transcriptomics and mass-spectroscopy-based metabolomics. The bioinformatics tools for PoT deduction represent a core deliverable. A number of challenges for quality and standardization of cell systems, omics technologies and bioinformatics are being addressed. In parallel, concepts for annotation, validation and sharing of PoT information, as well as their link to adverse outcomes, are being developed. A reasonably comprehensive public database of PoT, the Human Toxome Knowledge-base, could become a point of reference for toxicological research and regulatory test strategies.
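
One common building block of deducing pathways from omics data, shown here only as an illustrative sketch and not as the project's actual bioinformatics pipeline, is an over-representation (hypergeometric) test of differentially expressed genes against an annotated pathway. All counts below are hypothetical.

```python
# Illustrative pathway over-representation test (a frequent step in pathway deduction).
# Counts are hypothetical; this is not the Human Toxome Project's actual pipeline.
from scipy.stats import hypergeom

background_size = 20000          # genes measured in the experiment
pathway_genes = 150              # genes annotated to a candidate pathway
de_genes = 400                   # differentially expressed genes after estrogen treatment
de_in_pathway = 18               # overlap between the two sets

# Probability of observing >= de_in_pathway overlapping genes by chance.
p_value = hypergeom.sf(de_in_pathway - 1, background_size, pathway_genes, de_genes)
print(f"enrichment p-value: {p_value:.2e}")
```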


Scientific Reports | 2016

Genetic variability in a frozen batch of MCF-7 cells invisible in routine authentication affecting cell function

Andre Kleensang; Marguerite M. Vantangoli; Shelly Odwin-DaCosta; Melvin E. Andersen; Kim Boekelheide; Mounir Bouhifd; Albert J. Fornace; Heng-Hong Li; Carolina B. Livi; Samantha J. Madnick; Alexandra Maertens; Michael Rosenberg; James D. Yager; Liang Zhao; Thomas Hartung

Common recommendations for cell line authentication, annotation and quality control fall short of addressing genetic heterogeneity. Within the Human Toxome Project, we demonstrate that there can be marked cellular and phenotypic heterogeneity in a single batch of the human breast adenocarcinoma cell line MCF-7, obtained directly from a cell bank, that is invisible with the usual cell authentication by short tandem repeat (STR) markers. STR profiling merely fulfills the purpose of authentication testing, which is to detect significant cross-contamination and cell line misidentification; heterogeneity needs to be examined using additional methods. This heterogeneity can have serious consequences for the reproducibility of experiments, as shown by morphology, estrogenic growth dose-response, whole-genome gene expression and untargeted mass-spectroscopy metabolomics for MCF-7 cells. Using Comparative Genomic Hybridization (CGH), differences were traced back to genetic heterogeneity already present in the cells from the original frozen vials of the same ATCC lot; however, STR markers did not differ from the ATCC reference for any sample. These findings underscore the need for additional quality assurance in Good Cell Culture Practice and cell characterization, especially using other methods such as CGH to reveal possible genomic heterogeneity and genetic drift within cell lines.
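
As a toy illustration of why profile-level data can separate STR-identical samples, the sketch below clusters simulated whole-genome expression profiles from six hypothetical vials of the same nominal lot; the data, the 500-gene shift and the two-cluster cut are assumptions for illustration, not the study's actual analysis.

```python
# Sketch: STR-identical samples can still split into distinct groups on whole-genome expression.
# Profiles are simulated; real data would be microarray or RNA-seq from the different vials.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)
n_genes = 5000

# Six vials from the same nominal lot: vials 0-2 carry an expression shift in a gene subset.
base = rng.normal(0.0, 1.0, n_genes)
vials = np.array([base + rng.normal(0.0, 0.3, n_genes) for _ in range(6)])
vials[:3, :500] += 1.5   # hidden subpopulation difference

# Correlation-distance hierarchical clustering of the vial profiles.
dist = pdist(vials, metric="correlation")
groups = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")
print("cluster assignment per vial:", groups)   # e.g. two groups despite identical STR profiles
```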


ALTEX-Alternatives to Animal Experimentation | 2015

Quality Assurance of Metabolomics

Mounir Bouhifd; Richard D. Beger; Thomas J. Flynn; Lining Guo; Georgina Harris; Helena T. Hogberg; Rima Kaddurah-Daouk; Hennicke Kamp; Andre Kleensang; Alexandra Maertens; Shelly Odwin-DaCosta; David Pamies; Donald G. Robertson; Lena Smirnova; Jinchun Sun; Liang Zhao; Thomas Hartung

Metabolomics promises a holistic phenotypic characterization of biological responses to toxicants. This technology is based on advanced chemical analytical tools with reasonable throughput, including mass spectroscopy and NMR. Quality assurance, however, from experimental design, sample preparation and metabolite identification to bioinformatics data mining, is urgently needed to assure both the quality of metabolomics data and the reproducibility of biological models. In contrast to microarray-based transcriptomics, where consensus on quality assurance and reporting standards has been fostered over the last two decades, quality assurance of metabolomics is only now emerging. Regulatory use in safety sciences, and even proper scientific use of these technologies, demands quality assurance. In an effort to promote this discussion, an expert workshop discussed the quality assurance needs of metabolomics. The goals of this workshop were 1) to consider the challenges associated with metabolomics as an emerging science, with an emphasis on its application in toxicology, and 2) to identify the key issues to be addressed in order to establish and implement quality assurance procedures in metabolomics-based toxicology. Consensus still has to be achieved regarding best practices to ensure that sound, useful and relevant information is derived from these new tools.
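
One widely used quality-assurance check in metabolomics, offered here purely as an illustration and not as the workshop's recommendation, is filtering features by their relative standard deviation (RSD) across repeated pooled-QC injections. The intensity matrix below is simulated, and the 30% cutoff is a common convention rather than a value from the paper.

```python
# Illustration of a common metabolomics QA check: RSD filtering over pooled-QC injections.
# Intensities are simulated; the 30% cutoff is a convention, not a workshop requirement.
import numpy as np

rng = np.random.default_rng(2)
n_features, n_qc_injections = 1000, 8

# Simulated feature intensities in pooled QC samples; some features are unstable.
stable = rng.normal(1000, 80, size=(n_qc_injections, 700))
unstable = rng.normal(1000, 500, size=(n_qc_injections, 300))
qc_intensities = np.hstack([stable, unstable])

# Relative standard deviation per feature across the QC injections.
rsd = qc_intensities.std(axis=0, ddof=1) / qc_intensities.mean(axis=0) * 100
reliable = rsd <= 30.0    # keep only features reproducibly measured in QC samples
print(f"features passing the 30% RSD filter: {int(reliable.sum())} of {n_features}")
```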


Frontiers in Pharmacology | 2016

The Human Toxome Collaboratorium: A Shared Environment for Multi-Omic Computational Collaboration within a Consortium

Rick A. Fasani; Carolina B. Livi; Dipanwita R. Choudhury; Andre Kleensang; Mounir Bouhifd; Salil N. Pendse; Patrick D. McMullen; Melvin E. Andersen; Thomas Hartung; Michael Rosenberg

The Human Toxome Project is part of a long-term vision to modernize toxicity testing for the 21st century. In the initial phase of the project, a consortium of six academic, commercial, and government organizations has partnered to map pathways of toxicity, using endocrine disruption as a model hazard. Experimental data are generated at multiple sites and analyzed using a range of computational tools. While effectively gathering, managing, and analyzing the data for high-content experiments is a challenge in its own right, doing so for a growing number of -omics technologies, with larger data sets, across multiple institutions complicates the process. Interestingly, one of the most difficult, ongoing challenges has been the computational collaboration between the geographically separate institutions. Existing solutions cannot handle the growing heterogeneous data, provide a computational environment for consistent analysis, accommodate different workflows, and adapt to the constantly evolving methods and goals of a research project. To meet the needs of the project, we have created and managed The Human Toxome Collaboratorium, a shared computational environment hosted on third-party cloud services. The Collaboratorium provides a familiar virtual desktop, with a mix of commercial, open-source, and custom-built applications. It shares some of the challenges of traditional information technology, but with unique and unexpected constraints that emerge from the cloud. Here we describe the problems we faced, the current architecture of the solution, an example of its use, the major lessons we learned, and the future potential of the concept. In particular, the Collaboratorium represents a novel distribution method that could increase the reproducibility and reusability of results from similar large, multi-omic studies.


Basic & Clinical Pharmacology & Toxicology | 2014

Mapping the Human Toxome by Systems Toxicology

Mounir Bouhifd; Helena T. Hogberg; Andre Kleensang; Alexandra Maertens; Liang Zhao; Thomas Hartung


ALTEX-Alternatives to Animal Experimentation | 2014

t4 Workshop Report: Pathways of Toxicity

Andre Kleensang; Alexandra Maertens; Michael Rosenberg; Suzanne Fitzpatrick; Justin Lamb; Scott S. Auerbach; Richard Brennan; Kevin M. Crofton; Ben Gordon; Albert J. Fornace; Kevin W. Gaido; David Gerhold; Robin Haw; Adriano Henney; Avi Ma'ayan; Mary T. McBride; Stefano Monti; Michael F. Ochs; Akhilesh Pandey; Roded Sharan; R.H. Stierum; Stuart Tugendreich; Catherine Willett; Clemens Wittwehr; Jianguo Xia; Geoffrey W. Patton; Kirk Arvidson; Mounir Bouhifd; Helena T. Hogberg; Thomas Luechtefeld


Collaboration


Dive into Mounir Bouhifd's collaborations.

Top Co-Authors

Thomas Hartung (Johns Hopkins University)
Liang Zhao (Johns Hopkins University)
David Pamies (Johns Hopkins University)
Kevin M. Crofton (United States Environmental Protection Agency)