Andre Kleensang
Johns Hopkins University
Publications
Featured research published by Andre Kleensang.
ALTEX-Alternatives to Animal Experimentation | 2016
Nicholas Ball; Mark T. D. Cronin; Jie Shen; Karen Blackburn; Ewan D. Booth; Mounir Bouhifd; Elizabeth L.R. Donley; Laura A. Egnash; Charles Hastings; D.R. Juberg; Andre Kleensang; Nicole Kleinstreuer; E.D. Kroese; A.C. Lee; Thomas Luechtefeld; Alexandra Maertens; S. Marty; Jorge M. Naciff; Jessica A. Palmer; David Pamies; M. Penman; Andrea-Nicole Richarz; Daniel P. Russo; Sharon B. Stuard; G. Patlewicz; B. van Ravenzwaay; Shengde Wu; Hao Zhu; Thomas Hartung
Grouping of substances and utilizing read-across of data within those groups represents an important data gap filling technique for chemical safety assessments. Categories/analogue groups are typically developed based on structural similarity and, increasingly often, also on mechanistic (biological) similarity. While read-across can play a key role in complying with legislation such as the European REACH regulation, the lack of consensus regarding the extent and type of evidence necessary to support it often hampers its successful application and acceptance by regulatory authorities. Despite a potentially broad user community, expertise is still concentrated across a handful of organizations and individuals. In order to facilitate the effective use of read-across, this document presents the state of the art, summarizes insights learned from reviewing ECHA's published decisions regarding the relative successes/pitfalls surrounding read-across under REACH, and compiles the relevant activities and guidance documents. Special emphasis is given to the available existing tools and approaches, an analysis of ECHA's published final decisions associated with all levels of compliance checks and testing proposals, the consideration and expression of uncertainty, the use of biological support data, and the impact of the ECHA Read-Across Assessment Framework (RAAF) published in 2015.
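Analogue groups of the kind described here usually start from a quantitative structural-similarity score between the target substance and candidate analogues. As a toy illustration only (the binary fingerprints below are invented, not output from any of the tools surveyed in the paper), a Tanimoto similarity can be computed in R as follows:

```r
# Tanimoto similarity between binary structural fingerprints: the common
# starting point for forming analogue groups. Fingerprints are invented.
tanimoto <- function(a, b) {
  both <- sum(a & b)                    # bits set in both fingerprints
  both / (sum(a) + sum(b) - both)       # shared bits / union of bits
}

set.seed(1)
# Hypothetical 64-bit fingerprints for a target and two candidate analogues
fps <- replicate(3, sample(c(0L, 1L), 64, replace = TRUE), simplify = FALSE)
names(fps) <- c("target", "analogue1", "analogue2")

# Similarity of each candidate to the target; a read-across workflow would
# combine such scores with mechanistic (biological) similarity evidence
sapply(fps[-1], function(f) tanimoto(fps$target, f))
```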
ALTEX-Alternatives to Animal Experimentation | 2013
Thomas Hartung; Tom Luechtefeld; Alexandra Maertens; Andre Kleensang
Although toxicology uses many stand-alone tests, a systematic combination of several information sources is very often required. Examples include cases where not all possible outcomes of interest (e.g., modes of action), classes of test substances (applicability domains), or severity classes of effect are covered in a single test; where a positive test result is rare (low prevalence leading to excessive false-positive results); or where the gold standard test is too costly or uses too many animals, creating a need for prioritization by screening. Similarly, tests are combined when the human predictivity of a single test is not satisfactory or when existing data and evidence from various tests must be integrated. Increasingly, kinetic information also will be integrated to extrapolate from in vitro data to the in vivo situation. Integrated Testing Strategies (ITS) offer a solution to these problems. ITS have been discussed for more than a decade, and some attempts have been made in test guidance for regulations. Despite their obvious potential for revamping regulatory toxicology, however, we still have little guidance on the composition, validation, and adaptation of ITS for different purposes. Similarly, Weight of Evidence and Evidence-based Toxicology approaches require different pieces of evidence and test data to be weighed and combined. ITS also represent the logical way of combining pathway-based tests, as suggested in Toxicity Testing for the 21st Century. This paper describes the state of the art of ITS and makes suggestions as to the definition, systematic combination, and quality assurance of ITS.
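The low-prevalence problem named above is easy to make concrete with Bayes' rule. In the illustrative calculation below (all numbers are assumptions, not taken from the paper), even a single test with 90% sensitivity and 90% specificity produces mostly false positives when only 1% of substances are truly positive, which is exactly why a screening test needs to be followed by further tests in an ITS:

```r
# Illustrative numbers only: why rare positives swamp a good stand-alone test
prevalence  <- 0.01   # 1% of substances are truly hazardous
sensitivity <- 0.90   # P(test positive | hazardous)
specificity <- 0.90   # P(test negative | not hazardous)

# Bayes' rule: positive predictive value of a single positive result
ppv <- (sensitivity * prevalence) /
       (sensitivity * prevalence + (1 - specificity) * (1 - prevalence))
ppv   # ~0.083, i.e., fewer than 1 in 10 positives is a true positive
```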
Journal of Applied Toxicology | 2013
Mounir Bouhifd; Thomas Hartung; Helena T. Hogberg; Andre Kleensang; Liang Zhao
Metabolomics use in toxicology is rapidly increasing, particularly owing to advances in mass spectrometry, which is widely used in the life sciences for phenotyping disease states. Toxicology has the advantage of having the disease agent, the toxicant, available for experimental induction of metabolomic changes monitored over time and dose. This review summarizes the different technologies employed and gives examples of their use in various areas of toxicology. A prominent use of metabolomics is the identification of signatures of toxicity: patterns of metabolite changes predictive of a hazard manifestation. Increasingly, such signatures are being identified, suggesting that certain modes of action result in specific derangements of the metabolism. This might enable the deduction of underlying pathways of toxicity, which, in their entirety, form the Human Toxome, a key concept for implementing the vision of Toxicity Testing for the 21st Century. The review covers the current state of metabolomics technologies and principles, their uses in toxicology, and gives a thorough overview of metabolomics bioinformatics, pathway identification and quality assurance. In addition, it lays out the prospects for further metabolomics applications, including in a regulatory context.
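As a minimal sketch of what identifying a signature of toxicity can look like in practice (simulated data and a deliberately simple univariate approach; the bioinformatics the review covers is far richer), metabolites can be screened one by one for a treatment effect and filtered by false discovery rate:

```r
# Simulated metabolomics screen: 200 metabolites, 6 control and 6 treated
# samples, with the first 10 metabolites truly shifted by the treatment
set.seed(42)
n_metab <- 200; n_per_group <- 6
control <- matrix(rnorm(n_metab * n_per_group), nrow = n_metab)
treated <- matrix(rnorm(n_metab * n_per_group), nrow = n_metab)
treated[1:10, ] <- treated[1:10, ] + 2

# Per-metabolite t-tests, corrected with the Benjamini-Hochberg FDR
pvals <- sapply(seq_len(n_metab), function(i)
  t.test(treated[i, ], control[i, ])$p.value)
qvals <- p.adjust(pvals, method = "BH")

which(qvals < 0.05)   # candidate members of the signature of toxicity
```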
Regulatory Toxicology and Pharmacology | 2014
Francois Busquet; Ruben Strecker; Jane M. Rawlings; Scott E. Belanger; Thomas Braunbeck; Gregory J. Carr; P.H. Cenijn; Przemyslaw Fochtman; Anne Gourmelon; Nicole Hübler; Andre Kleensang; Melanie Knöbel; Carola Kussatz; Juliette Legler; Adam Lillicrap; Fernando Martínez-Jerónimo; Christian Polleichtner; Helena Rzodeczko; Edward Salinas; Katharina Schneider; Stefan Scholz; Evert-Jan van den Brandhof; Leo T.M. van der Ven; Susanne Walter-Rohde; Stefan Weigt; Hilda Witters; Marlies Halder
The OECD validation study of the zebrafish embryo acute toxicity test (ZFET) for acute aquatic toxicity testing evaluated the reproducibility of the ZFET by testing 20 chemicals at 5 different concentrations in 3 independent runs in at least 3 laboratories. Stock solutions and test concentrations were analytically confirmed for 11 chemicals. Newly fertilised zebrafish eggs (20 per concentration and control) were exposed to the chemicals for 96 h. Four apical endpoints were recorded daily as indicators of acute lethality: coagulation of the embryo, lack of somite formation, non-detachment of the tail bud from the yolk sac, and lack of heartbeat. Results (LC50 values for 48 h and 96 h exposure) show that the ZFET is a robust method with good intra- and inter-laboratory reproducibility (CV < 30%) for most chemicals and laboratories. The reproducibility was lower (CV > 30%) for some very toxic or volatile chemicals and for chemicals tested close to their limit of solubility. The ZFET is now available as OECD Test Guideline 236. Considering the high predictive capacity of the ZFET demonstrated by Belanger et al. (2013) in their retrospective analysis of acute fish toxicity and fish embryo acute toxicity data, the ZFET is ready to be considered for acute fish toxicity testing for regulatory purposes.
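The LC50 values at the heart of the ZFET come from fitting a concentration-response model to mortality data. Below is a hedged sketch using the drc package in R; the data are invented for illustration, and this is not the validation study's actual analysis script:

```r
# Concentration-response fit for an LC50, using the drc package
library(drc)

set.seed(7)
conc <- rep(c(0.1, 0.3, 1, 3, 10), each = 20)          # mg/L, 20 embryos each
p    <- rep(c(0.05, 0.2, 0.5, 0.8, 0.95), each = 20)   # assumed true mortality
zfet <- data.frame(conc = conc, dead = rbinom(length(conc), 1, p))

# Two-parameter log-logistic model on binary mortality (0 = alive, 1 = dead)
fit <- drm(dead ~ conc, data = zfet, fct = LL.2(), type = "binomial")
ED(fit, 50)   # LC50 estimate with standard error
```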
Scientific Reports | 2017
Aleksander Skardal; Sean V. Murphy; Mahesh Devarasetty; Ivy Mead; Hyun Wook Kang; Young Joon Seol; Yu Shrike Zhang; Su Ryon Shin; Liang Zhao; Julio Aleman; Adam R. Hall; Thomas Shupe; Andre Kleensang; Mehmet R. Dokmeci; Sang Jin Lee; John Jackson; James J. Yoo; Thomas Hartung; Ali Khademhosseini; Shay Soker; Colin E. Bishop; Anthony Atala
Many drugs have progressed through preclinical and clinical trials and have been available, in some cases for years, before being recalled by the FDA for unanticipated toxicity in humans. One reason for such poor translation from drug candidate to successful use is a lack of model systems that accurately recapitulate normal tissue function of human organs and their response to drug compounds. Moreover, tissues in the body do not exist in isolation, but reside in a highly integrated and dynamically interactive environment, in which actions in one tissue can affect other downstream tissues. Few engineered model systems, including the growing variety of organoid and organ-on-a-chip platforms, have so far reflected the interactive nature of the human body. To address this challenge, we have developed an assortment of bioengineered tissue organoids and tissue constructs that are integrated in a closed circulatory perfusion system, facilitating inter-organ responses. We describe a three-tissue organ-on-a-chip system composed of liver, heart, and lung, and highlight examples of inter-organ responses to drug administration. We observe drug responses that depend on inter-tissue interaction, illustrating the value of multiple tissue integration for in vitro study of both the efficacy of and side effects associated with candidate drugs.
ALTEX-Alternatives to Animal Experimentation | 2015
Mounir Bouhifd; Melvin E. Andersen; Christina Baghdikian; Kim Boekelheide; Kevin M. Crofton; Albert J. Fornace; Andre Kleensang; Heng-Hong Li; Carolina B. Livi; Alexandra Maertens; Patrick D. McMullen; Michael Rosenberg; Russell S. Thomas; Marguerite M. Vantangoli; James D. Yager; Liang Zhao; Thomas Hartung
The Human Toxome Project, funded as an NIH Transformative Research grant 2011-2016, is focused on developing the concepts and the means for deducing, validating and sharing molecular pathways of toxicity (PoT). Using the test case of estrogenic endocrine disruption, the responses of MCF-7 human breast cancer cells are being phenotyped by transcriptomics and mass-spectroscopy-based metabolomics. The bioinformatics tools for PoT deduction represent a core deliverable. A number of challenges for quality and standardization of cell systems, omics technologies and bioinformatics are being addressed. In parallel, concepts for annotation, validation and sharing of PoT information, as well as their link to adverse outcomes, are being developed. A reasonably comprehensive public database of PoT, the Human Toxome Knowledge-base, could become a point of reference for toxicological research and regulatory test strategies.
Scientific Reports | 2016
Andre Kleensang; Marguerite M. Vantangoli; Shelly Odwin-DaCosta; Melvin E. Andersen; Kim Boekelheide; Mounir Bouhifd; Albert J. Fornace; Heng-Hong Li; Carolina B. Livi; Samantha J. Madnick; Alexandra Maertens; Michael Rosenberg; James D. Yager; Liang Zhao; Thomas Hartung
Common recommendations for cell line authentication, annotation and quality control fall short of addressing genetic heterogeneity. Within the Human Toxome Project, we demonstrate that there can be marked cellular and phenotypic heterogeneity in a single batch of the human breast adenocarcinoma cell line MCF-7 obtained directly from a cell bank, heterogeneity that is invisible to the usual cell authentication by short tandem repeat (STR) markers. STR profiling merely fulfills the purpose of authentication testing, which is to detect significant cross-contamination and cell line misidentification; heterogeneity needs to be examined using additional methods. This heterogeneity can have serious consequences for the reproducibility of experiments, as shown by morphology, estrogenic growth dose-response, whole-genome gene expression and untargeted mass-spectrometry metabolomics for MCF-7 cells. Using Comparative Genomic Hybridization (CGH), differences were traced back to genetic heterogeneity already present in the cells from the original frozen vials of the same ATCC lot; however, STR markers did not differ from the ATCC reference for any sample. These findings underscore the need for additional quality assurance in Good Cell Culture Practice and cell characterization, especially using methods such as CGH that can reveal genomic heterogeneity and genetic drift within cell lines.
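To see why STR profiling confirms identity without detecting heterogeneity, consider the percent-match score commonly used in authentication (a Tanabe-style score; the allele calls below are invented). Two samples can match perfectly at the typed STR loci while still differing karyotypically and phenotypically:

```r
# Tanabe-style STR match score: 2 x shared alleles / total alleles typed.
# Allele calls are invented for illustration.
str_match <- function(ref, query) {
  shared <- sum(mapply(function(r, q) length(intersect(r, q)), ref, query))
  2 * shared / (sum(lengths(ref)) + sum(lengths(query)))
}

ref   <- list(D5S818 = c(11, 12), TH01 = c(6, 9.3), TPOX = c(9, 12))
query <- list(D5S818 = c(11, 12), TH01 = c(6, 9.3), TPOX = c(9, 12))
str_match(ref, query)   # 1.0: a perfect match, blind to copy-number changes
```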
Archives of Toxicology | 2015
Alexandra Maertens; Thomas Luechtefeld; Andre Kleensang; Thomas Hartung
Deriving a Pathway of Toxicity from transcriptomic data remains a challenging task. We explore the use of weighted gene correlation network analysis (WGCNA) to extract an initial network from a small microarray study of MPTP toxicity in mice. Five modules were statistically significant; each module was analyzed for gene signatures in the Chemical and Genetic Perturbation subset of the Molecular Signatures Database as well as for over-represented transcription factor binding sites. WGCNA clustered probes by function and captured pathways relevant to neurodegenerative disorders. The resulting network was analyzed for transcription factor candidates, which were narrowed down via text-mining for relevance to the disease model, and then combined with the large-scale FANTOM4 interaction database to generate a genetic regulatory network. Modules were enriched for transcription factors relevant to Parkinson's disease. Adding transcription factors significantly increased the number of genes that could be connected in a given component. For each module, the transcription factor with by far the highest number of interactions was SP1, which also had substantial experimental evidence of interactions. This analysis both captures much of the known biology of MPTP toxicity and suggests several candidates for further study. Furthermore, it strongly suggests that SP1 plays a central role in coordinating the cellular response to MPTP toxicity.
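For readers unfamiliar with the workflow, the WGCNA steps described above follow a standard pattern in R. The sketch below runs on a hypothetical random expression matrix with default-style parameters; it is illustrative, not the authors' analysis code:

```r
# Standard WGCNA module-detection workflow on a hypothetical dataset
library(WGCNA)

set.seed(1)
datExpr <- matrix(rnorm(20 * 500), nrow = 20)   # 20 arrays x 500 probes

# Pick a soft-thresholding power approximating scale-free topology
sft <- pickSoftThreshold(datExpr, powerVector = 1:12)

# One-step network construction and module detection (power = 6 assumed;
# in practice it is read off sft$fitIndices)
net <- blockwiseModules(datExpr, power = 6, minModuleSize = 30,
                        mergeCutHeight = 0.25, numericLabels = TRUE)
table(net$colors)   # module sizes; label 0 collects unassigned probes

# Module eigengenes, which can then be tested against treatment (e.g., MPTP)
MEs <- moduleEigengenes(datExpr, colors = net$colors)$eigengenes
```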
ALTEX-Alternatives to Animal Experimentation | 2015
Mounir Bouhifd; Richard D. Beger; Thomas J. Flynn; Lining Guo; Georgina Harris; Helena T. Hogberg; Rima Kaddurah-Daouk; Hennicke Kamp; Andre Kleensang; Alexandra Maertens; Shelly Odwin-DaCosta; David Pamies; Donald G. Robertson; Lena Smirnova; Jinchun Sun; Liang Zhao; Thomas Hartung
Metabolomics promises a holistic phenotypic characterization of biological responses to toxicants. The technology is based on advanced chemical analytical tools with reasonable throughput, including mass spectrometry and NMR. Quality assurance, however, is urgently needed across the entire workflow, from experimental design and sample preparation to metabolite identification and bioinformatics data-mining, to assure both the quality of metabolomics data and the reproducibility of biological models. In contrast to microarray-based transcriptomics, where consensus on quality assurance and reporting standards has been fostered over the last two decades, quality assurance of metabolomics is only now emerging. Regulatory use in the safety sciences, and even proper scientific use of these technologies, demands quality assurance. In an effort to promote this discussion, an expert workshop considered the quality assurance needs of metabolomics. The goals of the workshop were (1) to consider the challenges associated with metabolomics as an emerging science, with an emphasis on its application in toxicology, and (2) to identify the key issues to be addressed in order to establish and implement quality assurance procedures in metabolomics-based toxicology. Consensus has yet to be achieved regarding best practices to ensure that sound, useful, and relevant information is derived from these new tools.
Mutation Research-genetic Toxicology and Environmental Mutagenesis | 2012
Sebastian Hoffmann; Ludwig A. Hothorn; Lutz Edler; Andre Kleensang; Masaya Suzuki; Pascal Phrakonkham; Daniel Gerhard
Validation activities for the BALB/c 3T3 cell transformation assay (CTA), a test method used for the assessment of the carcinogenic potential of compounds, have revealed the need for statistical analysis tailored to the specific features of BALB/c 3T3 CTA data. Whereas a standard statistical approach for the Syrian hamster embryo (SHE) CTA was considered sufficient, an international expert group was gathered by the European Centre for the Validation of Alternative Methods (ECVAM) to review commonly applied statistical approaches for the BALB/c 3T3 CTA. As none of the commonly applied approaches was found to be entirely appropriate, two novel statistical approaches were recommended for the evaluation of BALB/c 3T3 CTA data, both accounting for a possibly non-monotone concentration-response relationship and variance heterogeneity: (1) a negative binomial generalised linear model with Williams-type downturn-protected trend tests, and (2) a normalisation of the data by a specific transformation that allows application of a general linear model, estimating effects under a normality assumption, again with Williams-type protected tests. Both approaches are described in this article, and their performance and the quality of the results they generate are demonstrated using exemplary data. Our work confirmed that both approaches are suitable for the statistical analysis of BALB/c 3T3 CTA data and that each is superior to commonly used methods. Furthermore, a procedure dichotomising data into negatives and positives is proposed, which allows re-testing in cases where inconclusive data are encountered. The scripts of the statistical evaluation programs, written in R (a freely available statistical environment), are appended, including exemplary outputs (Appendix A).
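The first recommended approach can be sketched in a few lines of R with MASS and multcomp. This is an illustration on invented foci counts, not the ECVAM scripts appended to the article, and it applies plain Williams-type contrasts rather than the full downturn-protection procedure:

```r
# Negative binomial GLM with Williams-type trend contrasts (illustration)
library(MASS)      # glm.nb
library(multcomp)  # glht, mcp

set.seed(3)
cta <- data.frame(
  conc = factor(rep(c(0, 1, 5, 10, 25), each = 6)),   # concentration groups
  foci = rnbinom(30, mu = rep(c(2, 2, 4, 8, 6), each = 6), size = 5)
)

# Overdispersed focus counts modelled with a negative binomial GLM
fit <- glm.nb(foci ~ conc, data = cta)

# Williams-type many-to-one trend contrasts against the solvent control
summary(glht(fit, linfct = mcp(conc = "Williams")))
```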