Network


Latest external collaboration at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Catherine Willett is active.

Publication


Featured research published by Catherine Willett.


Toxicological Sciences | 2013

Development of an Adverse Outcome Pathway From Drug-Mediated Bile Salt Export Pump Inhibition to Cholestatic Liver Injury

Mathieu Vinken; Brigitte Landesmann; Marina Goumenou; Stefanie Vinken; Imran Shah; Hartmut Jaeschke; Catherine Willett; Maurice Whelan; Vera Rogiers

Adverse outcome pathways (AOPs) have been recently introduced in human risk assessment as pragmatic tools with multiple applications. As such, AOPs intend to provide a clear-cut mechanistic representation of pertinent toxicological effects. AOPs are typically composed of a molecular initiating event, a series of intermediate steps and key events, and an adverse outcome. In this study, an AOP framework is proposed for cholestasis triggered by drug-mediated inhibition of the bile salt export pump transporter protein. For this purpose, an in-depth survey of relevant scientific literature was carried out in order to identify intermediate steps and key events. The latter include bile accumulation, the induction of oxidative stress and inflammation, and the activation of specific nuclear receptors. Collectively, these mechanisms drive both a deteriorative cellular response, which underlies directly caused cholestatic injury, and an adaptive cellular response, which is aimed at counteracting cholestatic insults. AOP development was performed according to Organisation for Economic Co-operation and Development (OECD) guidance, including critical consideration of the Bradford Hill criteria for weight of evidence assessment and the OECD key questions for evaluating AOP confidence. The postulated AOP is expected to serve as the basis for the development of new in vitro tests and the characterization of novel biomarkers of drug-induced cholestasis.
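
The linear structure described in this abstract (a molecular initiating event, a chain of key events, and an adverse outcome) lends itself to a simple directed representation. Below is a minimal sketch, assuming a hypothetical Python encoding of the cholestasis AOP summarized above; the class, field names, and event ordering are illustrative and are not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class AdverseOutcomePathway:
    # Hypothetical, simplified encoding: one MIE, ordered key events, one adverse outcome.
    molecular_initiating_event: str
    key_events: list[str]
    adverse_outcome: str

    def describe(self) -> str:
        steps = [self.molecular_initiating_event, *self.key_events, self.adverse_outcome]
        return " -> ".join(steps)

# Key events paraphrase the abstract; the ordering shown here is illustrative only.
cholestasis_aop = AdverseOutcomePathway(
    molecular_initiating_event="Drug-mediated inhibition of the bile salt export pump (BSEP)",
    key_events=[
        "Bile accumulation",
        "Induction of oxidative stress and inflammation",
        "Activation of specific nuclear receptors",
    ],
    adverse_outcome="Cholestatic liver injury",
)

print(cholestasis_aop.describe())
```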


Environmental Health Perspectives | 2015

Lessons from Toxicology: Developing a 21st-Century Paradigm for Medical Research

Gill Langley; Christopher P. Austin; Anil K. Balapure; Linda S. Birnbaum; John R. Bucher; Julia Fentem; Suzanne Fitzpatrick; John R. Fowle; Robert J. Kavlock; Hiroaki Kitano; Brett A. Lidbury; Alysson R. Muotri; Shuangqing Peng; D. A. Sakharov; Troy Seidle; Thales Trez; Alexander G. Tonevitsky; Anja van de Stolpe; Maurice Whelan; Catherine Willett

Biomedical developments in the 21st century provide an unprecedented opportunity to gain a dynamic systems-level and human-specific understanding of the causes and pathophysiologies of disease. This understanding is a vital need, in view of continuing failures in health research, drug discovery, and clinical translation. The full potential of advanced approaches may not be achieved within a 20th-century conceptual framework dominated by animal models. Novel technologies are being integrated into environmental health research and are also applicable to disease research, but these advances need a new medical research and drug discovery paradigm to gain maximal benefits. We suggest a new conceptual framework that repurposes the 21st-century transition underway in toxicology. Human disease should be conceived as resulting from integrated extrinsic and intrinsic causes, with research focused on modern human-specific models to understand disease pathways at multiple biological levels that are analogous to adverse outcome pathways in toxicology. Systems biology tools should be used to integrate and interpret data about disease causation and pathophysiology. Such an approach promises progress in overcoming the current roadblocks to understanding human disease and successful drug discovery and translation. A discourse should begin now to identify and consider the many challenges and questions that need to be solved.


Toxicological Sciences | 2011

Application of an Integrated Testing Strategy to the U.S. EPA Endocrine Disruptor Screening Program

Catherine Willett; Patricia L. Bishop; Kristie M. Sullivan

New approaches to generating and evaluating toxicity data for chemicals are needed to cope with the ever-increasing demands of new programs. One such approach involves the use of an integrated testing and evaluation strategy based on the specific properties and activities of a chemical. Such an integrated strategy, whether applied to existing or future programs, can promote efficient use of resources and save animals. We demonstrate the utility of such a strategy by applying it to the current U.S. Environmental Protection Agency Endocrine Disruptor Screening Program (EDSP). Launched in October 2009, the EDSP utilizes a two-tiered approach, whereby each tier requires a battery of animal-intensive and expensive tests. Tier 1 consists of five in vitro and six in vivo assays that are intended to determine a chemical's potential to interact with the estrogen (E), androgen (A), or thyroid (T) hormone pathways. Tier 2 is proposed to consist of multigenerational reproductive and developmental toxicity tests in several species and is intended to determine whether a chemical can cause adverse effects resulting from E, A, or T modulation. In contrast to the existing EDSP structure, we show, using the pesticide atrazine as an example, that a multilevel testing framework combined with an integrated evaluation process would significantly increase efficiency by minimizing testing.
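
The efficiency argument rests on gating expensive, animal-intensive testing behind cheaper, chemical-specific lines of evidence. The following is a minimal sketch of how such a multilevel screen-then-test decision flow might be encoded; the function, level names, and pass/fail logic are hypothetical illustrations, not the EDSP battery or the paper's actual framework.

```python
# Hypothetical multilevel screen: run inexpensive evidence levels first and only
# escalate to animal-intensive testing when earlier levels flag a concern.
from typing import Callable

Level = Callable[[str], bool]  # returns True if the level flags potential E/A/T activity

def multilevel_screen(chemical: str, levels: list[tuple[str, Level]]) -> str:
    for name, flags_concern in levels:
        if not flags_concern(chemical):
            return f"{chemical}: no concern at '{name}'; higher-tier testing not triggered"
    return f"{chemical}: concern persists through all levels; targeted in vivo testing indicated"

# Illustrative stand-ins for real evidence sources (not actual EDSP assays).
levels = [
    ("existing data / QSAR", lambda chem: True),       # structure alerts, literature
    ("in vitro receptor assays", lambda chem: True),   # e.g., receptor binding screens
    ("short-term in vivo screen", lambda chem: False),
]

print(multilevel_screen("atrazine-like example", levels))
```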


Drug Discovery Today | 2017

Towards a 21st-century roadmap for biomedical research and drug discovery: Consensus report and recommendations

Gillian R. Langley; Ian M. Adcock; François Busquet; Kevin M. Crofton; Elena Csernok; Christoph Giese; Tuula Heinonen; Kathrin Herrmann; Martin Hofmann-Apitius; Brigitte Landesmann; Lindsay J. Marshall; Emily McIvor; Alysson R. Muotri; Fozia Noor; Katrin Schutte; Troy Seidle; Anja van de Stolpe; Hilde Van Esch; Catherine Willett; Grzegorz Woszczek

Decades of costly failures in translating drug candidates from preclinical disease models to human therapeutic use warrant reconsideration of the priority placed on animal models in biomedical research. Following an international workshop attended by experts from academia, government institutions, research funding bodies, and the corporate and non-governmental organisation (NGO) sectors, in this consensus report, we analyse, as case studies, five disease areas with major unmet needs for new treatments. In view of the scientifically driven transition towards a human pathways-based paradigm in toxicology, a similar paradigm shift appears to be justified in biomedical research. There is a pressing need for an approach that strategically implements advanced, human biology-based models and tools to understand disease pathways at multiple biological scales. We present recommendations to help achieve this.


Environmental Toxicology and Chemistry | 2017

Advancing the adverse outcome pathway framework—An international horizon scanning approach

Carlie A. LaLone; Gerald Ankley; Scott E. Belanger; Michelle R. Embry; Geoff Hodges; Dries Knapen; Sharon Munn; Edward J. Perkins; Murray A. Rudd; Daniel L. Villeneuve; Maurice Whelan; Catherine Willett; Xiaowei Zhang; Markus Hecker

Our ability to conduct whole-organism toxicity tests to understand chemical safety has been outpaced by the synthesis of new chemicals for a wide variety of commercial applications. As a result, scientists and risk assessors are turning to mechanistically based studies to increase efficiencies in chemical risk assessment and making greater use of in vitro and in silico methods to evaluate potential environmental and human health hazards. In this context, the adverse outcome pathway (AOP) framework has gained traction in regulatory science because it offers an efficient and effective means for capturing available knowledge describing the linkage between mechanistic data and the apical toxicity end points required for regulatory assessments. A number of international activities have focused on AOP development and various applications to regulatory decision-making. These initiatives have prompted dialogue between research scientists and regulatory communities to consider how best to use the AOP framework. Although expert-facilitated discussions and AOP development have been critical in moving the science of AOPs forward, it was recognized that a survey of the broader scientific and regulatory communities would aid in identifying current limitations while guiding future initiatives for the AOP framework. To that end, a global horizon scanning exercise was conducted to solicit questions concerning the challenges or limitations that must be addressed to realize the full potential of the AOP framework in research and regulatory decision-making. The questions received fell into several broad topical areas: AOP networks, quantitative AOPs, collaboration on and communication of AOP knowledge, AOP discovery and development, chemical and cross-species extrapolation, exposure/toxicokinetics considerations, and AOP applications. Expert ranking was then used to prioritize questions for each category, where 4 broad themes emerged that could help inform and guide future AOP research and regulatory initiatives. In addition, frequently asked questions were identified and addressed by experts in the field. Answers to frequently asked questions will aid in addressing common misperceptions and will allow for clarification of AOP topics. The need for this type of clarification was highlighted with surprising frequency by our question submitters, indicating that improvements are needed in communicating the AOP framework among the scientific and regulatory communities. Overall, horizon scanning engaged the global scientific community to help identify key questions surrounding the AOP framework and guide the direction of future initiatives. Environ Toxicol Chem 2017;36:1411-1421.
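
As a rough illustration of the prioritization step described above (expert ranking of submitted questions within topical categories), here is a minimal sketch assuming hypothetical questions, categories, and expert scores; the scoring scheme is illustrative and is not the survey's actual methodology.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical (category, question, expert scores) records.
submissions = [
    ("AOP networks", "How should overlapping AOPs be combined into networks?", [4, 5, 4]),
    ("Quantitative AOPs", "What data are needed to quantify key event relationships?", [5, 5, 4]),
    ("Applications", "Which regulatory decisions need the least AOP confidence?", [3, 4, 4]),
    ("AOP networks", "How should uncertainty in an AOP network be communicated?", [3, 3, 4]),
]

# Rank questions within each category by mean expert score.
by_category = defaultdict(list)
for category, question, scores in submissions:
    by_category[category].append((mean(scores), question))

for category, ranked in by_category.items():
    ranked.sort(reverse=True)
    top_score, top_question = ranked[0]
    print(f"{category}: top-ranked ({top_score:.1f}) {top_question}")
```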


SAR and QSAR in Environmental Research | 2014

Building on a solid foundation: SAR and QSAR as a fundamental strategy to reduce animal testing

Kristie M. Sullivan; J.R. Manuppello; Catherine Willett

The development of more efficient, ethical, and effective means of assessing the effects of chemicals on human health and the environment was a lifetime goal of Gilman Veith. His work has provided the foundation for the use of chemical structure for informing toxicological assessment by regulatory agencies the world over. Veith’s scientific work influenced the early development of the SAR models in use today at the US Environmental Protection Agency. He was the driving force behind the Organisation for Economic Co-operation and Development QSAR Toolbox. Veith was one of a few early pioneers whose vision led to the linkage of chemical structure and biological activity as a means of predicting adverse apical outcomes (known as a mode of action, or an adverse outcome pathway approach), and he understood at an early stage the power that could be harnessed when combining computational and mechanistic biological approaches as a means of avoiding animal testing. Through the International QSAR Foundation he organized like-minded experts to develop non-animal methods and frameworks for the assessment of chemical hazard and risk for the benefit of public and environmental health. Avoiding animal testing was Gil’s passion, and his work helped to initiate the paradigm shift in toxicology that is now rendering this feasible.


Environmental Health Perspectives | 2008

Longer rodent bioassay fails to address 2-year bioassay's flaws.

Joseph R. Manuppello; Catherine Willett

In their commentary, Huff et al. (2008) proposed that exposing experimental animals to test substances in utero and for 30 months or until their natural deaths increases the sensitivity of bioassays, avoids false-negative results, and strengthens the value and validity of results. Instead, longer exposure results in increased numbers of spontaneously arising tumors, as well as increased cost and animal suffering, while failing to address the bioassay's fundamental flaws.

Although it is troubling when the bioassay produces false-negative results, a far more pervasive problem is that of false positives. In our analysis of > 500 National Toxicology Program (NTP) bioassays [People for the Ethical Treatment of Animals (PETA) 2006], we found that more than half of the substances evaluated (259) produced evidence of carcinogenicity in at least one group of animals, but only about one-third of these (89) were subsequently classified as known or probable human carcinogens by the NTP itself. Even fewer of the substances, 40 and 16, respectively, were classified as carcinogens by the U.S. Environmental Protection Agency (EPA) and the International Agency for Research on Cancer (PETA 2006). This high false-positive rate is thought to be largely an indirect effect of increased cell proliferation in response to cell injury and death caused by the near-toxic doses of test substances used in the bioassay (Gaylor 2005). Species-specific modes of action operating in rats or mice but not in humans, such as those mediated by α2u-globulin, peroxisomes, and thyroid-stimulating hormone, also contribute to the high rate of false positives (Cohen 2004).

Huff et al. (2008) asserted that one of the "well-accepted observations" upon which the "relevance of experimental bioassays to humans" rests is that "findings from independently conducted bioassays on the same chemicals are consistent." In fact, in a comparison of 121 bioassays from the NTP database with those in the published scientific literature, Gottmann et al. (2001) found that the studies produced consistent results only 57% of the time.

Huff et al. (2008) cited questions about the safety of aspartame raised by 3-year bioassays conducted by the Ramazzini Foundation to support their conclusions. Although they noted that the European Food Safety Authority (EFSA) and the U.S. Food and Drug Administration (FDA) dispute these studies' conclusions, the U.K. Food Standards Agency's Committee on Carcinogenicity (COC 2006) observed that, "In view of the inadequacies in design of the [Ramazzini Foundation] study and the use of rats with a high concurrent infection rate, the COC considered that no valid conclusions could be derived from it." Further, the COC noted that groups of animals fed aspartame had lower body weights and thus lived longer, which may have compromised the results by leading to an apparent increase in spontaneously arising tumors. Considering that lower body weights are typically observed among animals in the bioassay's experimental groups, this is likely to generally confound the interpretation of longer bioassays.

We must stress that animals suffer during the bioassay: they live in the barren, stressful conditions of the laboratory, often including daily forced feeding or inhalation, and many also suffer from exposure to near-toxic doses of test substances. These exposures often produce lethargy, anemia, diarrhea, weight loss, and other symptoms of sickness and distress. The proposal of Huff et al. (2008) to extend the length of the bioassay would obviously result in a proportional increase in this suffering.

Further, extending the bioassay runs counter to current trends in regulatory testing. Concern for the suffering of animals has caused regulatory agencies to review the usefulness of long-term studies, resulting in elimination of the 1-year dog toxicity test (U.S. EPA 2007) and an international effort to replace the two-generation reproductive toxicity test (Cooper et al. 2006). Huff et al.'s proposal thus clearly represents a step backward for toxicological science.

According to the NTP's own estimates, each bioassay requires 5 years to plan, conduct, and evaluate; 860 animals to be killed; and $2–4 million. As a result, the NTP has conducted an average of only 12 bioassays/year over the past several decades. Considering that humans are thought to be exposed to approximately 80,000 environmental toxicants (Ward et al. 2003), it would take more than 32 millennia, 68 million animals, and $160 billion to test them all at this rate. Once again, extending the length of the bioassay would only increase these already ridiculous numbers. The time has clearly come for antiquated animal tests such as the bioassay to be abandoned in favor of modern, human-relevant methods such as epidemiologic studies, high-throughput in vitro methods, and computational toxicology.
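
The closing figures hang together with simple arithmetic. Below is a short sketch, assuming the per-bioassay numbers quoted in the letter (the cost estimate uses the low end of the quoted $2–4 million range).

```python
# Back-of-the-envelope arithmetic using the figures quoted in the letter.
positives = 259        # NTP bioassays with evidence of carcinogenicity (of > 500 analyzed)
ntp_classified = 89    # later classified as known/probable human carcinogens by the NTP

print(f"Positives upheld by the NTP: {ntp_classified / positives:.0%}")  # ~34%, i.e., about one-third

chemicals = 80_000             # approximate number of environmental toxicants
animals_per_bioassay = 860
cost_per_bioassay = 2e6        # low end of the quoted $2-4 million range

print(f"Animals required: {chemicals * animals_per_bioassay / 1e6:.1f} million")  # ~68.8 million
print(f"Cost (low end): ${chemicals * cost_per_bioassay / 1e9:.0f} billion")      # ~$160 billion
```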


Environmental Health Perspectives | 2012

Animal use and lessons learned in the U.S. High Production Volume Chemicals Challenge Program.

Patricia L. Bishop; Joseph R. Manuppello; Catherine Willett; Jessica T. Sandler



Birth Defects Research Part B: Developmental and Reproductive Toxicology | 2014

The Use and Acceptance of Other Scientifically Relevant Information (OSRI) in the U.S. Environmental Protection Agency (EPA) Endocrine Disruptor Screening Program

Patricia L. Bishop; Catherine Willett



Science of the Total Environment | 2018

Harvesting the promise of AOPs: An assessment and recommendations

Annamaria Carusi; Mark Davies; Giovanni De Grandis; Beate I. Escher; Geoff Hodges; Kenneth M.Y. Leung; Maurice Whelan; Catherine Willett; Gerald T. Ankley


Collaboration


Dive into Catherine Willett's collaborations.

Top Co-Authors

Suzanne Fitzpatrick (Food and Drug Administration)
Kevin M. Crofton (United States Environmental Protection Agency)
Akhilesh Pandey (Johns Hopkins University School of Medicine)
Ben Gordon (Massachusetts Institute of Technology)
David Gerhold (National Institutes of Health)
Geoffrey W. Patton (Center for Food Safety and Applied Nutrition)