
Publication


Featured research published by Kerry L. Dearfield.


Environmental and Molecular Mutagenesis | 2013

Quantitative approaches for assessing dose–response relationships in genetic toxicology studies

B. Bhaskar Gollapudi; George E. Johnson; Lya G. Hernández; Lynn H. Pottenger; Kerry L. Dearfield; Alan M. Jeffrey; E. Julien; James H. Kim; David P. Lovell; James T. MacGregor; Martha M. Moore; J. van Benthem; Paul A. White; Errol Zeiger; Véronique Thybaud

Genetic toxicology studies are required for the safety assessment of chemicals. Data from these studies have historically been interpreted in a qualitative, dichotomous “yes” or “no” manner without analysis of dose–response relationships. This article is based upon the work of an international multi‐sector group that examined how quantitative dose–response relationships for in vitro and in vivo genetic toxicology data might be used to improve human risk assessment. The group examined three quantitative approaches for analyzing dose–response curves and deriving point‐of‐departure (POD) metrics (i.e., the no‐observed‐genotoxic‐effect‐level (NOGEL), the threshold effect level (Td), and the benchmark dose (BMD)), using data for the induction of micronuclei and gene mutations by methyl methanesulfonate or ethyl methanesulfonate in vitro and in vivo. These results suggest that the POD descriptors obtained using the different approaches are within the same order of magnitude, with more variability observed for the in vivo assays. The different approaches were found to be complementary as each has advantages and limitations. The results further indicate that the lower confidence limit of a benchmark response rate of 10% (BMDL10) could be considered a satisfactory POD when analyzing genotoxicity data using the BMD approach. The models described permit the identification of POD values that could be combined with mode of action analysis to determine whether exposure(s) below a particular level constitutes a significant human risk. Subsequent analyses will expand the number of substances and endpoints investigated, and continue to evaluate the utility of quantitative approaches for analysis of genetic toxicity dose–response data. Environ. Mol. Mutagen., 2013.
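The benchmark-dose (BMD) calculation central to this abstract can be sketched numerically. The following is a minimal illustration only, not the working group's method: the micronucleus data, the exponential dose-response model, and the Wald-style lower bound are all assumptions chosen for demonstration (dedicated tools such as EPA BMDS or PROAST fit multiple model families and use profile-likelihood confidence limits).

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical micronucleus frequencies (per 1000 cells) at five doses (mg/kg).
doses = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
resp  = np.array([2.1, 2.3, 2.8, 3.9, 7.6])

# Simple exponential dose-response model: f(d) = a * exp(b * d).
def model(d, a, b):
    return a * np.exp(b * d)

(a, b), cov = curve_fit(model, doses, resp, p0=(2.0, 0.03))

# BMD10: dose giving a 10% increase over the modeled background, i.e.
# a * exp(b * BMD10) = 1.1 * a  =>  BMD10 = ln(1.1) / b.
bmd10 = np.log(1.1) / b

# Crude BMDL10: a one-sided 95% lower bound on BMD10 obtained from the
# standard error of b (illustrative shortcut, not a profile-likelihood limit).
b_se = np.sqrt(cov[1, 1])
bmdl10 = np.log(1.1) / (b + 1.645 * b_se)

print(f"BMD10  = {bmd10:.2f} mg/kg")
print(f"BMDL10 = {bmdl10:.2f} mg/kg")
```

As the abstract notes, it is the lower confidence limit (BMDL10), not the central BMD10 estimate, that serves as the point of departure.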


Environmental and Molecular Mutagenesis | 2014

Derivation of point of departure (PoD) estimates in genetic toxicology studies and their potential applications in risk assessment

George E. Johnson; Lya G. Soeteman-Hernández; B. Bhaskar Gollapudi; Owen Bodger; Kerry L. Dearfield; Robert H. Heflich; J.G. Hixon; David P. Lovell; James T. MacGregor; Lynn H. Pottenger; C.M. Thompson; L. Abraham; Véronique Thybaud; Jennifer Y. Tanir; Errol Zeiger; J. van Benthem; Paul A. White

Genetic toxicology data have traditionally been employed for qualitative, rather than quantitative evaluations of hazard. As a continuation of our earlier report that analyzed ethyl methanesulfonate (EMS) and methyl methanesulfonate (MMS) dose–response data (Gollapudi et al., 2013), here we present analyses of 1‐ethyl‐1‐nitrosourea (ENU) and 1‐methyl‐1‐nitrosourea (MNU) dose–response data and additional approaches for the determination of genetic toxicity point‐of‐departure (PoD) metrics. We previously described methods to determine the no‐observed‐genotoxic‐effect‐level (NOGEL), the breakpoint‐dose (BPD; previously named Td), and the benchmark dose (BMD10) for genetic toxicity endpoints. In this study we employed those methods, along with a new approach, to determine the non‐linear slope‐transition‐dose (STD), and alternative methods to determine the BPD and BMD, for the analyses of nine ENU and 22 MNU datasets across a range of in vitro and in vivo endpoints. The NOGEL, BMDL10 and BMDL1SD PoD metrics could be readily calculated for most gene mutation and chromosomal damage studies; however, BPDs and STDs could not always be derived due to data limitations and constraints of the underlying statistical methods. The BMDL10 values were often lower than the other PoDs, and the distribution of BMDL10 values produced the lowest median PoD. Our observations indicate that, among the methods investigated in this study, the BMD approach is the preferred PoD for quantitatively describing genetic toxicology data. Once genetic toxicology PoDs are calculated via this approach, they can be used to derive reference doses and margin of exposure values that may be useful for evaluating human risk and regulatory decision making. Environ. Mol. Mutagen. 55:609–623, 2014.
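Once a PoD such as a BMDL10 is in hand, the reference-dose and margin-of-exposure arithmetic mentioned at the end of this abstract is straightforward. All numbers below are hypothetical, chosen only to show the calculation:

```python
# Hypothetical PoD: a BMDL10 of 1.5 mg/kg-bw/day from a genetic toxicity
# study, against an estimated human exposure of 0.002 mg/kg-bw/day.
bmdl10 = 1.5
exposure = 0.002

# Margin of exposure: how far the PoD sits above the exposure estimate.
moe = bmdl10 / exposure          # 750

# Reference dose: PoD divided by composite uncertainty factors, e.g.
# 10x interspecies and 10x intraspecies (assumed here for illustration).
rfd = bmdl10 / (10 * 10)         # 0.015 mg/kg-bw/day

print(f"MOE = {moe:.0f}")
print(f"RfD = {rfd:.3f} mg/kg-bw/day")
```

Which uncertainty factors apply, and what MOE is considered adequate, are regulatory judgments that depend on the endpoint and the exposure scenario.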


Environmental and Molecular Mutagenesis | 2011

Follow-Up Actions from Positive Results of In Vitro Genetic Toxicity Testing

Kerry L. Dearfield; Véronique Thybaud; Michael C. Cimino; Laura Custer; Andreas Czich; James Harvey; Susan D. Hester; James H. Kim; David Kirkland; Dan D. Levy; Elisabeth Lorge; Martha M. Moore; Gladys Ouédraogo-Arras; Maik Schuler; Willi Suter; Kevin Sweder; Kirk Tarlo; Jan van Benthem; Freddy Van Goethem; Kristine L. Witt

Appropriate follow‐up actions and decisions are needed when evaluating and interpreting clear positive results obtained in the in vitro assays used in the initial genotoxicity screening battery (i.e., the battery of tests generally required by regulatory authorities) to assist in overall risk‐based decision making concerning the potential effects of human exposure to the agent under test. Over the past few years, the International Life Sciences Institute (ILSI) Health and Environmental Sciences Institute (HESI) Project Committee on the Relevance and Follow‐up of Positive Results in In Vitro Genetic Toxicity (IVGT) Testing developed a decision process flow chart to be applied in case of clear positive results in vitro. It provides for a variety of different possibilities and allows flexibility in choosing follow‐up action(s), depending on the results obtained in the initial battery of assays and available information. The intent of the Review Subgroup was not to provide a prescriptive testing strategy, but rather to reinforce the concept of weighing the totality of the evidence. The Review Subgroup of the IVGT committee highlighted the importance of properly analyzing the existing data, and considering potential confounding factors (e.g., possible interactions with the test systems, presence of impurities, irrelevant metabolism), and chemical modes of action when analyzing and interpreting positive results in the in vitro genotoxicity assays and determining appropriate follow‐up testing. The Review Subgroup also examined the characteristics, strengths, and limitations of each of the existing in vitro and in vivo genotoxicity assays to determine their usefulness in any follow‐up testing. Environ. Mol. Mutagen., 2011.


Nature Biotechnology | 2006

A framework for the use of genomics data at the EPA

David J. Dix; Kathryn Gallagher; William H. Benson; Brenda L Groskinsky; J. Thomas McClintock; Kerry L. Dearfield; William H Farland

The US Environmental Protection Agency is developing new guidance that outlines best practices in the submission, quality assurance, analysis, and management of genomics data for environmental applications.


Mutation Research-genetic Toxicology and Environmental Mutagenesis | 2015

Approaches for identifying germ cell mutagens: Report of the 2013 IWGT workshop on germ cell assays

Carole L. Yauk; Marilyn J. Aardema; Jan van Benthem; Jack B. Bishop; Kerry L. Dearfield; David M. DeMarini; Yuri E. Dubrova; Masamitsu Honma; James R. Lupski; Francesco Marchetti; Marvin L. Meistrich; Francesca Pacchierotti; Jane Stewart; Michael D. Waters; George R. Douglas

This workshop reviewed the current science to inform and recommend the best evidence-based approaches on the use of germ cell genotoxicity tests. The workshop questions and key outcomes were as follows. (1) Do genotoxicity and mutagenicity assays in somatic cells predict germ cell effects? Limited data suggest that somatic cell tests detect most germ cell mutagens, but there are strong concerns that dictate caution in drawing conclusions. (2) Should germ cell tests be done, and when? If there is evidence that a chemical or its metabolite(s) will not reach target germ cells or gonadal tissue, it is not necessary to conduct germ cell tests, notwithstanding somatic outcomes. However, it was recommended that negative somatic cell mutagens with clear evidence for gonadal exposure and evidence of toxicity in germ cells could be considered for germ cell mutagenicity testing. For somatic mutagens that are known to reach the gonadal compartments and expose germ cells, the chemical could be assumed to be a germ cell mutagen without further testing. Nevertheless, germ cell mutagenicity testing would be needed for quantitative risk assessment. (3) What new assays should be implemented and how? There is an immediate need for research on the application of whole genome sequencing in heritable mutation analysis in humans and animals, and integration of germ cell assays with somatic cell genotoxicity tests. Focus should be on environmental exposures that can cause de novo mutations, particularly newly recognized types of genomic changes. Mutational events, which may occur by exposure of germ cells during embryonic development, should also be investigated. Finally, where there are indications of germ cell toxicity in repeat dose or reproductive toxicology tests, consideration should be given to leveraging those studies to inform of possible germ cell genotoxicity.


Mutation Research-genetic Toxicology and Environmental Mutagenesis | 2011

Compilation and use of genetic toxicity historical control data.

Makoto Hayashi; Kerry L. Dearfield; Peter Kasper; David P. Lovell; Hans-Joerg Martus; Véronique Thybaud

The optimal use of historical control data for the interpretation of genotoxicity results was discussed at the 2009 International Workshop on Genotoxicity Testing (IWGT) in Basel, Switzerland. The historical control working group focused mainly on negative control data, although positive control data were also considered important. Historical control data are typically compared with the concurrent control data as part of the assay acceptance criteria. They are also important evidence of a laboratory's technical competence with, and familiarity with, a given assay. Moreover, historical control data are increasingly being used to aid in the interpretation of genetic toxicity assay results. The objective of the working group was to provide generic advice on historical control data that could be applied to all assays, rather than assay-specific recommendations. In brief, the recommendations include:


Environmental and Molecular Mutagenesis | 2008

An evaluation of the mode of action framework for mutagenic carcinogens case study: Cyclophosphamide

Nancy McCarroll; Nagalakshmi Keshava; Michael C. Cimino; Margaret Chu; Kerry L. Dearfield; Channa Keshava; Andrew D. Kligerman; Russell D. Owen; Alberto Protzel; Resha Putzrath; Rita Schoeny

In response to the 2005 revised US Environmental Protection Agency (EPA) Cancer Guidelines, a Risk Assessment Forum Technical Panel has devised a strategy in which genetic toxicology data combined with other information are assessed to determine whether a carcinogen operates through a mutagenic mode of action (MOA). This information is necessary for EPA to decide whether age‐dependent adjustment factors (ADAFs) should be applied to the cancer risk assessment. A decision tree has been developed as a part of this approach and outlines the critical steps for analyzing a compound for carcinogenicity through a mutagenic MOA (e.g., data analysis, determination of mutagenicity in animals and in humans). Agents showing mutagenicity in animals and humans proceed through the Agency's framework analysis for MOAs. Cyclophosphamide (CP), an antineoplastic agent, which is carcinogenic in animals and humans and mutagenic in vitro and in vivo, was selected as a case study to illustrate how the framework analysis would be applied to prove that a carcinogen operates through a mutagenic MOA. Consistent positive results have been seen for mutagenic activity in numerous in vitro assays, in animals (mice, rats, and hamsters) and in humans. Accordingly, CP was processed through the framework analysis and key steps leading to tumor formation were identified as follows: metabolism of the parent compound to alkylating metabolites, DNA damage followed by induction of multiple adverse genetic events, cell proliferation, and bladder tumors. Genetic changes in rats (sister chromatid exchanges at 0.62 mg/kg) can commence within 30 min, and in cancer patients chromosome aberrations at 35 mg/kg are seen by 1 hr, well within the timeframe and tumorigenic dose range for early events. Supporting evidence is also found for cell proliferation, indicating that mutagenicity, associated with cytotoxicity, leads to a proliferative response, which occurs early (48 hr) in the process of tumor induction.
Overall, the weight of evidence evaluation supports CP acting through a mutagenic MOA. In addition, no data were found that an alternative MOA might be operative. Therefore, the cancer guidelines recommend a linear extrapolation for the risk assessment. Additionally, data exist showing that CP induces mutagenicity in fetal blood and in the peripheral blood of pediatric patients; thus, the ADAFs would be applied. Environ. Mol. Mutagen., 2008. Published 2008 Wiley‐Liss, Inc.


Mutation Research-genetic Toxicology and Environmental Mutagenesis | 2011

Strategies in case of positive in vivo results in genotoxicity testing

Véronique Thybaud; James T. MacGregor; Lutz Müller; Riccardo Crebelli; Kerry L. Dearfield; George R. Douglas; Peter B. Farmer; Elmar Gocke; Makoto Hayashi; David P. Lovell; Werner K. Lutz; Daniel Marzin; Martha M. Moore; Takehiko Nohmi; David H. Phillips; Jan van Benthem

At the 2009 International Workshop on Genotoxicity Testing in Basel, an expert group gathered to provide guidance on suitable follow-up tests to describe risk when basic in vivo genotoxicity tests have yielded positive results. The working group agreed that non-linear dose-response curves occur in vivo with at least some DNA-reactive agents. Quantitative risk assessment in such cases requires the use of (1) adequate data, i.e., the use of all available data for the selection of reliable in vivo models to be used for quantitative risk assessment, (2) appropriate mathematical models and statistical analysis for characterizing the dose-response relationships and allowing the use of quantitative and dose-response information in the interpretation of results, (3) mode of action (MOA) information for the evaluation and analysis of risk, and (4) reliable assessments of the internal dose across species for deriving acceptable margins of exposure and risk levels. Hence, the elucidation of MOA and understanding of the mechanism underlying the dose-response curve are important components of risk assessment. The group agreed on the need for (i) the development of in vivo assays, especially multi-endpoint, multi-species assays, with emphasis on those applicable to humans, and (ii) consensus about the most appropriate mathematical models and statistical analyses for defining non-linear dose-responses and exposure levels associated with acceptable risk.


Environmental and Molecular Mutagenesis | 2017

Next generation testing strategy for assessment of genomic damage: A conceptual framework and considerations

Kerry L. Dearfield; B. Bhaskar Gollapudi; Jeffrey C. Bemis; R. Daniel Benz; George R. Douglas; Rosalie K. Elespuru; George E. Johnson; David Kirkland; Matthew J. LeBaron; Albert P. Li; Francesco Marchetti; Lynn H. Pottenger; Emiel Rorije; Jennifer Y. Tanir; Véronique Thybaud; Jan van Benthem; Carole L. Yauk; Errol Zeiger; Mirjam Luijten

For several decades, regulatory testing schemes for genetic damage have been standardized where the tests being utilized examined mutations and structural and numerical chromosomal damage. This has served the genetic toxicity community well when most of the substances being tested were amenable to such assays. The outcome from this testing is usually a dichotomous (yes/no) evaluation of test results, and in many instances, the information is only used to determine whether a substance has carcinogenic potential or not. Over the same time period, mechanisms and modes of action (MOAs) that elucidate a wider range of genomic damage involved in many adverse health outcomes have been recognized. In addition, a paradigm shift in applied genetic toxicology is moving the field toward a more quantitative dose‐response analysis and point‐of‐departure (PoD) determination with a focus on risks to exposed humans. This is directing emphasis on genomic damage that is likely to induce changes associated with a variety of adverse health outcomes. This paradigm shift is moving the testing emphasis for genetic damage from a hazard identification only evaluation to a more comprehensive risk assessment approach that provides more insightful information for decision makers regarding the potential risk of genetic damage to exposed humans. To enable this broader context for examining genetic damage, a next generation testing strategy needs to take into account a broader, more flexible approach to testing, and ultimately modeling, of genomic damage as it relates to human exposure. This is consistent with the larger risk assessment context being used in regulatory decision making. As presented here, this flexible approach for examining genomic damage focuses on testing for relevant genomic effects that can be, as best as possible, associated with an adverse health effect. 
The most desired linkage for risk to humans would be changes in loci associated with human diseases, whether in somatic or germ cells. The outline of a flexible approach and associated considerations are presented in a series of nine steps, some of which can occur in parallel, which was developed through a collaborative effort by leading genetic toxicologists from academia, government, and industry through the International Life Sciences Institute (ILSI) Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC). The ultimate goal is to provide quantitative data to model the potential risk levels of substances, which induce genomic damage contributing to human adverse health outcomes. Any good risk assessment begins with asking the appropriate risk management questions in a planning and scoping effort. This step sets up the problem to be addressed (e.g., broadly, does genomic damage need to be addressed, and if so, how to proceed). The next two steps assemble what is known about the problem by building a knowledge base about the substance of concern and developing a rational biological argument for why testing for genomic damage is needed or not. By focusing on the risk management problem and potential genomic damage of concern, the next step of assay(s) selection takes place. The work‐up of the problem during the earlier steps provides the insight to which assays would most likely produce the most meaningful data. This discussion does not detail the wide range of genomic damage tests available, but points to types of testing systems that can be very useful. Once the assays are performed and analyzed, the relevant data sets are selected for modeling potential risk. From this point on, the data are evaluated and modeled as they are for any other toxicology endpoint. Any observed genomic damage/effects (or genetic event(s)) can be modeled via a dose‐response analysis and determination of an estimated PoD. 
When a quantitative risk analysis is needed for decision making, a parallel exposure assessment effort is performed (exposure assessment is not detailed here as this is not the focus of this discussion; guidelines for this assessment exist elsewhere). Then the PoD for genomic damage is used with the exposure information to develop risk estimations (e.g., using reference dose (RfD), margin of exposure (MOE) approaches) in a risk characterization and presented to risk managers for informing decision making. This approach is applicable now for incorporating genomic damage results into the decision‐making process for assessing potential adverse outcomes in chemically exposed humans and is consistent with the ILSI HESI Risk Assessment in the 21st Century (RISK21) roadmap. This applies to any substance to which humans are exposed, including pharmaceuticals, agricultural products, food additives, and other chemicals. It is time for regulatory bodies to incorporate the broader knowledge and insights provided by genomic damage results into the assessments of risk to more fully understand the potential of adverse outcomes in chemically exposed humans, thus improving the assessment of risk due to genomic damage. The historical use of genomic damage data as a yes/no gateway for possible cancer risk has been too narrowly focused in risk assessment. The recent advances in assaying for and understanding genomic damage, including eventually epigenetic alterations, obviously add a greater wealth of information for determining potential risk to humans. Regulatory bodies need to embrace this paradigm shift from hazard identification to quantitative analysis and to incorporate the wider range of genomic damage in their assessments of risk to humans. The quantitative analyses and methodologies discussed here can be readily applied to genomic damage testing results now. 
Indeed, with the passage of the recent update to the Toxic Substances Control Act (TSCA) in the US, the new generation testing strategy for genomic damage described here provides a regulatory agency (here the US Environmental Protection Agency (EPA), but suitable for others) a golden opportunity to reexamine the way it addresses risk‐based genomic damage testing (including hazard identification and exposure). Environ. Mol. Mutagen. 58:264–283, 2017.


International Journal of Food Microbiology | 2013

Characterizing uncertainty when evaluating risk management metrics: Risk assessment modeling of Listeria monocytogenes contamination in ready-to-eat deli meats

Daniel L. Gallagher; Eric D. Ebel; Owen Gallagher; David LaBARRE; Michael S. Williams; Neal J. Golden; Régis Pouillot; Kerry L. Dearfield; Janell Kause

This report illustrates how the uncertainty about food safety metrics may influence the selection of a performance objective (PO). To accomplish this goal, we developed a model concerning Listeria monocytogenes in ready-to-eat (RTE) deli meats. This application used a second order Monte Carlo model that simulates L. monocytogenes concentrations through a series of steps: the food-processing establishment, transport, retail, the consumer's home, and consumption. The model accounted for growth inhibitor use and retail cross contamination, and applied an FAO/WHO dose response model for evaluating the probability of illness. An appropriate level of protection (ALOP) risk metric was selected as the average risk of illness per serving across all consumed servings-per-annum, and the model was used to solve for the corresponding performance objective (PO) risk metric as the maximum allowable L. monocytogenes concentration (cfu/g) at the processing establishment where regulatory monitoring would occur. Given uncertainty about model inputs, an uncertainty distribution of the PO was estimated. Additionally, we considered how RTE deli meats contaminated at levels above the PO would be handled by the industry using three alternative approaches. Points on the PO distribution represent the probability that, if the industry complies with a particular PO, the resulting risk-per-serving is less than or equal to the target ALOP. For example, assuming (1) a target ALOP of -6.41 log10 risk of illness per serving, (2) industry concentrations above the PO that are re-distributed throughout the remaining concentration distribution and (3) no dose response uncertainty, establishment POs of -4.98 and -4.39 log10 cfu/g would be required for 90% and 75% confidence that the target ALOP is met, respectively.
The PO concentrations from this example scenario are more stringent than the current typical monitoring level of an absence in 25 g (i.e., -1.40 log10 cfu/g) or a stricter criterion of absence in 125 g (i.e., -2.1 log10 cfu/g). This example, and others, demonstrates that a PO for L. monocytogenes would be far below any current monitoring capabilities. Furthermore, this work highlights the demands placed on risk managers and risk assessors when applying uncertain risk models to the current risk metric framework.
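The log10 unit conversions and the per-serving risk metric in this abstract can be sketched with a toy first-order Monte Carlo simulation. Everything below is illustrative: the concentration distribution, serving size, and dose-response slope r are invented for demonstration and are not the paper's second-order model or the FAO/WHO parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Monitoring detection level expressed as log10 cfu/g, as in the text:
# "absence in 25 g" corresponds to 1 cfu / 25 g.
lod_25g = np.log10(1 / 25)                   # about -1.40 log10 cfu/g

# Toy first-order Monte Carlo: sample log10 concentration at consumption
# (assumed normal on the log10 scale) and convert to a dose per serving.
n = 100_000
log10_conc = rng.normal(-4.0, 1.0, n)        # assumed distribution, log10 cfu/g
serving_g = 50.0                             # assumed serving size, g
dose = (10.0 ** log10_conc) * serving_g      # cfu per serving

# Exponential dose-response: P(ill) = 1 - exp(-r * dose); r is an assumed
# per-cfu risk slope for illustration, not the FAO/WHO value.
r = 1e-12
p_ill = 1.0 - np.exp(-r * dose)

# ALOP-style metric: mean risk of illness per serving, in log10 units.
log10_risk_per_serving = np.log10(p_ill.mean())
print(f"absence in 25 g   = {lod_25g:.2f} log10 cfu/g")
print(f"mean risk/serving = {log10_risk_per_serving:.2f} log10")
```

Solving for a PO, as the paper does, would invert this calculation: search for the establishment concentration limit whose downstream risk-per-serving just meets the target ALOP, then propagate input uncertainty (the second-order layer) to get a distribution of candidate POs.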

Collaboration


Dive into Kerry L. Dearfield's collaborations.

Top Co-Authors

Errol Zeiger

National Institutes of Health

Martha M. Moore

National Center for Toxicological Research

Jan van Benthem

Centre for Health Protection

Michael C. Cimino

United States Environmental Protection Agency
