Lynn H. Pottenger
Dow Chemical Company
Publications
Featured research published by Lynn H. Pottenger.
Environmental and Molecular Mutagenesis | 2013
B. Bhaskar Gollapudi; George E. Johnson; Lya G. Hernández; Lynn H. Pottenger; Kerry L. Dearfield; Alan M. Jeffrey; E. Julien; James H. Kim; David P. Lovell; James T. MacGregor; Martha M. Moore; J. van Benthem; Paul A. White; Errol Zeiger; Véronique Thybaud
Genetic toxicology studies are required for the safety assessment of chemicals. Data from these studies have historically been interpreted in a qualitative, dichotomous “yes” or “no” manner without analysis of dose–response relationships. This article is based upon the work of an international multi‐sector group that examined how quantitative dose–response relationships for in vitro and in vivo genetic toxicology data might be used to improve human risk assessment. The group examined three quantitative approaches for analyzing dose–response curves and deriving point‐of‐departure (POD) metrics (i.e., the no‐observed‐genotoxic‐effect‐level (NOGEL), the threshold effect level (Td), and the benchmark dose (BMD)), using data for the induction of micronuclei and gene mutations by methyl methanesulfonate or ethyl methanesulfonate in vitro and in vivo. These results suggest that the POD descriptors obtained using the different approaches are within the same order of magnitude, with more variability observed for the in vivo assays. The different approaches were found to be complementary as each has advantages and limitations. The results further indicate that the lower confidence limit of a benchmark response rate of 10% (BMDL10) could be considered a satisfactory POD when analyzing genotoxicity data using the BMD approach. The models described permit the identification of POD values that could be combined with mode of action analysis to determine whether exposure(s) below a particular level constitutes a significant human risk. Subsequent analyses will expand the number of substances and endpoints investigated, and continue to evaluate the utility of quantitative approaches for analysis of genetic toxicity dose–response data. Environ. Mol. Mutagen., 2013.
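As a concrete illustration of the BMD approach described above, the sketch below fits a simple exponential model to invented mutant-frequency data and derives a BMD10 (the dose producing a 10% increase over background) plus a bootstrap lower confidence bound (BMDL10). All doses, responses, and the model choice are assumptions for illustration only; published analyses typically rely on dedicated tools such as RIVM PROAST or EPA BMDS.

```python
# Minimal BMD/BMDL sketch with invented data; not the modeling from the paper.
import numpy as np
from scipy.optimize import curve_fit, brentq

doses = np.array([0.0, 1.25, 2.5, 5.0, 10.0, 20.0])        # invented doses
responses = np.array([50, 54, 60, 75, 110, 190], float)    # invented mutant frequencies

def exp_model(d, a, b):
    """Exponential dose-response: background a, growth rate b."""
    return a * np.exp(b * d)

def bmd(params, bmr=0.10):
    """Dose producing a `bmr` relative increase over background."""
    a, b = params
    return brentq(lambda d: exp_model(d, a, b) - a * (1 + bmr), 1e-9, doses.max())

params, _ = curve_fit(exp_model, doses, responses, p0=(responses[0], 0.05))
bmd10 = bmd(params)

# Residual bootstrap for a one-sided 95% lower confidence limit (BMDL10).
rng = np.random.default_rng(0)
resid = responses - exp_model(doses, *params)
boot = []
for _ in range(2000):
    y = exp_model(doses, *params) + rng.choice(resid, size=resid.size, replace=True)
    try:
        p, _ = curve_fit(exp_model, doses, y, p0=params)
        boot.append(bmd(p))
    except (RuntimeError, ValueError):
        continue  # skip non-converged or out-of-range refits
bmdl10 = np.percentile(boot, 5)

print(f"BMD10 = {bmd10:.2f}, BMDL10 = {bmdl10:.2f}")
```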
Environmental and Molecular Mutagenesis | 2014
George E. Johnson; Lya G. Soeteman-Hernández; B. Bhaskar Gollapudi; Owen Bodger; Kerry L. Dearfield; Robert H. Heflich; J.G. Hixon; David P. Lovell; James T. MacGregor; Lynn H. Pottenger; C.M. Thompson; L. Abraham; Véronique Thybaud; Jennifer Y. Tanir; Errol Zeiger; J. van Benthem; Paul A. White
Genetic toxicology data have traditionally been employed for qualitative, rather than quantitative evaluations of hazard. As a continuation of our earlier report that analyzed ethyl methanesulfonate (EMS) and methyl methanesulfonate (MMS) dose–response data (Gollapudi et al., 2013), here we present analyses of 1‐ethyl‐1‐nitrosourea (ENU) and 1‐methyl‐1‐nitrosourea (MNU) dose–response data and additional approaches for the determination of genetic toxicity point‐of‐departure (PoD) metrics. We previously described methods to determine the no‐observed‐genotoxic‐effect‐level (NOGEL), the breakpoint‐dose (BPD; previously named Td), and the benchmark dose (BMD10) for genetic toxicity endpoints. In this study we employed those methods, along with a new approach, to determine the non‐linear slope‐transition‐dose (STD), and alternative methods to determine the BPD and BMD, for the analyses of nine ENU and 22 MNU datasets across a range of in vitro and in vivo endpoints. The NOGEL, BMDL10 and BMDL1SD PoD metrics could be readily calculated for most gene mutation and chromosomal damage studies; however, BPDs and STDs could not always be derived due to data limitations and constraints of the underlying statistical methods. The BMDL10 values were often lower than the other PoDs, and the distribution of BMDL10 values produced the lowest median PoD. Our observations indicate that, among the methods investigated in this study, the BMD approach is the preferred PoD for quantitatively describing genetic toxicology data. Once genetic toxicology PoDs are calculated via this approach, they can be used to derive reference doses and margin of exposure values that may be useful for evaluating human risk and regulatory decision making. Environ. Mol. Mutagen. 55:609–623, 2014.
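To make the NOGEL metric named above concrete, here is an illustrative sketch of identifying the highest tested dose whose response is not statistically distinguishable from control, judged with Dunnett's test. The replicate data are invented, scipy.stats.dunnett requires SciPy 1.11 or later, and the authors' actual statistical workflow may differ.

```python
# NOGEL sketch: highest dose with no significant increase over control.
import numpy as np
from scipy.stats import dunnett

control = np.array([48, 52, 50, 49, 51], float)
treated = {                                    # dose -> replicate responses (invented)
    0.69: np.array([50, 51, 49, 52, 50], float),
    3.45: np.array([53, 55, 54, 56, 52], float),
    10.0: np.array([70, 74, 68, 72, 75], float),
}

res = dunnett(*treated.values(), control=control, alternative='greater')

nogel = None
for dose, p in zip(treated, res.pvalue):       # doses in ascending order
    if p < 0.05:
        break                                  # first significant dose ends the search
    nogel = dose
print(f"NOGEL = {nogel}")
```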
Critical Reviews in Toxicology | 2009
Annie M. Jarabek; Lynn H. Pottenger; Larry S. Andrews; Daniel A. Casciano; Michelle R. Embry; James H. Kim; R. Julian Preston; M. Vijayaraj Reddy; Rita Schoeny; David E. G. Shuker; Julie A. Skare; James A. Swenberg; Gary M. Williams; Errol Zeiger
The assessment of human cancer risk from chemical exposure requires the integration of diverse types of data. Such data involve effects at the cell and tissue levels. This report focuses on the specific utility of one type of data, namely DNA adducts. Emphasis is placed on the appreciation that such DNA adduct data cannot be used in isolation in the risk assessment process but must be used in an integrated fashion with other information. As emerging technologies provide even more sensitive quantitative measurements of DNA adducts, integration that establishes links between DNA adducts and accepted outcome measures becomes critical for risk assessment. The present report proposes an organizational approach for the assessment of DNA adduct data (e.g., type of adduct, frequency, persistence, type of repair process) in concert with other relevant data, such as dosimetry, toxicity, mutagenicity, genotoxicity, and tumor incidence, to inform characterization of the mode of action. DNA adducts are considered biomarkers of exposure, whereas gene mutations and chromosomal alterations are often biomarkers of early biological effects and also can be bioindicators of the carcinogenic process.
Critical Reviews in Toxicology | 2013
Lorenz R. Rhomberg; Julie E. Goodman; Lisa A. Bailey; Robyn L. Prueitt; Nancy B. Beck; Christopher Bevan; Michael Honeycutt; Norbert E. Kaminski; Greg Paoli; Lynn H. Pottenger; Roberta W. Scherer; Kimberly Wise; Richard A. Becker
Abstract The National Academy of Sciences (NAS) Review of the Environmental Protection Agency’s Draft IRIS Assessment of Formaldehyde proposed a “roadmap” for reform and improvement of the Agency’s risk assessment process. Specifically, it called for development of a transparent and defensible methodology for weight-of-evidence (WoE) assessments. To facilitate development of an improved process, we developed a white paper that reviewed approximately 50 existing WoE frameworks, seeking insights from their variations and nominating best practices for WoE analyses of causation of chemical risks. Four phases of WoE analysis were identified and evaluated in each framework: (1) defining the causal question and developing criteria for study selection, (2) developing and applying criteria for review of individual studies, (3) evaluating and integrating evidence and (4) drawing conclusions based on inferences. We circulated the draft white paper to stakeholders and then held a facilitated, multi-disciplinary invited stakeholder workshop to broaden and deepen the discussion on methods, rationales, utility and limitations among the surveyed WoE frameworks. The workshop developed recommendations for improving the conduct of WoE evaluations. Based on the analysis of the 50 frameworks and discussions at the workshop, best practices in conducting WoE analyses were identified for each of the four phases. Many of these best practices noted from the analysis and workshop could be implemented immediately, while others may require additional refinement as part of the ongoing discussions for improving the scientific basis of chemical risk assessments.
Mutation Research-genetic Toxicology and Environmental Mutagenesis | 2009
Lynn H. Pottenger; Melissa R. Schisler; Fagen Zhang; Michael J. Bartels; Donald D. Fontaine; Lisa G. McFadden; B. Bhaskar Gollapudi
The dose-response relationships for in vitro mutagenicity induced by methyl methanesulfonate (MMS) or methylnitrosourea (MNU) in L5178Y mouse lymphoma (ML) cells were examined. DNA adducts (N7-methylguanine, N7MeG, and O6-methylguanine, O6MeG) were quantified as biomarkers of exposure. Both endpoints were assessed using 5 replicates/dose (4-h treatment) with MMS or MNU (0.0069–50 µM), or vehicle (1% DMSO). Mutant frequency (MF) at the thymidine kinase (TK) locus was determined using the soft agar cloning methodology and a 2-day expression period; in addition, microwell and Sequester-Express-Select (SES) methods were used for MMS. Isolated DNA was acid-hydrolyzed, and adducts quantified by LC/ESI-MS/MS, using authentic and internal standards. MF dose-responses were analyzed using several statistical approaches, all of which confirmed that a threshold dose-response model provided the best fit. NOAELs for MF were 10 µM MMS and 0.69 µM MNU, based on ANOVA and Dunnett's test (p < 0.05). N7MeG adducts were present in all cell samples, including solvent-control cells, and were increased over control levels in cells treated with ≥10 µM MMS or 3.45 µM MNU. O6MeG levels were only quantifiable at ≥10 µM MNU; O6MeG was not quantifiable in control or MMS-treated cells at current detection limits. Thus, (1) cells treated with ≤0.69 µM MNU or ≤10 µM MMS did not demonstrate increases in TK− MF, but did demonstrate quantifiable levels of N7MeG adducts; and (2) the levels of N7MeG adducts did not correlate with induced MF, as MNU-treated cells had fewer N7MeG adducts but higher MF compared with MMS-treated cells, for quasi-equimolar doses. Taken together, these results demonstrate operational thresholds, defined as the highest dose for which the response is not significantly (statistically or biologically) distinguishable from the control/background values, for induction of mutations and N7MeG adducts in ML cells treated with MMS or MNU, and a lack of correlation between induced MF and levels of N7MeG adducts.
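The threshold dose-response model reported as the best fit above is often parameterized as a "hockey-stick": constant background below a breakpoint dose (Td) and a linear rise beyond it. The sketch below fits such a model with SciPy using invented data; it is not the authors' code, and least-squares fits of this form can be sensitive to starting values near the kink.

```python
# Hockey-stick (threshold) dose-response fit; all data are invented.
import numpy as np
from scipy.optimize import curve_fit

doses = np.array([0.0, 0.069, 0.69, 3.45, 6.9, 10.0, 25.0])   # invented, µM
mf = np.array([80, 81, 79, 83, 95, 120, 210], float)          # invented mutant frequencies

def hockey_stick(d, bg, td, slope):
    """Background `bg` below threshold `td`, linear rise of `slope` above it."""
    return bg + slope * np.clip(d - td, 0.0, None)

p0 = (mf[0], np.median(doses), 1.0)            # rough starting values
(bg, td, slope), _ = curve_fit(hockey_stick, doses, mf, p0=p0)
print(f"estimated threshold dose Td = {td:.2f} µM")
```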
Critical Reviews in Toxicology | 2013
Jeff R. Fowles; Marcy I. Banton; Lynn H. Pottenger
Abstract The toxicological profiles of monopropylene glycol (MPG), dipropylene glycol (DPG), tripropylene glycol (TPG) and polypropylene glycols (PPG; including tetra-rich oligomers) are collectively reviewed, and assessed considering regulatory toxicology endpoints. The review confirms a rich data set for these compounds, covering all of the major toxicological endpoints of interest. The metabolism of these compounds shares common pathways, and a consistent profile of toxicity is observed. The common metabolism provides scientific justification for adopting a read-across approach to describe expected hazard potential where data gaps exist for specific oligomers. None of the glycols reviewed presented evidence of carcinogenic, mutagenic or reproductive/developmental toxicity potential to humans. The pathologies reported in some animal studies either occurred at doses that exceeded experimental guidelines, or involved mechanisms that are likely irrelevant to human physiology and therefore are not pertinent to the exposures experienced by consumers or workers. At very high chronic doses, MPG causes a transient, slight decrease in hemoglobin in dogs and at somewhat lower doses causes Heinz bodies to form in cats in the absence of any clinical signs of anemia. Some evidence for rare, idiosyncratic skin reactions exists for MPG. However, the larger data set indicates that these compounds have low sensitization potential in animal studies, and therefore are unlikely to represent human allergens. The existing safety evaluations of the FDA, USEPA, NTP and ATSDR for these compounds are consistent and point to the conclusion that the propylene glycols present a very low risk to human health.
Critical Reviews in Toxicology | 2013
Michael Dourson; Richard A. Becker; Lynne T. Haber; Lynn H. Pottenger; Tiffany Bredfeldt; Penelope A. Fenner-Crisp
Abstract Over the last dozen years, many national and international expert groups have considered specific improvements to risk assessment. Many of their stated recommendations are mutually supportive, but others appear conflicting, at least in an initial assessment. This review identifies areas of consensus and difference and recommends a practical, biology-centric course forward, which includes: (1) incorporating a clear problem formulation at the outset of the assessment with a level of complexity that is appropriate for informing the relevant risk management decision; (2) using toxicokinetics and toxicodynamic information to develop Chemical Specific Adjustment Factors (CSAF); (3) using mode of action (MOA) information and an understanding of the relevant biology as the key, central organizing principle for the risk assessment; (4) integrating MOA information into dose–response assessments using existing guidelines for non-cancer and cancer assessments; (5) using a tiered, iterative approach developed by the World Health Organization/International Programme on Chemical Safety (WHO/IPCS) as a scientifically robust, fit-for-purpose approach for risk assessment of combined exposures (chemical mixtures); and (6) applying all of this knowledge to enable interpretation of human biomonitoring data in a risk context. While scientifically based defaults will remain important and useful when data on CSAF or MOA to refine an assessment are absent or insufficient, assessments should always strive to use these data. The use of available 21st century knowledge of biological processes, clinical findings, chemical interactions, and dose–response at the molecular, cellular, organ and organism levels will minimize the need for extrapolation and reliance on default approaches.
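Recommendation (2) above, on Chemical Specific Adjustment Factors, can be sketched numerically. Under the WHO/IPCS scheme, the default 10× interspecies and 10× intraspecies uncertainty factors split into toxicokinetic and toxicodynamic subfactors (4.0 × 2.5 and 3.16 × 3.16), and a CSAF replaces whichever subfactor chemical-specific data support. The chemical-specific value below is invented purely for illustration.

```python
# WHO/IPCS subfactor sketch: a CSAF replaces a default subfactor.
DEFAULTS = {"AK": 4.0, "AD": 2.5, "HK": 3.16, "HD": 3.16}  # TK/TD subfactors

def composite_factor(csaf=None):
    """Product of the four subfactors; any CSAF entry overrides its default."""
    factors = dict(DEFAULTS, **(csaf or {}))
    return factors["AK"] * factors["AD"] * factors["HK"] * factors["HD"]

default_uf = composite_factor()              # ~100 with all defaults
refined_uf = composite_factor({"AK": 1.8})   # invented chemical-specific TK value
print(f"default composite UF = {default_uf:.0f}, refined = {refined_uf:.0f}")
```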
Environmental and Molecular Mutagenesis | 2017
Kerry L. Dearfield; B. Bhaskar Gollapudi; Jeffrey C. Bemis; R. Daniel Benz; George R. Douglas; Rosalie K. Elespuru; George E. Johnson; David Kirkland; Matthew J. LeBaron; Albert P. Li; Francesco Marchetti; Lynn H. Pottenger; Emiel Rorije; Jennifer Y. Tanir; Véronique Thybaud; Jan van Benthem; Carole L. Yauk; Errol Zeiger; Mirjam Luijten
For several decades, regulatory testing schemes for genetic damage have been standardized where the tests being utilized examined mutations and structural and numerical chromosomal damage. This has served the genetic toxicity community well when most of the substances being tested were amenable to such assays. The outcome from this testing is usually a dichotomous (yes/no) evaluation of test results, and in many instances, the information is only used to determine whether a substance has carcinogenic potential or not. Over the same time period, mechanisms and modes of action (MOAs) that elucidate a wider range of genomic damage involved in many adverse health outcomes have been recognized. In addition, a paradigm shift in applied genetic toxicology is moving the field toward a more quantitative dose‐response analysis and point‐of‐departure (PoD) determination with a focus on risks to exposed humans. This is directing emphasis on genomic damage that is likely to induce changes associated with a variety of adverse health outcomes. This paradigm shift is moving the testing emphasis for genetic damage from a hazard identification only evaluation to a more comprehensive risk assessment approach that provides more insightful information for decision makers regarding the potential risk of genetic damage to exposed humans. To enable this broader context for examining genetic damage, a next generation testing strategy needs to take into account a broader, more flexible approach to testing, and ultimately modeling, of genomic damage as it relates to human exposure. This is consistent with the larger risk assessment context being used in regulatory decision making. As presented here, this flexible approach for examining genomic damage focuses on testing for relevant genomic effects that can be, as best as possible, associated with an adverse health effect. The most desired linkage for risk to humans would be changes in loci associated with human diseases, whether in somatic or germ cells. The outline of a flexible approach and associated considerations are presented in a series of nine steps, some of which can occur in parallel, which was developed through a collaborative effort by leading genetic toxicologists from academia, government, and industry through the International Life Sciences Institute (ILSI) Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC). The ultimate goal is to provide quantitative data to model the potential risk levels of substances, which induce genomic damage contributing to human adverse health outcomes. Any good risk assessment begins with asking the appropriate risk management questions in a planning and scoping effort. This step sets up the problem to be addressed (e.g., broadly, does genomic damage need to be addressed, and if so, how to proceed). The next two steps assemble what is known about the problem by building a knowledge base about the substance of concern and developing a rational biological argument for why testing for genomic damage is needed or not. By focusing on the risk management problem and potential genomic damage of concern, the next step of assay(s) selection takes place. The work‐up of the problem during the earlier steps provides the insight to which assays would most likely produce the most meaningful data. This discussion does not detail the wide range of genomic damage tests available, but points to types of testing systems that can be very useful. 
Once the assays are performed and analyzed, the relevant data sets are selected for modeling potential risk. From this point on, the data are evaluated and modeled as they are for any other toxicology endpoint. Any observed genomic damage/effects (or genetic event(s)) can be modeled via a dose‐response analysis and determination of an estimated PoD. When a quantitative risk analysis is needed for decision making, a parallel exposure assessment effort is performed (exposure assessment is not detailed here as this is not the focus of this discussion; guidelines for this assessment exist elsewhere). Then the PoD for genomic damage is used with the exposure information to develop risk estimations (e.g., using reference dose (RfD), margin of exposure (MOE) approaches) in a risk characterization and presented to risk managers for informing decision making. This approach is applicable now for incorporating genomic damage results into the decision‐making process for assessing potential adverse outcomes in chemically exposed humans and is consistent with the ILSI HESI Risk Assessment in the 21st Century (RISK21) roadmap. This applies to any substance to which humans are exposed, including pharmaceuticals, agricultural products, food additives, and other chemicals. It is time for regulatory bodies to incorporate the broader knowledge and insights provided by genomic damage results into the assessments of risk to more fully understand the potential of adverse outcomes in chemically exposed humans, thus improving the assessment of risk due to genomic damage. The historical use of genomic damage data as a yes/no gateway for possible cancer risk has been too narrowly focused in risk assessment. The recent advances in assaying for and understanding genomic damage, including eventually epigenetic alterations, obviously add a greater wealth of information for determining potential risk to humans. Regulatory bodies need to embrace this paradigm shift from hazard identification to quantitative analysis and to incorporate the wider range of genomic damage in their assessments of risk to humans. The quantitative analyses and methodologies discussed here can be readily applied to genomic damage testing results now. Indeed, with the passage of the recent update to the Toxic Substances Control Act (TSCA) in the US, the new generation testing strategy for genomic damage described here provides a regulatory agency (here the US Environmental Protection Agency (EPA), but suitable for others) a golden opportunity to reexamine the way it addresses risk‐based genomic damage testing (including hazard identification and exposure). Environ. Mol. Mutagen. 58:264–283, 2017.
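The risk-characterization step outlined above reduces to simple arithmetic once a PoD and an exposure estimate are in hand: the margin of exposure (MOE) is the PoD divided by the estimated exposure, and a reference dose (RfD) divides the PoD by uncertainty factors. The sketch below uses invented numbers and default 10× factors purely for illustration.

```python
# MOE and RfD arithmetic from a genomic-damage PoD; all values are invented.
pod_mg_per_kg_day = 1.5          # e.g., a BMDL10 from dose-response modeling
exposure_mg_per_kg_day = 0.002   # estimated human exposure

moe = pod_mg_per_kg_day / exposure_mg_per_kg_day
print(f"MOE = {moe:.0f}")        # larger MOE means a larger margin of safety

uf_interspecies = 10             # default animal-to-human factor
uf_intraspecies = 10             # default human variability factor
rfd = pod_mg_per_kg_day / (uf_interspecies * uf_intraspecies)
print(f"RfD = {rfd:.4f} mg/kg-day")
```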
Environmental and Molecular Mutagenesis | 2010
Lynn H. Pottenger; B. Bhaskar Gollapudi
For more than 40 years, genotoxicity data have been interpreted in a qualitative, binary mode; a chemical is considered either positive or negative for a response in the test system. Although dose–response information is sometimes used in this decision, it is not routine to obtain the amount of information needed to inform risk assessment, for example to determine no‐observed‐genotoxic‐effect‐levels, primarily due to the historical view of genotoxic responses as “linear, no‐threshold.” Only recently have researchers begun to address this issue through robust experimental designs and application of statistical models. A growing body‐of‐evidence supports the existence of response thresholds for a number of mutagenic agents, in vitro and in vivo. Clearly, simple observation of a “hockey‐stick” dose–response curve is not sufficient to establish a threshold. Collection of robust empirical data must be supported with an analysis of biological plausibility for the observed threshold. In this context, a chemical‐specific mode‐of‐action (MOA) approach, which identifies key events responsible for the observed mutagenic effect, is extremely valuable. Biomarkers of key events, providing qualitative and quantitative information, can be integrated in a weight‐of‐evidence‐based assessment of genotoxicity data from multiple test systems and used to identify data gaps to resolve/reduce uncertainties during the risk assessment process. To this end, specific recommendations on study design and data analysis are proposed. As the Environmental Mutagen Society celebrates its 40th anniversary, the field of genetic toxicology is marking a milestone on the path to a new paradigm, using a MOA, data‐driven approach to answer questions about thresholds for genotoxic agents. Environ. Mol. Mutagen., 2010.
Critical Reviews in Toxicology | 2014
Lynn H. Pottenger; Larry S. Andrews; Ammie N. Bachman; Peter J. Boogaard; Jean Cadet; Michelle R. Embry; Peter B. Farmer; Matthew W. Himmelstein; Annie M. Jarabek; Elizabeth A. Martin; Robert J. Mauthe; Rudranath Persaud; R. Julian Preston; Rita Schoeny; Julie A. Skare; James A. Swenberg; Gary M. Williams; Errol Zeiger; Fagen Zhang; James H. Kim
Abstract The framework analysis previously presented for using DNA adduct information in the risk assessment of chemical carcinogens was applied in a series of case studies which place the adduct information into context with the key events in carcinogenesis to determine whether they could be used to support a mutagenic mode of action (MOA) for the examined chemicals. Three data-rich chemicals, aflatoxin B1 (AFB1), tamoxifen (Tam) and vinyl chloride (VCl) were selected for this exercise. These chemicals were selected because they are known human carcinogens and have different characteristics: AFB1 forms a unique adduct and human exposure is through contaminated foods; Tam is a pharmaceutical given to women so that the dose and duration of exposure are known, forms unique adducts in rodents, and has both estrogenic and genotoxic properties; and VCl, to which there is industrial exposure, forms a number of adducts that are identical to endogenous adducts found in unexposed people. All three chemicals produce liver tumors in rats. AFB1 and VCl also produce liver tumors in humans, but Tam induces only uterine tumors in humans. To support a mutagenic MOA, the chemical-induced adducts must be characterized, shown to be pro-mutagenic, be present in the tumor target tissue, and produce mutations of the class found in the tumor. The adducts formed by AFB1 and VCl support a mutagenic MOA for their carcinogenicity. However, the data available for Tam show a mutagenic MOA for liver tumors in rats, but its carcinogenicity in humans is most likely via a different MOA.