Tjalling Jager
VU University Amsterdam
Publications
Featured research published by Tjalling Jager.
Ecotoxicology and Environmental Safety | 2003
Willie J.G.M. Peijnenburg; Tjalling Jager
Bioavailability and bioaccessibility are complex issues that determine whether or not adverse effects are to be expected when organisms or plants are exposed to contaminants. Clearly, the determinants of bioavailability and bioaccessibility must be understood if one is to monitor or, ultimately, predict the effects of metals. On the basis of a dynamic conceptual model, this article offers an analysis of the physicochemical and biological determinants underlying bioavailability and bioaccessibility. This analysis is used as the basis for a general monitoring strategy for assessing potentially and actually available and accessible metal fractions in the environmental matrices of water, soil, and sediment. We conclude that the lack of a universal expression of bioavailable and bioaccessible metal fractions precludes the presentation of a detailed monitoring strategy that is broadly applicable. Instead, we recommend that a critical assessment of the endpoints of determination become the basis for a need-specific monitoring strategy.
Chemosphere | 2000
Mark A. J. Huijbregts; U. Thissen; Jeroen B. Guinée; Tjalling Jager; D. Kalf; D. van de Meent; A.M.J. Ragas; A. Wegener Sleeswijk; Lucas Reijnders
Toxicity potentials are standard values used in life cycle assessment (LCA) to enable a comparison of toxic impacts between substances. In most cases, toxicity potentials are calculated with multi-media fate models. Until now, unrealistic system settings were used for these calculations. The present paper outlines an improved model to calculate toxicity potentials: the global nested multi-media fate, exposure and effects model USES-LCA. It is based on the Uniform System for the Evaluation of Substances 2.0 (USES 2.0). USES-LCA was used to calculate toxicity potentials for 181 substances across six impact categories (freshwater aquatic ecotoxicity, marine aquatic ecotoxicity, freshwater sediment ecotoxicity, marine sediment ecotoxicity, terrestrial ecotoxicity, and human toxicity), following initial emission to each of the compartments air, freshwater, seawater, industrial soil, and agricultural soil. Differences of several orders of magnitude were found between the new toxicity potentials and those calculated previously.
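The abstract does not give the model equations; purely as a hedged illustration of the kind of steady-state multimedia mass balance that fate models such as USES-LCA build on, the sketch below solves a tiny linear box model. The compartments, rate constants, and emission values are invented placeholders, not USES-LCA parameters.

```python
import numpy as np

# Minimal steady-state multimedia box model (illustrative only; the
# compartments, rate constants, and emission are invented, not USES-LCA data).
# At steady state: emission + transfers in = degradation + transfers out.

k_deg = np.array([0.05, 0.01, 0.002])          # degradation rate constants (1/d)
# k_transfer[i, j]: first-order transfer from compartment i to j (1/d)
k_transfer = np.array([[0.0,    0.02,  0.03],  # air -> water, air -> soil
                       [0.001,  0.0,   0.0],   # water -> air
                       [0.0005, 0.004, 0.0]])  # soil -> air, soil -> water

emission = np.array([1.0, 0.0, 0.0])           # kg/d emitted to air

# Build the rate matrix A so that A @ masses = emission at steady state.
losses = k_deg + k_transfer.sum(axis=1)        # total first-order loss per compartment
A = np.diag(losses) - k_transfer.T
masses = np.linalg.solve(A, emission)
print("steady-state masses (kg):", dict(zip(["air", "water", "soil"], masses)))
```

Predicted concentrations from such a balance would then be combined with exposure and effect factors to arrive at a toxicity potential.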
Environmental Science & Technology | 2011
Tjalling Jager; Carlo Albert; Thomas G. Preuss; Roman Ashauer
Toxicokinetic-toxicodynamic models (TKTD models) simulate the time-course of processes leading to toxic effects on organisms. Even for an apparently simple endpoint such as survival, a large number of very different TKTD approaches exist. These differ in their underlying hypotheses and assumptions, although often the assumptions are not explicitly stated. Thus, our first objective was to illuminate the underlying assumptions (individual tolerance or stochastic death, speed of toxicodynamic damage recovery, threshold distribution) of various existing modeling approaches for survival and show how they relate to each other (e.g., critical body residue, critical target occupation, damage assessment, DEBtox survival, threshold damage). Our second objective was to develop a general unified threshold model for survival (GUTS), from which a large range of existing models can be derived as special cases. Specific assumptions to arrive at these special cases are made and explained. Finally, we illustrate how special cases of GUTS can be fitted to survival data. We envision that GUTS will help increase the application of TKTD models in ecotoxicological research as well as environmental risk assessment of chemicals. It unifies a wide range of previously unrelated approaches, clarifies their underlying assumptions, and facilitates further improvement in the modeling of survival under chemical stress.
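As a rough sketch of one special case covered by GUTS, the stochastic-death model with scaled damage (commonly labelled GUTS-RED-SD), the snippet below integrates damage and hazard to obtain a survival probability over time. The parameter values are arbitrary placeholders, not fitted values from the paper.

```python
import numpy as np

def guts_red_sd_survival(t, c_water, kd=0.5, b=0.3, z=2.0, hb=0.01):
    """Survival probability over time for a GUTS-style stochastic-death model.

    dD/dt = kd * (Cw - D)            (scaled damage, one-compartment kinetics)
    h(t)  = b * max(D - z, 0) + hb   (hazard above threshold z, plus background)
    S(t)  = exp(-integral of h)      (survival probability)

    Parameter values here are illustrative placeholders, not fitted values.
    """
    D = 0.0
    cum_hazard = 0.0
    survival = np.empty_like(t)
    for i, ti in enumerate(t):
        dt = t[i] - t[i - 1] if i > 0 else 0.0
        D += kd * (c_water - D) * dt                  # simple Euler step
        cum_hazard += (b * max(D - z, 0.0) + hb) * dt
        survival[i] = np.exp(-cum_hazard)
    return survival

t = np.linspace(0, 4, 400)                        # days
print(guts_red_sd_survival(t, c_water=5.0)[-1])   # survival probability at day 4
```

The individual-tolerance special case would instead draw the threshold z from a distribution and treat death as deterministic once damage exceeds an individual's threshold.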
Ecotoxicology | 2010
Tjalling Jager; Tine Vandenbrouck; Jan Baas; Wim De Coen; S.A.L.M. Kooijman
Typical approaches for analyzing mixture ecotoxicity data only provide a description of the data; they cannot explain observed interactions, nor explain why mixture effects can change in time and differ between endpoints. To improve our understanding of mixture toxicity we need to explore biology-based models. In this paper, we present an integrated approach to deal with the toxic effects of mixtures on growth, reproduction and survival, over the life cycle. Toxicokinetics is addressed with a one-compartment model, accounting for effects of growth. Each component of the mixture has its own toxicokinetics model, but all compounds share the effect of body size on uptake kinetics. The toxicodynamic component of the method is formed by an implementation of dynamic energy budget theory: a set of simple rules for metabolic organization that ensures conservation of mass and energy. Toxicant effects are treated as a disruption of regular metabolic processes such as an increase in maintenance costs. The various metabolic processes interact, which means that mixtures of compounds with certain mechanisms of action have to produce a response surface that deviates from standard models (such as ‘concentration addition’). Only by separating these physiological interactions from the chemical interactions between mixture components can we hope to achieve generality and a better understanding of mixture effects. For example, a biology-based approach allows for educated extrapolations to other mixtures, other species, and other exposure situations. We illustrate our method with the interpretation of partial life-cycle data for two polycyclic aromatic hydrocarbons in Daphnia magna.
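A minimal sketch of the kind of one-compartment toxicokinetics with growth dilution described above, assuming von Bertalanffy growth of body length; the parameter names and values are illustrative, not the paper's actual DEBtox implementation.

```python
import numpy as np

def scaled_internal_conc(t, c_water, ke=0.2, r_B=0.1, L_m=4.0, L0=0.8):
    """One-compartment toxicokinetics with size-dependent uptake and growth
    dilution, as commonly used in DEBtox-style analyses.

    dL/dt  = r_B * (L_m - L)                          (von Bertalanffy growth)
    dCi/dt = ke*(L_m/L)*(Cw - Ci) - Ci*(3/L)*dL/dt    (scaled internal conc.)

    Parameter values are placeholders for illustration only.
    """
    L, Ci = L0, 0.0
    out = np.empty_like(t)
    for i, ti in enumerate(t):
        dt = t[i] - t[i - 1] if i > 0 else 0.0
        dL = r_B * (L_m - L)
        dCi = ke * (L_m / L) * (c_water - Ci) - Ci * (3.0 / L) * dL
        L += dL * dt
        Ci += dCi * dt
        out[i] = Ci
    return out

t = np.linspace(0, 21, 2100)  # days
print(scaled_internal_conc(t, c_water=1.0)[-1])
```

For a mixture, each compound would receive its own copy of the internal-concentration equation, all sharing the same body length L.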
Chemosphere | 2000
Mark A. J. Huijbregts; U. Thissen; Tjalling Jager; D. van de Meent; A.M.J. Ragas
Toxicity potentials are standard values used in life cycle assessment (LCA) to enable a comparison of toxic impacts between substances. This paper presents the results of an uncertainty assessment of toxicity potentials that were calculated with the global nested multi-media fate, exposure and effects model USES-LCA. The variance in toxicity potentials resulting from input parameter uncertainties and human variability was quantified by means of Monte Carlo analysis with Latin Hypercube sampling (LHS). For Atrazine, 2,3,7,8-TCDD and Lead, variation, expressed by the ratio of the 97.5%-ile and the 2.5%-ile, ranges from about 1.5 to 6 orders of magnitude. The major part of this variation originates from a limited set of substance-specific input parameters, i.e. parameters that describe transport mechanisms, substance degradation, indirect exposure routes and no-effect concentrations. Considerable correlations were found between the toxicity potentials of one substance, in particular within one impact category. The uncertainties and correlations reported in the present study may have a significant impact on the outcome of LCA case studies.
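As an illustration of the Latin Hypercube sampling step mentioned in the abstract, the sketch below propagates three hypothetical uncertain inputs through a toy model using SciPy's qmc module and reports the 97.5%/2.5% percentile ratio. The toy model and parameter ranges are invented; the actual USES-LCA analysis covers far more parameters.

```python
import numpy as np
from scipy.stats import qmc

# Latin Hypercube sampling of uncertain inputs, propagated through a toy
# model. The model and the parameter bounds are invented for illustration.
sampler = qmc.LatinHypercube(d=3, seed=42)
unit_sample = sampler.random(n=10_000)

# Hypothetical uncertain inputs, sampled log-uniformly within invented bounds:
# degradation half-life (d), Kow (-), no-effect concentration (mg/L).
l_bounds = np.log10([1.0, 1e2, 1e-3])
u_bounds = np.log10([1000.0, 1e7, 1.0])
half_life, kow, noec = (10 ** qmc.scale(unit_sample, l_bounds, u_bounds)).T

# Toy "toxicity potential": persistence times partitioning, divided by NOEC.
toxicity_potential = half_life * kow / noec

lo, hi = np.percentile(toxicity_potential, [2.5, 97.5])
print("spread (97.5%-ile / 2.5%-ile):", hi / lo)
```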
Human and Ecological Risk Assessment | 2011
Valery E. Forbes; P. Calow; Volker Grimm; Takehiko I. Hayashi; Tjalling Jager; Agnete Krabbe Katholm; Annemette Palmqvist; Rob Pastorok; Dan Salvito; Richard M. Sibly; Julann Spromberg; John D. Stark; Richard A. Stillman
Current measures used to estimate the risks of toxic chemicals are not relevant to the goals of the environmental protection process, and thus ecological risk assessment (ERA) is not used as extensively as it should be as a basis for cost-effective management of environmental resources. Appropriate population models can provide a powerful basis for expressing ecological risks that better inform the environmental management process and thus are more likely to be used by managers. Here we provide at least five reasons why population modeling should play an important role in bridging the gap between what we measure and what we want to protect. We then describe six actions needed for its implementation into management-relevant ERA.
Science of The Total Environment | 2010
Jan Baas; Tjalling Jager; Bas Kooijman
Studies in ecotoxicology usually focus on a single endpoint (typically mortality, growth, or reproduction) at a standardized exposure time. The exposure time is chosen irrespective of the properties of the chemical under scrutiny, but should depend on the organism of choice in combination with the compound(s) of interest. This paper discusses the typical patterns for toxic effects in time that can be observed for the most frequently encountered endpoints: growth, reproduction, and survival. Ignoring the fact that toxicity is a process in time can lead to severe bias in environmental risk assessment. We show that especially ECx values for sublethal endpoints can show very distinct patterns in time. We recommend that the test duration for survival as an endpoint should be extended until the incipient LC50 is observed. Given the fact that toxicity data for single compounds show clear patterns in time, it is to be expected that effects of mixtures will also be strongly dependent on time. The few examples that have been published support this statement.
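To illustrate why an LC50 depends on exposure time, here is a minimal sketch under the classic critical-body-residue assumption (one-compartment uptake, death once the scaled internal concentration exceeds a fixed threshold); the elimination rate and incipient LC50 are invented values.

```python
import numpy as np

def lc50_vs_time(t, lc50_incipient=1.0, ke=0.3):
    """LC50 as a function of exposure time under the critical-body-residue
    view: one-compartment uptake, death once the scaled internal concentration
    exceeds a fixed threshold. Parameter values are illustrative only.

    LC50(t) = incipient LC50 / (1 - exp(-ke * t))
    """
    return lc50_incipient / (1.0 - np.exp(-ke * t))

for day in (1, 2, 4, 14):
    print(f"day {day:2d}: LC50 = {lc50_vs_time(day):.2f}")
```

The LC50 keeps dropping with exposure time and only levels off at the incipient value, which is why truncating a test at a fixed, chemical-independent duration can be misleading.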
Environmental Toxicology and Chemistry | 2012
Tjalling Jager
Ecotoxicologists generally rely on simple measures to express the toxicity of chemicals to organisms, such as the no-observed-effect concentration (NOEC) and the exposure concentration associated with x% effect (ECx). The origin of these toxicity metrics probably relates to the real or presumed needs of risk assessment. Regulators prefer simple and well-accepted tools to help them manage the huge number of chemicals released into the environment. Ecotoxicology, however, does not exist for the sole purpose of supporting regulators; it should also be a field of scientific research. Unfortunately, science and regulation are so tightly intertwined in ecotoxicology that they are often difficult to separate. The "bad habits" ecotoxicological pioneers have developed to support risk assessment have survived into present-day scientific research and have proven surprisingly hard to break. The NOEC is the most obvious example of such a habit.
The NOEC is the highest tested concentration that does not lead to significant deviation from the control response. The lack of a statistically significant effect, however, does not mean that there is no effect. The actual level of effect at the NOEC varies between individual tests, from nearly 0 to nearly 100% [1]. The problems with the NOEC concept are well known and have been discussed extensively in the open literature, with a peak in the 1990s (see the recent editorial of Landis and Chapman in Integrated Environmental Assessment and Management [2] and references therein). In 1998, a workshop of the Organisation for Economic Co-operation and Development (OECD) [3] concluded that the NOEC should be phased out completely. A resolution to the same effect came from the International Organization for Standardization (ISO). Clearly, the NOEC is a poor indicator of no effect, and because the level of effect at the NOEC varies so much between tests, it is also a poor relative measure of toxicity.
With the surge of criticism in the 1990s, and the obvious nature of the disadvantages, one would expect the NOEC to be discarded by now. So what is the situation in regulatory settings? In recent ISO and OECD guidance on dose–response analysis [4], the phase-out is stressed again, but guidance for NOEC derivation is still included because many regulatory frameworks still rely on this metric. In recent guidance of the European Chemicals Agency (ECHA) [5], the problems with the NOEC are recognized clearly and the phase-out recommendation of the OECD is mentioned. Furthermore, the guidance states that for long-term studies, "EC10...will be used preferentially." Clearly, the NOEC has not been banned yet, but at least its shortcomings are being recognized in these frameworks.
What is the situation in our scientific community? I made a quick survey of the publications in Environmental Toxicology and Chemistry (ET&C) in 1990 and 2010 to see how 20 years of progress affects the use of the NOEC and similar concepts (lowest observed effect concentration [LOEC], no observed effect level [NOEL], and no observed adverse effect level [NOAEL]). The 1990 volume of ET&C includes 164 publications (including errata and letters), of which seven mention the NOEC (or a similar concept) in the main text. Of these, one is a letter (discussing an earlier article), one reports literature values, and five derive NOECs from experimental data. The 2010 volume of ET&C includes more than twice the number of publications in total (352). In 46 articles, the NOEC (or a similar concept) is mentioned in the main text. Of these, three only mention the NOEC briefly, 16 mention or use literature NOECs, and the remaining 27 actually derive NOECs from experimental toxicity data. Interestingly, in two articles a small-effect concentration (derived using regression analysis) was used as NOEC, and one derives the NOEC "graphically from average values," which leaves 24 articles that relied on hypothesis testing. Only two articles add some critical comment to their use of the NOEC (but still conclude it is a good estimate for no effects), and I spotted only one analysis of statistical power.
This small and admittedly limited analysis shows that the NOEC concept still occurs regularly in our scientific journal in 2010, and I am sure that ET&C is not an exception. Furthermore, there is no indication that the popularity of the NOEC has decreased over the last 20 years; this analysis even indicates the opposite trend. And finally, the absence of critique among the authors that use or derive NOECs is striking. How is this possible? It is true that regulatory contexts worldwide still rely on the NOEC. Changing a well-established risk assessment practice is a slow and painful process, with its own political and legal constraints. As scientists, however, we are not bound by the same constraints; we do not have to continue using the NOEC to study the effects of chemicals. Surprisingly, the regulatory guidance of ISO, OECD, and ECHA reveals more of a critical attitude than our own scientific community.
There is no scientific justification for the NOEC whatsoever, so why can't we break this bad habit? Warne and Van Dam [6] suggested it is a lack of understanding of the limitations, a lack of understanding of the statistics, and the fact that no organization has clearly and categorically banned the NOEC. I would like to add to this list the lure of large numbers of existing NOEC values in the literature and in databases. However, the problems associated with the NOEC should force us to extreme caution. If there is a solid reason to report an NOEC (which I fail to see), its calculation should be accompanied by a rigorous statistical treatment, as is necessary for any type of hypothesis testing [7]. Using NOECs from the literature or databases for quantitative structure–activity relationships (QSARs) and species-sensitivity distributions (SSDs) is of limited scientific value. Even though these approaches certainly have their merits, it is unlikely that any useful structure can be built on such shaky foundations. Such analyses should thus, at the very least, be
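To make the contrast with regression-based summary statistics concrete, here is a minimal sketch of deriving an EC10 from a fitted two-parameter log-logistic concentration-response curve; the data points and starting values are invented and the snippet is not taken from the editorial.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical reproduction data (fraction of control response) at five
# test concentrations; purely illustrative, not from any real study.
conc = np.array([0.1, 0.32, 1.0, 3.2, 10.0])          # mg/L
response = np.array([0.98, 0.95, 0.78, 0.41, 0.10])   # fraction of control

def log_logistic(c, ec50, slope):
    """Two-parameter log-logistic concentration-response model."""
    return 1.0 / (1.0 + (c / ec50) ** slope)

(ec50, slope), cov = curve_fit(log_logistic, conc, response, p0=[1.0, 1.0])

# ECx follows from the fitted curve: the concentration giving an x% reduction
# relative to the control. For x = 10: response = 0.9.
ec10 = ec50 * (10.0 / 90.0) ** (1.0 / slope)
print(f"EC50 = {ec50:.2f} mg/L, EC10 = {ec10:.2f} mg/L (slope = {slope:.2f})")
```

Unlike a NOEC, such an estimate does not depend on the spacing of the tested concentrations or on statistical power, and it comes with a quantifiable uncertainty (here, the covariance matrix of the fit).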
Environmental Toxicology and Chemistry | 2011
Roman Ashauer; Annika Agatz; Carlo Albert; Virginie Ducrot; Nika Galic; Jan C.M. Hendriks; Tjalling Jager; Andreas Kretschmann; Isabel O'Connor; M.N. Rubach; Anna Maija Nyman; Walter Schmitt; Julita Stadnicka; Paul J. Van den Brink; Thomas G. Preuss
We report on the advantages and problems of using toxicokinetic-toxicodynamic (TKTD) models for the analysis, understanding, and simulation of sublethal effects. Only a few toxicodynamic approaches for sublethal effects are available. These differ in their effect mechanism and emphasis on linkages between endpoints. We discuss how the distinction between quantal and graded endpoints and the type of linkage between endpoints can guide model design and selection. Strengths and limitations of two main approaches and possible ways forward are outlined.
The American Naturalist | 2013
Benjamin T. Martin; Tjalling Jager; Roger M. Nisbet; Thomas G. Preuss; Volker Grimm
Individual-based models (IBMs) are increasingly used to link the dynamics of individuals to higher levels of biological organization. Still, many IBMs are data hungry, species specific, and time-consuming to develop and analyze. Many of these issues would be resolved by using general theories of individual dynamics as the basis for IBMs. While such theories have frequently been examined at the individual level, few cross-level tests exist that also try to predict population dynamics. Here we performed a cross-level test of dynamic energy budget (DEB) theory by parameterizing an individual-based model using individual-level data of the water flea, Daphnia magna, and comparing the emerging population dynamics to independent data from population experiments. We found that DEB theory successfully predicted population growth rates and peak densities but failed to capture the decline phase. Further assumptions on food-dependent mortality of juveniles were needed to capture the population dynamics after the initial population peak. The resulting model then predicted, without further calibration, characteristic switches between small- and large-amplitude cycles, which have been observed for Daphnia. We conclude that cross-level tests help detect gaps in current individual-level theories and ultimately will lead to theory development and the establishment of a generic basis for individual-based models and ecology.
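The DEB-based IBM in the paper is far richer than can be reproduced here, but a toy individual-based loop along the following lines conveys the cross-level idea: simple individual rules (food-dependent growth, length-dependent reproduction, a shared food pool) are iterated for many individuals, and population dynamics emerge. All rules and parameter values below are invented simplifications, not the model of Martin et al.

```python
import random

# Toy individual-based model with DEB-inspired rules: individuals share a
# food pool, grow, reproduce above a puberty length, and face background
# mortality. All rules and parameter values are invented simplifications.
R_B, L_MAX, L_BIRTH, L_PUBERTY = 0.1, 4.0, 0.8, 2.5   # growth rate (1/d), lengths (mm)
FOOD_INFLOW, HALF_SAT = 50.0, 20.0                    # arbitrary food units per day

population = [L_BIRTH] * 10   # each individual is represented by its body length
food = 100.0

for day in range(200):
    food += FOOD_INFLOW
    f = food / (food + HALF_SAT)                  # scaled functional response (0..1)
    survivors, newborns = [], 0
    for L in population:
        L += R_B * (f * L_MAX - L)                # food-dependent von Bertalanffy growth
        food = max(food - 0.5 * L ** 2 * f, 0.0)  # feeding scales with surface area
        if L >= L_PUBERTY:
            newborns += int(2 * f)                # food-dependent clutch size
        if random.random() > 0.02:                # 2% daily background mortality
            survivors.append(L)
    population = survivors + [L_BIRTH] * newborns

print("final population size:", len(population))
```

The cross-level test described in the paper amounts to parameterizing such individual rules from individual-level experiments and then checking, without recalibration, whether the simulated population trajectories match independent population data.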