Daniel Lincke
Potsdam Institute for Climate Impact Research
Publications
Featured research published by Daniel Lincke.
Proceedings of the National Academy of Sciences of the United States of America | 2014
Jochen Hinkel; Daniel Lincke; Athanasios T. Vafeidis; Mahé Perrette; Robert J. Nicholls; Richard S.J. Tol; Ben Marzeion; Xavier Fettweis; Cezar Ionescu; Anders Levermann
Significance: Coastal flood damages are expected to increase significantly during the 21st century as sea levels rise and socioeconomic development increases the number of people and value of assets in the coastal floodplain. Estimates of future damages and adaptation costs are essential for supporting efforts to reduce emissions driving sea-level rise as well as for designing strategies to adapt to increasing coastal flood risk. This paper presents such estimates derived by taking into account a wide range of uncertainties in socioeconomic development, sea-level rise, continental topography data, population data, and adaptation strategies. Abstract: Coastal flood damage and adaptation costs under 21st century sea-level rise are assessed on a global scale, taking into account a wide range of uncertainties in continental topography data, population data, protection strategies, socioeconomic development and sea-level rise. Uncertainty in global mean and regional sea level was derived from four different climate models from the Coupled Model Intercomparison Project Phase 5, each combined with three land-ice scenarios based on the published range of contributions from ice sheets and glaciers. Without adaptation, 0.2–4.6% of the global population is expected to be flooded annually in 2100 under 25–123 cm of global mean sea-level rise, with expected annual losses of 0.3–9.3% of global gross domestic product. Damages of this magnitude are very unlikely to be tolerated by society, and adaptation will be widespread. The global costs of protecting the coast with dikes are significant, with annual investment and maintenance costs of US$12–71 billion in 2100, but much smaller than the global cost of avoided damages, even without accounting for indirect costs of damage to regional production supply. Flood damages by the end of this century are much more sensitive to the applied protection strategy than to variations in climate and socioeconomic scenarios as well as in physical data sources (topography and climate model). Our results emphasize the central role of long-term coastal adaptation strategies. These should also take into account that protecting large parts of the developed coast increases the risk of catastrophic consequences in the case of defense failure.
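The damage figures above are expected annual values. As a point of reference only (this is the generic flood-risk formulation, not necessarily the paper's exact method), expected annual damage integrates the damage function over the annual exceedance probability of extreme water levels:

```latex
\mathrm{EAD} = \int_0^1 D(p)\,\mathrm{d}p
```

Here p is the annual exceedance probability of a flood event and D(p) the damage caused by the event exceeded with probability p; in practice the integral is approximated from damages at a small set of return periods.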
International Journal of Climate Change Strategies and Management | 2013
Sarah Wolf; Jochen Hinkel; Mareen Hallier; Alexander Bisaro; Daniel Lincke; Cezar Ionescu; Richard J.T. Klein
The purpose of this paper is to present a formal framework of vulnerability to climate change, addressing the conceptual confusion around vulnerability and related concepts. The framework was developed using the method of formalisation, that is, making structure explicit. While mathematics, as a precise and general language, revealed common structures in a large number of vulnerability definitions and assessments, the framework is presented here by diagrams for a non-mathematical audience. Vulnerability, in ordinary language, is a measure of possible future harm. Scientific vulnerability definitions from the fields of climate change, poverty, and natural hazards share and refine this structure. While theoretical definitions remain vague, operational definitions, that is, methodologies for assessing vulnerability, occur in three distinct types: evaluate harm for projected future evolutions, evaluate the current capacity to reduce harm, or combine the two. The framework identifies a lack of systematic relationship between theoretical and operational definitions, a mismatch that previous work has not made explicit. While much conceptual literature tries to clarify vulnerability, formalisation is a new method in this interdisciplinary field. The resulting framework is an analytical tool which supports clear communication: it helps when making assumptions explicit.
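The shared structure the formalisation identifies (possible future evolutions, harm per evolution, a measure aggregating harm) can be sketched in code. A minimal illustration with hypothetical names, not the paper's own notation:

```cpp
#include <functional>
#include <vector>

// Vulnerability as "a measure of possible future harm": compose a projection of
// possible evolutions, a harm function, and an aggregating measure.
template <typename State, typename Evolution>
double vulnerability(
    const State& s,
    std::function<std::vector<Evolution>(const State&)> possible,  // projected futures
    std::function<double(const Evolution&)> harm,                  // harm per future
    std::function<double(const std::vector<double>&)> measure)     // e.g. worst case or mean
{
    std::vector<double> harms;
    for (const Evolution& e : possible(s))
        harms.push_back(harm(e));
    return measure(harms);
}
```

A worst-case measure would take the maximum of the harms; an expected-harm measure would average them. The three assessment types the paper distinguishes differ in which of these ingredients they actually evaluate.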
Environmental Modelling and Software | 2013
Sarah Wolf; Steffen Fürst; Antoine Mandel; Wiebke Lass; Daniel Lincke; Federico Pablo-Martí; Carlo Jaeger
This paper presents Lagom regiO: a multi-agent model of several growing economic areas in interaction. The model is part of the Lagom model family: economic multi-agent models developed to take steps toward understanding equilibrium selection and identifying win-win opportunities for climate policy. The particular feature of the model presented here is that it locates agents in one of a user-chosen number of regions. It can thus be used to represent diverse economic areas by specifying the characteristics of agents and their interaction network as depending on their regions.
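The region mechanism described above can be pictured with a small sketch (hypothetical types and weights, not the Lagom regiO code): agents carry a region index, and the interaction network depends on the regions of the agents involved.

```cpp
#include <cstddef>

// An agent located in one of a user-chosen number of regions.
struct Agent {
    std::size_t region;  // index of the region the agent belongs to
    double capital;      // illustrative state variable
};

// Interaction strength depends on the agents' regions, e.g. denser
// links within a region than across regions (weights are assumptions).
double linkWeight(const Agent& a, const Agent& b) {
    return a.region == b.region ? 1.0 : 0.1;
}
```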
Earth’s Future | 2017
Sanne Muis; Martin Verlaan; Robert J. Nicholls; Sally Brown; Jochen Hinkel; Daniel Lincke; Athanasios T. Vafeidis; Paolo Scussolini; Hessel C. Winsemius; Philip J. Ward
Estimating the current risk of coastal flooding requires adequate information on extreme sea levels. For over a decade, the only global dataset available was the DINAS-COAST Extreme Sea Levels (DCESL) dataset, which applies a static approximation to estimate extreme sea levels. Recently, a dynamically derived dataset was developed: the Global Tide and Surge Reanalysis (GTSR) dataset. Here, we compare the two datasets. The differences between DCESL and GTSR are generally larger than the confidence intervals of GTSR. Compared to observed extremes, DCESL generally overestimates extremes, with a mean bias of 0.6 m. With a mean bias of −0.2 m, GTSR generally underestimates extremes, particularly in the tropics. The Dynamic Interactive Vulnerability Assessment model is applied to calculate the present-day flood exposure in terms of the land area and the population below the 1-in-100-year sea level. Global exposed population is 28% lower when based on GTSR instead of DCESL. Considering the limited data available at the time, DCESL provides a good estimate of the spatial variation in extremes around the world. However, GTSR allows for an improved assessment of the impacts of coastal floods, including confidence bounds. We further improve the assessment of coastal impacts by correcting for the conflicting vertical datum of sea-level extremes and land elevation, which has not been accounted for in previous global assessments. Converting the extreme sea levels to the same vertical reference used for the elevation data is shown to be a critical step, resulting in a 39–59% higher estimate of population exposure.
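The datum correction highlighted above is a simple but easily missed step: extreme sea levels and land elevation must refer to the same vertical reference before exposure is computed, or the floodplain extent is biased. A minimal sketch with hypothetical names (the actual model's implementation is more involved):

```cpp
#include <vector>

// One grid cell of the coastal floodplain.
struct Cell { double elevation; double population; };

// Population below a given extreme sea level, after shifting the sea level
// into the vertical datum of the elevation data.
double exposedPopulation(const std::vector<Cell>& cells,
                         double extremeSeaLevel,  // e.g. the 1-in-100-year level
                         double datumOffset)      // sea-level datum -> elevation datum
{
    const double level = extremeSeaLevel + datumOffset;
    double exposed = 0.0;
    for (const Cell& c : cells)
        if (c.elevation < level)
            exposed += c.population;
    return exposed;
}
```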
Frontiers in Marine Science | 2016
Claudia Wolff; Athanasios T. Vafeidis; Daniel Lincke; Christian Marasmi; Jochen Hinkel
This paper assesses sea-level-rise-related coastal flood impacts for Emilia-Romagna (Italy) using the Dynamic Interactive Vulnerability Assessment (DIVA) modelling framework and investigates the sensitivity of the model to four uncertainty dimensions, namely (1) elevation, (2) population, (3) vertical land movement, and (4) scale and resolution of the assessment. A one-driver-at-a-time sensitivity approach is used to explore and quantify the effects of uncertainties in input data and assessment scale on model outputs. Of particular interest is the sensitivity of flood risk estimates when using datasets of different resolution. The change in assessment scale is implemented through the use of a more detailed digital coastline and input data for the coastline segmentation process. This change leads to a 35-fold increase in the number of coastal segments and to a more realistic spatial representation of coastal flood impacts for the Emilia-Romagna coast. Furthermore, the coastline length increases by 43%, considerably influencing adaptation costs (construction of dikes). With respect to input data, our results show that by the end of the century coastal flood impacts are more sensitive to variations in elevation and vertical land movement data than to variations in population data in the study area. The inclusion of local information on human-induced subsidence rates increases relative sea-level rise by 60 cm in 2100, resulting in coastal flood impacts that are up to 25% higher compared to those generated with the global DIVA values, which mainly account for natural processes. The choice of one elevation model over another can result in differences of approximately 45% in the extent of the coastal floodplain and of up to 50% in flood damages by 2100. Our results emphasize that the scale of assessment and the resolution of the input data can have significant implications for the results of coastal flood impact assessments. Understanding and communicating these implications is essential for effectively supporting decision makers in developing long-term, robust and flexible adaptation plans for future changes of highly uncertain scale and direction.
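A one-driver-at-a-time analysis, as used above, varies a single input against a fixed baseline and re-runs the model, so each output difference is attributable to one driver. A minimal sketch with hypothetical names (not the DIVA interface):

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

// The input datasets being varied (illustrative fields).
struct Inputs { std::string elevation, population, verticalLandMovement; };

// Run the model once per variant, where each variant differs from the
// baseline in exactly one input, and collect the outputs by name.
std::map<std::string, double> oneDriverAtATime(
    const Inputs& baseline,
    const std::vector<std::pair<std::string, Inputs>>& variants,
    const std::function<double(const Inputs&)>& floodDamage)
{
    std::map<std::string, double> results;
    results["baseline"] = floodDamage(baseline);
    for (const auto& [name, inputs] : variants)
        results[name] = floodDamage(inputs);
    return results;
}
```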
Philosophical Transactions of the Royal Society A | 2018
Robert J. Nicholls; Sally Brown; Philip Goodwin; Thomas Wahl; Jason Lowe; Martin Solan; Jasmin A. Godbold; Ivan D. Haigh; Daniel Lincke; Jochen Hinkel; Claudia Wolff; Jan-Ludolf Merkens
The effectiveness of stringent climate stabilization scenarios for coastal areas in terms of reduction of impacts/adaptation needs and wider policy implications has received little attention. Here we use the Warming Acidification and Sea Level Projector Earth systems model to calculate large ensembles of global sea-level rise (SLR) and ocean pH projections to 2300 for 1.5°C and 2.0°C stabilization scenarios, and a reference unmitigated RCP8.5 scenario. The potential consequences of these projections are then considered for global coastal flooding, small islands, deltas, coastal cities and coastal ecology. Under both stabilization scenarios, global mean ocean pH (and temperature) stabilize within a century. This implies that significant ecosystem impacts are avoided, but detailed quantification is lacking, reflecting scientific uncertainty. By contrast, SLR is only slowed and continues to 2300 (and beyond). Hence, while coastal impacts due to SLR are reduced significantly by climate stabilization, especially after 2100, potential impacts continue to grow for centuries. SLR in 2300 under both stabilization scenarios exceeds unmitigated SLR in 2100. Therefore, adaptation remains essential in densely populated and economically important coastal areas under climate stabilization. Given the multiple adaptation steps that this will require, an adaptation pathways approach has merits for coastal areas. This article is part of the theme issue ‘The Paris Agreement: understanding the physical and social challenges for a warming world of 1.5°C above pre-industrial levels’.
Conference on Domain-Specific Languages | 2009
Daniel Lincke; Patrik Jansson; Marcin Zalewski; Cezar Ionescu
A class of closely related problems, a problem domain, can often be described by a domain-specific language, which consists of algorithms and combinators useful for solving that particular class of problems. Such a language can be of two kinds: it can form a new language or it can be embedded as a sublanguage in an existing one. We describe an embedded DSL in the form of a library which extends a general-purpose language. Our domain is that of vulnerability assessment in the context of climate change, formally described at the Potsdam Institute for Climate Impact Research. The domain is described using Haskell, yielding a domain-specific sublanguage of Haskell that can be used for prototyping implementations. In this paper we present a generic C++ library that implements a domain-specific language for vulnerability assessment, based on the formal Haskell description. The library rests upon and implements only a few notions, most importantly that of a monadic system, a crucial part of the vulnerability assessment formalisation. We describe the Haskell formulation of monadic systems and show our mapping of the description to generic C++ components. Our library relies heavily on concepts, a C++ feature supporting generic programming: a conceptual framework forms the domain-specific type system of our library. By using functions, parametrised types and concepts from our conceptual framework, we represent the combinators and algorithms of the domain. Furthermore, we discuss what makes our library a domain-specific language and how our domain-specific library scheme can be used for other domains (concerning language design, software design, and implementation techniques).
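A monadic system pairs a state space with a transition function whose possible next states live inside a monad. The following is a rough illustration only, fixing the monad to nondeterminism via std::vector, whereas the paper's library abstracts over the monad with C++ concepts; all names here are ours:

```cpp
#include <utility>
#include <vector>

// A system step: map a state to the collection of possible next states
// (std::vector plays the role of the nondeterminism monad M).
template <typename X>
using Step = std::vector<X> (*)(const X&);

// All trajectories of length n+1 from a start state: iterate the monadic
// bind of the vector monad, extending each path by each successor.
template <typename X>
std::vector<std::vector<X>> trajectories(Step<X> step, const X& start, int n) {
    std::vector<std::vector<X>> paths{{start}};
    for (int i = 0; i < n; ++i) {
        std::vector<std::vector<X>> next;
        for (const auto& p : paths)
            for (const X& x : step(p.back())) {
                auto q = p;
                q.push_back(x);
                next.push_back(std::move(q));
            }
        paths = std::move(next);
    }
    return paths;
}
```

For a deterministic system the monad would be the identity; for a stochastic one, a probability distribution. Abstracting over that choice is exactly what the library's conceptual framework is for.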
Earth’s Future | 2018
Sally Brown; Robert J. Nicholls; Philip Goodwin; Ivan D. Haigh; Daniel Lincke; Athanasios T. Vafeidis; Jochen Hinkel
We use multiple synthetic mitigation sea-level scenarios, together with a non-mitigation sea-level scenario from the Warming Acidification and Sea-level Projector model. We find that sea-level rise continues to accelerate post-2100 for all but the most aggressive mitigation scenarios, indicative of 1.5°C and 2.0°C. Using the Dynamic Interactive Vulnerability Assessment modelling framework, we project land and population exposed in the 1-in-100-year coastal flood plain under sea-level rise and population change. In 2000, the flood plain is estimated at 540 × 10³ km². By 2100, under the mitigation scenarios, it ranges between 610 × 10³ km² and 640 × 10³ km² [580 × 10³ km² and 700 × 10³ km² for the 5th and 95th percentiles]; differences between the mitigation scenarios are thus small in 2100. However, by 2300 the flood plain is projected to increase to between 700 × 10³ km² and 960 × 10³ km² [610 × 10³ km² and 1,290 × 10³ km²] for the mitigation scenarios, but to 1,630 × 10³ km² [1,190 × 10³ km² and 2,220 × 10³ km²] for the non-mitigation scenario. The proportion of global population exposed to sea-level rise in 2300 is projected to be between 1.5% and 5.4% [1.2% to 7.6%] (assuming no population growth after 2100) for the aggressive mitigation and the non-mitigation scenarios, respectively. Hence, over centennial timescales there are significant benefits to climate change mitigation and temperature stabilization. However, sea levels will continue to rise, albeit at lower rates, so potential impacts will keep increasing, necessitating adaptation of existing coastal infrastructure and careful planning of new coastal developments. Plain Language Summary: If we reduce greenhouse gas emissions and stabilize global temperatures, sea-level rise (SLR) will continue at a reduced rate for centuries. This is because changes to the ocean and cryosphere (ice) which contribute to SLR take very long timescales to respond to changes in global warming. Early and aggressive climate change mitigation will be most effective to reduce flood risk, particularly after the 21st century. Even with climate change mitigation, the land area exposed to coastal flooding will continue to increase for centuries. Adapting the coast to cope with rising sea levels is inevitably required. The long-term implications for coastal habitation need to be considered.
Workshop on Generic Programming | 2009
Daniel Lincke; Sibylle Schupp
Higher-order functions are essential for generic programming. While they are naturally supported in functional programming languages, C++ has no built-in higher-order functions; instead, various function datatypes simulate the effect of higher-order programming in different ways, as classes designed in the imperative, object-oriented, or meta-programming spirit. With the recent addition of concepts, another alternative for supporting parameterization by functions has become available. No guidelines exist, however, on whether, or when, to prefer a concept for functions over any of the existing function datatypes; nor have the function datatypes themselves been compared. We provide an empirical study that assesses performance, expressivity, and convenience. The study shows that the function concept mechanism is faster and at least as expressive as the best function datatype but, due to the principal difference between concept-based and type-based approaches, also less convenient to use.
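The concept-versus-datatype trade-off can be illustrated in today's terms. This sketch uses C++20 concepts, which differ from the 2009 concepts design the paper studied: the type-based std::function erases the callable's concrete type, while a constrained template keeps it statically known and therefore inlinable.

```cpp
#include <concepts>
#include <functional>
#include <vector>

// Type-based: the function datatype fixes the signature and erases the
// callable's type, so calls go through an indirection.
double sumByType(const std::vector<double>& v,
                 const std::function<double(double)>& f) {
    double s = 0.0;
    for (double x : v) s += f(x);
    return s;
}

// Concept-based: the template accepts any callable invocable with a double;
// the callable's type is statically known, so the call can be inlined.
template <std::invocable<double> F>
double sumByConcept(const std::vector<double>& v, F f) {
    double s = 0.0;
    for (double x : v) s += f(x);
    return s;
}
```

The convenience cost the paper measures shows up here too: sumByConcept must be a template visible at every use site, whereas sumByType can live behind a stable, separately compiled interface.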
Nature | 2018
Mark Schuerch; T. Spencer; Stijn Temmerman; Matthew L. Kirwan; Claudia Wolff; Daniel Lincke; Chris McOwen; Mark Pickering; Ruth Reef; Athanasios T. Vafeidis; Jochen Hinkel; Robert J. Nicholls; Sally Brown