David J. Frame
Victoria University of Wellington
Publications
Featured research published by David J. Frame.
Nature | 2009
Malte Meinshausen; Nicolai Meinshausen; William Hare; Sarah Raper; Katja Frieler; Reto Knutti; David J. Frame; Myles R. Allen
More than 100 countries have adopted a global warming limit of 2 °C or below (relative to pre-industrial levels) as a guiding principle for mitigation efforts to reduce climate change risks, impacts and damages. However, the greenhouse gas (GHG) emissions corresponding to a specified maximum warming are poorly known owing to uncertainties in the carbon cycle and the climate response. Here we provide a comprehensive probabilistic analysis aimed at quantifying GHG emission budgets for the 2000–50 period that would limit warming throughout the twenty-first century to below 2 °C, based on a combination of published distributions of climate system properties and observational constraints. We show that, for the chosen class of emission scenarios, both cumulative emissions up to 2050 and emission levels in 2050 are robust indicators of the probability that twenty-first century warming will not exceed 2 °C relative to pre-industrial temperatures. Limiting cumulative CO2 emissions over 2000–50 to 1,000 Gt CO2 yields a 25% probability of warming exceeding 2 °C—and a limit of 1,440 Gt CO2 yields a 50% probability—given a representative estimate of the distribution of climate system properties. As known 2000–06 CO2 emissions were ∼234 Gt CO2, less than half the proven economically recoverable oil, gas and coal reserves can still be emitted up to 2050 to achieve such a goal. Recent G8 Communiqués envisage halved global GHG emissions by 2050, for which we estimate a 12–45% probability of exceeding 2 °C—assuming 1990 as emission base year and a range of published climate sensitivity distributions. Emission levels in 2020 are a less robust indicator, but for the scenarios considered, the probability of exceeding 2 °C rises to 53–87% if global GHG emissions are still more than 25% above 2000 levels in 2020.
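The two headline (budget, probability) pairs from this abstract can be turned into a toy exceedance-probability lookup. This is only an illustrative sketch: the linear interpolation between the two published points is an assumption made here, not the paper's method, which derives the full relationship from probabilistic ensemble simulations.

```python
import numpy as np

# Two (cumulative 2000-50 CO2 emissions, P(exceed 2 degC)) pairs reported
# in the abstract, for a representative climate-property distribution.
budgets_gt = np.array([1000.0, 1440.0])   # Gt CO2
p_exceed   = np.array([0.25, 0.50])

def exceedance_probability(cumulative_gt: float) -> float:
    """Linearly interpolate P(>2 degC) between the two published points.

    The linear form is an illustrative assumption; the paper derives
    the full curve from ensemble runs.
    """
    return float(np.interp(cumulative_gt, budgets_gt, p_exceed))

# 2000-06 emissions alone (~234 Gt CO2) already used up roughly a
# quarter of the 1,000 Gt budget.
print(exceedance_probability(1220.0))  # midpoint of the two budgets -> 0.375
```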
Nature | 2005
David A. Stainforth; Tolu Aina; Claus Lynge Christensen; Matthew D. Collins; N. E. Faull; David J. Frame; J. A. Kettleborough; Sylvia H. E. Knight; Andrew R. Martin; J. M. Murphy; C. Piani; D. Sexton; Leonard A. Smith; Robert A. Spicer; A. J. Thorpe; Myles R. Allen
The range of possibilities for future climate evolution needs to be taken into account when planning climate change mitigation and adaptation strategies. This requires ensembles of multi-decadal simulations to assess both chaotic climate variability and model response uncertainty. Statistical estimates of model response uncertainty, based on observations of recent climate change, admit climate sensitivities—defined as the equilibrium response of global mean temperature to doubling levels of atmospheric carbon dioxide—substantially greater than 5 K. But such strong responses are not used in ranges for future climate change because they have not been seen in general circulation models. Here we present results from the ‘climateprediction.net’ experiment, the first multi-thousand-member grand ensemble of simulations using a general circulation model and thereby explicitly resolving regional details. We find model versions as realistic as other state-of-the-art climate models but with climate sensitivities ranging from less than 2 K to more than 11 K. Models with such extreme sensitivities are critical for the study of the full range of possible responses of the climate system to rising greenhouse gas levels, and for assessing the risks associated with specific targets for stabilizing these levels.
Nature | 2009
Myles R. Allen; David J. Frame; Chris Huntingford; Chris Jones; Jason Lowe; Malte Meinshausen; Nicolai Meinshausen
Global efforts to mitigate climate change are guided by projections of future temperatures. But the eventual equilibrium global mean temperature associated with a given stabilization level of atmospheric greenhouse gas concentrations remains uncertain, complicating the setting of stabilization targets to avoid potentially dangerous levels of global warming. Similar problems apply to the carbon cycle: observations currently provide only a weak constraint on the response to future emissions. Here we use ensemble simulations of simple climate-carbon-cycle models constrained by observations and projections from more comprehensive models to simulate the temperature response to a broad range of carbon dioxide emission pathways. We find that the peak warming caused by a given cumulative carbon dioxide emission is better constrained than the warming response to a stabilization scenario. Furthermore, the relationship between cumulative emissions and peak warming is remarkably insensitive to the emission pathway (timing of emissions or peak emission rate). Hence policy targets based on limiting cumulative emissions of carbon dioxide are likely to be more robust to scientific uncertainty than emission-rate or concentration targets. Total anthropogenic emissions of one trillion tonnes of carbon (3.67 trillion tonnes of CO2), about half of which has already been emitted since industrialization began, results in a most likely peak carbon-dioxide-induced warming of 2 °C above pre-industrial temperatures, with a 5–95% confidence interval of 1.3–3.9 °C.
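The headline numbers above imply a near-linear scaling of peak warming with cumulative carbon. The sketch below anchors that scaling to the abstract's figures (1 TtC gives a most likely peak warming of 2 °C, 5–95% range 1.3–3.9 °C); treating the relationship as exactly linear through the origin is a simplifying assumption, not a claim from the paper.

```python
# Best estimate and 5-95% bounds of peak CO2-induced warming per
# trillion tonnes of carbon (TtC), taken from the abstract.
BEST, LOW, HIGH = 2.0, 1.3, 3.9  # degC per TtC

def peak_warming(cumulative_ttc: float) -> tuple[float, float, float]:
    """Return (best estimate, 5th, 95th percentile) peak warming in degC,
    assuming a linear cumulative-emissions/peak-warming relationship."""
    return (BEST * cumulative_ttc, LOW * cumulative_ttc, HIGH * cumulative_ttc)

# About half a trillion tonnes had already been emitted by 2009:
print(peak_warming(0.5))  # -> (1.0, 0.65, 1.95)
```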
Nature | 2006
Gabriele C. Hegerl; Thomas J. Crowley; William T. Hyde; David J. Frame
A Brief Communications Arising (01 March 2007) is associated with this paper. The magnitude and impact of future global warming depend on the sensitivity of the climate system to changes in greenhouse gas concentrations. The commonly accepted range for the equilibrium global mean temperature change in response to a doubling of the atmospheric carbon dioxide concentration, termed climate sensitivity, is 1.5–4.5 K (ref. 2). A number of observational studies, however, find a substantial probability of significantly higher sensitivities, yielding upper limits on climate sensitivity of 7.7 K to above 9 K (refs 3–8). Here we demonstrate that such observational estimates of climate sensitivity can be tightened if reconstructions of Northern Hemisphere temperature over the past several centuries are considered. We use large-ensemble energy balance modelling and simulate the temperature response to past solar, volcanic and greenhouse gas forcing to determine which climate sensitivities yield simulations that are in agreement with proxy reconstructions. After accounting for the uncertainty in reconstructions and estimates of past external forcing, we find an independent estimate of climate sensitivity that is very similar to those from instrumental data. If the latter are combined with the result from all proxy reconstructions, then the 5–95 per cent range shrinks to 1.5–6.2 K, thus substantially reducing the probability of very high climate sensitivity.
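The energy balance modelling described above can be illustrated with a zero-dimensional model, C dT/dt = F(t) − λT, where the feedback parameter λ is set by the assumed climate sensitivity. This is a generic sketch of the model class, not the configuration Hegerl et al. used; the forcing and heat-capacity values below are illustrative assumptions.

```python
import numpy as np

def ebm_response(forcing, sensitivity, dt=1.0, heat_capacity=8.0):
    """Zero-dimensional energy balance model: C dT/dt = F(t) - lambda*T.

    `sensitivity` is the equilibrium warming (K) for a 2xCO2 forcing of
    ~3.7 W m-2, so lambda = 3.7 / sensitivity. The heat capacity
    (W yr m-2 K-1) and step forcing are illustrative, not the paper's.
    """
    lam = 3.7 / sensitivity
    T = np.zeros(len(forcing))
    for i in range(1, len(forcing)):  # forward Euler step
        T[i] = T[i - 1] + dt * (forcing[i - 1] - lam * T[i - 1]) / heat_capacity
    return T

# Sweep candidate sensitivities against a sustained 2xCO2-like forcing;
# in a large ensemble, each response would be scored against reconstructions.
forcing = np.full(500, 3.7)  # W m-2
for S in (1.5, 3.0, 6.0):
    print(S, round(ebm_response(forcing, S)[-1], 2))  # approaches S at equilibrium
```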
Journal of Climate | 2011
Duncan Ackerley; Ben B. B. Booth; Sylvia H. E. Knight; Eleanor J. Highwood; David J. Frame; Myles R. Allen; David P. Rowell
A full understanding of the causes of the severe drought seen in the Sahel in the latter part of the twentieth century remains elusive some 25 yr after the height of the event. Previous studies have suggested that this drying trend may be explained by either decadal modes of natural variability or by human-driven emissions (primarily aerosols), but these studies lacked a sufficiently large number of models to attribute one cause over the other. In this paper, signatures of both aerosol and greenhouse gas changes on Sahel rainfall are illustrated. These idealized responses are used to interpret the results of historical Sahel rainfall changes from two very large ensembles of fully coupled climate models, which both sample uncertainties arising from internal variability and model formulation. The sizes of these ensembles enable the relative role of human-driven changes and natural variability on historic Sahel rainfall to be assessed. The paper demonstrates that historic aerosol changes are likely t...
Proceedings of the National Academy of Sciences of the United States of America | 2010
Kirsten Zickfeld; M. Granger Morgan; David J. Frame; David W. Keith
There is uncertainty about the response of the climate system to future trajectories of radiative forcing. To quantify this uncertainty we conducted face-to-face interviews with 14 leading climate scientists, using formal methods of expert elicitation. We structured the interviews around three scenarios of radiative forcing stabilizing at different levels. All experts ranked “cloud radiative feedbacks” as contributing most to their uncertainty about future global mean temperature change, irrespective of the specified level of radiative forcing. The experts disagreed about the relative contribution of other physical processes to their uncertainty about future temperature change. For a forcing trajectory that stabilized at 7 W m−2 in 2200, 13 of the 14 experts judged the probability that the climate system would undergo, or be irrevocably committed to, a “basic state change” as ≥0.5. The width and median values of the probability distributions elicited from the different experts for future global mean temperature change under the specified forcing trajectories vary considerably. Even for a moderate increase in forcing by the year 2050, the medians of the elicited distributions of temperature change relative to 2000 range from 0.8 to 1.8 °C, and some of the interquartile ranges do not overlap. Ten of the 14 experts estimated that the probability that equilibrium climate sensitivity exceeds 4.5 °C is > 0.17, our interpretation of the upper limit of the “likely” range given by the Intergovernmental Panel on Climate Change. Finally, most experts anticipated that over the next 20 years research will be able to achieve only modest reductions in their degree of uncertainty.
Journal of Climate | 2008
Benjamin M. Sanderson; Reto Knutti; Tolu Aina; Carl Christensen; N. E. Faull; David J. Frame; William Ingram; Claudio Piani; David A. Stainforth; Dáithí A. Stone; Myles R. Allen
A climate model emulator is developed using neural network techniques and trained with the data from the multithousand-member climateprediction.net perturbed physics GCM ensemble. The method recreates nonlinear interactions between model parameters, allowing a simulation of a much larger ensemble that explores model parameter space more fully. The emulated ensemble is used to search for models closest to observations over a wide range of equilibrium response to greenhouse gas forcing. The relative discrepancies of these models from observations could be used to provide a constraint on climate sensitivity. The use of annual mean or seasonal differences on top-of-atmosphere radiative fluxes as an observational error metric results in the most clearly defined minimum in error as a function of sensitivity, with consistent but less well-defined results when using the seasonal cycles of surface temperature or total precipitation. The model parameter changes necessary to achieve different values of climate sensitivity while minimizing discrepancy from observation are also considered and compared with previous studies. This information is used to propose more efficient parameter sampling strategies for future ensembles.
Philosophical Transactions of the Royal Society A | 2007
David J. Frame; N. E. Faull; Manoj Joshi; Myles R. Allen
The development of ensemble-based ‘probabilistic’ climate forecasts is often seen as a promising avenue for climate scientists. Ensemble-based methods allow scientists to produce more informative, nuanced forecasts of climate variables by reflecting uncertainty from various sources, such as similarity to observation and model uncertainty. However, these developments present challenges as well as opportunities, particularly surrounding issues of experimental design and interpretation of forecast results. This paper discusses different approaches and attempts to set out what climateprediction.net and other large ensemble, complex model experiments might contribute to this research programme.
Journal of Geophysical Research | 2008
Reto Knutti; Stefan Krähenmann; David J. Frame; Myles R. Allen
Schwartz [2007] (hereinafter referred to as SES) recently suggested a method to calculate equilibrium climate sensitivity (the equilibrium global surface warming for a doubling of the atmospheric CO2 concentration), the effective heat capacity of the Earth’s climate system, the temperature response time scale relevant to climate change, an estimate of total radiative forcing, and the magnitude of the aerosol forcing over the 20th century. The main results are that the characteristic response time scale of global temperature is 5 ± 1 years and climate sensitivity is 0.30 ± 0.14 K (W m−2)−1, corresponding to an equilibrium temperature increase for doubling atmospheric CO2 of 1.1 ± 0.5 K. In practical terms, this means that global surface temperature is nearly in equilibrium with radiative forcing, and that the sum of all feedbacks (water vapor, lapse rate, clouds, albedo) is close to zero. These results are at odds with most of the current literature on climate sensitivity, with the idea of committed warming, and with the magnitudes of climate feedbacks quantified in models and observations. If true, the low climate sensitivity would allow much higher atmospheric greenhouse gas concentrations to be consistent with a given stabilization temperature than the current consensus, and would imply that stabilization of atmospheric CO2 would lead to stabilization of global temperatures within a few years.

In simple terms, SES uses an energy balance argument to claim that climate sensitivity S is given by S = τ/C, where C is the effective heat capacity of the Earth and τ is a characteristic time scale. The effective heat capacity is obtained by a regression of observed global ocean heat uptake against global surface temperature (as done in several previous studies), while τ is obtained from the autocorrelation of the observed, linearly detrended global surface temperature time series. Details are discussed by Schwartz [2007].
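The two-step SES recipe described above (τ from the lag-1 autocorrelation of detrended temperature, C from a regression of ocean heat content on surface temperature, then S = τ/C) can be sketched directly. The series below are synthetic AR(1) data invented for illustration, with schematic units; they are not observational records, and this sketch takes no position on the critiques raised in the comment.

```python
import numpy as np

rng = np.random.default_rng(1)

def schwartz_sensitivity(temp, ocean_heat, dt=1.0):
    """Sketch of the SES estimate S = tau / C.

    tau: e-folding time from the lag-1 autocorrelation r1 of the linearly
         detrended temperature series, tau = -dt / ln(r1).
    C:   effective heat capacity from regressing ocean heat content on
         surface temperature. Units here are schematic.
    """
    t = np.arange(len(temp))
    resid = temp - np.polyval(np.polyfit(t, temp, 1), t)   # detrend
    r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]          # lag-1 autocorrelation
    tau = -dt / np.log(r1)
    C = np.polyfit(temp, ocean_heat, 1)[0]                 # regression slope
    return tau / C

# Synthetic AR(1) "temperature" with a weak trend, and a heat-content
# series constructed as C*T plus noise (assumed, illustrative data).
n, true_C, phi = 2000, 17.0, 0.8
T = np.zeros(n)
for i in range(1, n):
    T[i] = phi * T[i - 1] + rng.normal(scale=0.1)
T += 0.001 * np.arange(n)
Q = true_C * T + rng.normal(scale=0.2, size=n)
print(schwartz_sensitivity(T, Q))
```

Note how the answer hinges on a single autocorrelation time scale — the point the comment disputes, since the real climate system mixes fast and slow response times.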