
Publications

Featured research published by Andrew Schurer.


Environmental Research Letters | 2016

European summer temperatures since Roman times

Jürg Luterbacher; Johannes P. Werner; Jason E. Smerdon; Laura Fernández-Donado; Fidel González-Rouco; David Barriopedro; Fredrik Charpentier Ljungqvist; Ulf Büntgen; E. Zorita; S. Wagner; Jan Esper; Danny McCarroll; Andrea Toreti; David Frank; Johann H. Jungclaus; Mariano Barriendos; Chiara Bertolin; Oliver Bothe; Rudolf Brázdil; Dario Camuffo; Petr Dobrovolný; Mary Gagen; E. García-Bustamante; Quansheng Ge; Juan J. Gomez-Navarro; Joël Guiot; Zhixin Hao; Gabi Hegerl; Karin Holmgren; V.V. Klimenko

The spatial context is critical when assessing present-day climate anomalies, attributing them to potential forcings and making statements regarding their frequency and severity in a long-term perspective. Recent international initiatives have expanded the number of high-quality proxy-records and developed new statistical reconstruction methods. These advances allow more rigorous regional past temperature reconstructions and, in turn, the possibility of evaluating climate models on policy-relevant, spatiotemporal scales. Here we provide a new proxy-based, annually-resolved, spatial reconstruction of the European summer (June-August) temperature fields back to 755 CE based on Bayesian hierarchical modelling (BHM), together with estimates of the European mean temperature variation since 138 BCE based on BHM and composite-plus-scaling (CPS). Our reconstructions compare well with independent instrumental and proxy-based temperature estimates, but suggest a larger amplitude in summer temperature variability than previously reported. Both CPS and BHM reconstructions indicate that the mean 20th century European summer temperature was not significantly different from some earlier centuries, including the 1st, 2nd, 8th and 10th centuries CE. The 1st century (in BHM also the 10th century) may even have been slightly warmer than the 20th century, but the difference is not statistically significant. Comparing each 50 yr period with the 1951-2000 period reveals a similar pattern. Recent summers, however, have been unusually warm in the context of the last two millennia and there are no 30 yr periods in either reconstruction that exceed the mean European summer temperature of the last 3 decades (1986-2015 CE). A comparison with an ensemble of climate model simulations suggests that the reconstructed European summer temperature variability over the period 850-2000 CE reflects changes in both internal variability and external forcing on multi-decadal time-scales.
For pan-European temperatures we find slightly better agreement between the reconstruction and the model simulations with high-end estimates for total solar irradiance. Temperature differences between the medieval period, the recent period and the Little Ice Age are larger in the reconstructions than the simulations. This may indicate inflated variability of the reconstructions, a lack of sensitivity and processes to changes in external forcing on the simulated European climate and/or an underestimation of internal variability on centennial and longer time scales.
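The composite-plus-scaling (CPS) method named above is simple enough to sketch: standardized proxy series are averaged into a composite, which is then rescaled to match the mean and variance of the instrumental target over a calibration window. A minimal illustration on synthetic data (all series, proxy counts and noise levels here are invented for demonstration, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "truth": a temperature series with a long-term trend plus noise.
years = np.arange(1000, 2001)
truth = 0.001 * (years - years[0]) + 0.2 * rng.standard_normal(years.size)

# Synthetic proxies: each records the truth with its own scale, offset and noise.
n_proxies = 12
proxies = np.stack([
    rng.uniform(0.5, 2.0) * truth + rng.normal(0, 0.3, truth.size) + rng.uniform(-1, 1)
    for _ in range(n_proxies)
])

# Step 1 (composite): standardize each proxy over the full period, then average.
standardized = (proxies - proxies.mean(axis=1, keepdims=True)) / proxies.std(axis=1, keepdims=True)
composite = standardized.mean(axis=0)

# Step 2 (scaling): rescale the composite so its mean and variance match the
# instrumental target over a calibration window (here 1900-2000).
cal = years >= 1900
target = truth[cal]  # stand-in for instrumental observations
scaled = (composite - composite[cal].mean()) / composite[cal].std()
reconstruction = scaled * target.std() + target.mean()

# Correlation with truth outside the calibration window gauges skill.
skill = np.corrcoef(reconstruction[~cal], truth[~cal])[0, 1]
```

By construction the reconstruction reproduces the calibration-period mean and variance exactly; its skill before the calibration window is what validation exercises test.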


Journal of Climate | 2013

Separating Forced from Chaotic Climate Variability over the Past Millennium

Andrew Schurer; Gabriele C. Hegerl; Michael E. Mann; Simon F. B. Tett; Steven J. Phipps

Reconstructions of past climate show notable temperature variability over the past millennium, with relatively warm conditions during the Medieval Climate Anomaly (MCA) and a relatively cold Little Ice Age (LIA). Multimodel simulations of the past millennium are used together with a wide range of reconstructions of Northern Hemispheric mean annual temperature to separate climate variability from 850 to 1950 CE into components attributable to external forcing and internal climate variability. External forcing is found to contribute significantly to long-term temperature variations irrespective of the proxy reconstruction, particularly from 1400 onward. Over the MCA alone, however, the effect of forcing is only detectable in about half of the reconstructions considered, and the response to forcing in the models cannot explain the warm conditions around 1000 CE seen in some reconstructions. The residual from the detection analysis is used to estimate internal variability independent from climate modeling, and it is found that the recent observed 50- and 100-yr hemispheric temperature trends are substantially larger than any of the internally generated trends, even using the large residuals over the MCA. Variations in solar output and explosive volcanism are found to be the main drivers of climate change from 1400 to 1900, but for the first time a significant contribution from greenhouse gas variations to the cold conditions during 1600–1800 is also detected. The proxy reconstructions tend to show a smaller forced response than is simulated by the models. This discrepancy is shown, at least partly, to be likely associated with the difference in the response to large volcanic eruptions between reconstructions and model simulations.


Bulletin of the American Meteorological Society | 2017

Estimating Changes in Global Temperature since the Preindustrial Period

Ed Hawkins; Pablo Ortega; Emma B. Suckling; Andrew Schurer; Gabi Hegerl; Phil D. Jones; Manoj Joshi; Timothy J. Osborn; Valérie Masson-Delmotte; Juliette Mignot; Peter W. Thorne; Geert Jan van Oldenborgh

The United Nations Framework Convention on Climate Change (UNFCCC) process agreed in Paris to limit global surface temperature rise to “well below 2°C above pre-industrial levels.” But what period is preindustrial? Somewhat remarkably, this is not defined within the UNFCCC’s many agreements and protocols. Nor is it defined in the IPCC’s Fifth Assessment Report (AR5) in the evaluation of when particular temperature levels might be reached because no robust definition of the period exists. Here we discuss the important factors to consider when defining a preindustrial period, based on estimates of historical radiative forcings and the availability of climate observations. There is no perfect period, but we suggest that 1720–1800 is the most suitable choice when discussing global temperature limits. We then estimate the change in global average temperature since preindustrial using a range of approaches based on observations, radiative forcings, global climate model simulations, and proxy evidence. Our assessment is that this preindustrial period was likely 0.55°–0.80°C cooler than 1986–2005 and that 2015 was likely the first year in which global average temperature was more than 1°C above preindustrial levels. We provide some recommendations for how this assessment might be improved in the future and suggest that reframing temperature limits with a modern baseline would be inherently less uncertain and more policy relevant.
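The arithmetic behind such a baseline comparison is straightforward: given an anomaly series referenced to 1986-2005, the estimated warming since a candidate preindustrial window is just the negative of that window's mean anomaly. A toy sketch with a synthetic series (the warming curve and noise level are invented, not the paper's data):

```python
import numpy as np

# Illustrative annual global-mean temperature anomalies, built from a smooth
# warming curve plus noise (synthetic, for demonstration only).
rng = np.random.default_rng(1)
years = np.arange(1720, 2016)
anom = -0.7 + 0.7 * ((years - 1720) / (2015 - 1720)) ** 2 + 0.1 * rng.standard_normal(years.size)

# Re-reference so that the 1986-2005 mean is exactly zero.
ref = (years >= 1986) & (years <= 2005)
anom -= anom[ref].mean()

def warming_since(start, end, years=years, anom=anom):
    """Warming of 1986-2005 relative to a candidate preindustrial window."""
    win = (years >= start) & (years <= end)
    return -anom[win].mean()

# Different choices of preindustrial window give different estimated warming.
for period in [(1720, 1800), (1850, 1900)]:
    print(period, round(warming_since(*period), 2))
```

An earlier, cooler window necessarily yields a larger estimated warming, which is why the choice of period matters for statements about temperature limits.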


Geophysical Research Letters | 2016

The importance of ENSO phase during volcanic eruptions for detection and attribution

Flavio Lehner; Andrew Schurer; Gabriele C. Hegerl; Clara Deser; Thomas L. Frölicher

Comparisons of the observed global-scale cooling following recent volcanic eruptions to that simulated by climate models from the Coupled Model Intercomparison Project 5 (CMIP5) indicate that the models overestimate the magnitude of the global temperature response to volcanic eruptions. Here we show that this overestimation can be explained as a sampling issue, arising because all large eruptions since 1951 coincided with El Nino events, which cause global-scale warming that partially counteracts the volcanically induced cooling. By subsampling the CMIP5 models according to the observed El Nino–Southern Oscillation (ENSO) phase during each eruption, we find that the simulated global temperature response to volcanic forcing is consistent with observations. Volcanic eruptions pose a particular challenge for the detection and attribution methodology, as their surface impacts are short-lived and hence can be confounded by ENSO. Our results imply that detection and attribution studies must carefully consider sampling biases due to internal climate variability.
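The subsampling logic can be illustrated with toy numbers: if each ensemble member's post-eruption anomaly is a fixed volcanic cooling plus an independent ENSO contribution, then conditioning on El Niño members, as the observed eruptions require, yields a weaker apparent cooling than the full-ensemble mean. A schematic sketch with invented magnitudes:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy ensemble of post-eruption global-mean anomalies (degrees C):
# a fixed volcanic cooling plus an ENSO contribution that varies by member.
n = 10_000
volcanic_cooling = -0.3                      # assumed forced response
enso_index = rng.standard_normal(n)          # member's ENSO state at eruption
anomaly = volcanic_cooling + 0.15 * enso_index + 0.05 * rng.standard_normal(n)

# Averaging over ALL members mixes ENSO phases and recovers the full cooling...
full_ensemble_mean = anomaly.mean()

# ...but observed eruptions coincided with El Nino, so subsample accordingly:
# El Nino warming partially offsets the volcanic cooling.
el_nino = enso_index > 0.5
subsampled_mean = anomaly[el_nino].mean()
```

Comparing the full-ensemble mean with observations would then wrongly suggest the model overestimates the volcanic response; the subsampled mean is the like-for-like comparison.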


Geophysical Research Letters | 2015

Determining the likelihood of pauses and surges in global warming

Andrew Schurer; Gabriele C. Hegerl; Stephen Obrochta

The recent warming “hiatus” is subject to intense interest, with proposed causes including natural forcing and internal variability. Here we derive samples of all natural and internal variability from observations and a recent proxy reconstruction to investigate the likelihood that these two sources of variability could produce a hiatus or rapid warming in surface temperature. The likelihood is found to be consistent with that calculated previously for models and exhibits a similar spatial pattern, with an Interdecadal Pacific Oscillation-like structure, although with more signal in the Atlantic than in model patterns. The number and length of events increases if natural forcing is also considered, particularly in the models. From the reconstruction it can be seen that large eruptions, such as Mount Tambora in 1815, or clusters of eruptions, may result in a hiatus of over 20 years, a finding supported by model results.
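Estimating such likelihoods from a sample of internal variability amounts to counting trends in a sliding window. A minimal sketch using a synthetic AR(1) stand-in for internal variability (persistence and noise parameters are invented, not derived from the observations or reconstruction used in the paper):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy sample of internal variability: a long AR(1) series of annual
# global-mean temperature anomalies.
n_years = 5000
phi, sigma = 0.6, 0.1
noise = rng.normal(0, sigma, n_years)
var = np.empty(n_years)
var[0] = noise[0]
for t in range(1, n_years):
    var[t] = phi * var[t - 1] + noise[t]

def trend(y):
    """Least-squares slope per year."""
    x = np.arange(y.size)
    return np.polyfit(x, y, 1)[0]

# Fraction of overlapping 10-yr windows with a negative trend: a crude
# estimate of how often internal variability alone produces a "hiatus".
window = 10
trends = np.array([trend(var[i:i + window]) for i in range(n_years - window)])
p_hiatus = (trends < 0).mean()
```

Adding a forced warming trend to the series before counting would lower this fraction, while superimposing volcanic cooling episodes would raise it, mirroring the forcing-dependence described above.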


Monthly Notices of the Royal Astronomical Society | 2014

grasil-3d: an implementation of dust effects in the SEDs of simulated galaxies

Rosa Dominguez-Tenreiro; A. Obreja; G. L. Granato; Andrew Schurer; Paula Alpresa; L. Silva; Chris B. Brook; Arturo Serna

We introduce a new model for the spectral energy distribution of galaxies, GRASIL-3D, which includes a careful modelling of the dust component of the interstellar medium. GRASIL-3D is an entirely new model based on the formalism of an existing and widely applied spectrophotometric model, GRASIL, but specifically designed to be interfaced with galaxies with any arbitrarily given geometry, such as galaxies calculated by theoretical hydrodynamical galaxy formation codes. GRASIL-3D is designed to separately treat radiative transfer in molecular clouds and in the diffuse cirrus component. The code has a general applicability to the outputs of simulated galaxies, either from Lagrangian or Eulerian hydrodynamic codes. As an application, the new model has been interfaced to the P-DEVA and GASOLINE smoothed-particle hydrodynamic codes, and has been used to calculate the spectral energy distribution for a variety of simulated galaxies from UV to sub-millimeter wavelengths, whose comparison with observational data gives encouraging results. In addition, GRASIL-3D allows 2D images of such galaxies to be obtained, at several angles and in different bands.


Nature Climate Change | 2017

Importance of the pre-industrial baseline for likelihood of exceeding Paris goals

Andrew Schurer; Michael E. Mann; Ed Hawkins; Simon F. B. Tett; Gabriele C. Hegerl

During the Paris Conference in 2015, nations of the world strengthened the United Nations Framework Convention on Climate Change by agreeing to hold “the increase in the global average temperature to well below 2°C above pre-industrial levels and pursuing efforts to limit the temperature increase to 1.5°C”1. However, “pre-industrial” was not defined. Here we investigate the implications of different choices of the pre-industrial baseline on the likelihood of exceeding these two temperature thresholds. We find that for the strongest mitigation scenario RCP2.6 and a medium scenario RCP4.5, the probability of exceeding the thresholds and the timing of exceedance are highly dependent on the pre-industrial baseline; for example, the probability of crossing 1.5°C by the end of the century under RCP2.6 varies from 61% to 88% depending on how the baseline is defined. In contrast, in the scenario with no mitigation, RCP8.5, both thresholds will almost certainly be exceeded by the middle of the century, with the definition of the pre-industrial baseline being of less importance. Allowable carbon emissions for threshold stabilisation are similarly highly dependent on the pre-industrial baseline. For stabilisation at 2°C, allowable emissions decrease by as much as 40% when climates earlier than the 19th century are considered as a baseline.
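The sensitivity of exceedance probabilities to the baseline can be reproduced with a toy calculation: shift an ensemble of projected warming values by the assumed baseline offset and count threshold crossings. All numbers below are invented for illustration, not the paper's results:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy ensemble of end-of-century warming relative to 1850-1900 (deg C);
# the spread stands in for model and scenario uncertainty.
warming_vs_1850_1900 = rng.normal(1.4, 0.25, 100_000)

def p_exceed(threshold, baseline_offset):
    """Probability of exceeding `threshold` when the pre-industrial baseline
    is `baseline_offset` deg C cooler than 1850-1900."""
    return float((warming_vs_1850_1900 + baseline_offset > threshold).mean())

# An earlier (cooler) baseline raises the probability of crossing 1.5 C.
p_same = p_exceed(1.5, 0.0)     # baseline = 1850-1900
p_earlier = p_exceed(1.5, 0.1)  # baseline assumed 0.1 C cooler
```

When the ensemble mean sits far above or below the threshold, as in the unmitigated scenario, the baseline shift barely moves the probability; it matters most when the distribution straddles the threshold.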


Monthly Notices of the Royal Astronomical Society | 2010

Modelling the spectral energy distribution of galaxies: introducing the artificial neural network

L. Silva; Andrew Schurer; G. L. Granato; C. Almeida; Carlton M. Baugh; Carlos S. Frenk; Cedric G. Lacey; L. Paoletti; A. Petrella; D. Selvestrel

The spectral energy distribution (SED) of galaxies is a complex function of the star formation history and geometrical arrangement of stars and gas in galaxies. The computation of the radiative transfer of stellar radiation through the dust distribution is time-consuming. This cost becomes prohibitive in particular when dealing with predictions from semi-analytical galaxy formation models populating cosmological volumes, which are then compared with multi-wavelength surveys. Mainly for this aim, we have implemented an artificial neural network (ANN) algorithm into the spectro-photometric and radiative transfer code GRASIL in order to compute the SED of galaxies in a short computing time. This allows us to avoid the adoption of empirical templates that may have nothing to do with the mock galaxies output by models. The ANN has been implemented to compute the dust emission spectrum (the bottleneck of the computation), separately for the star-forming molecular clouds (MC) and the diffuse dust (due to their different properties and dependencies). We have defined the input neurons that effectively determine their emission, which means this implementation has a general applicability and is not linked to a particular galaxy formation model. We have trained the net for the disc and spherical geometries, and tested its performance in reproducing the SED of disc and starburst galaxies, as well as for a semi-analytical model for spheroidal galaxies. We have checked that for this model both the SEDs and the galaxy counts in the Herschel bands obtained with the ANN approximation are almost indistinguishable from the same quantities obtained with the full GRASIL. We conclude that this method appears robust and advantageous, and we will present the application to a more complex SAM in another paper.
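The emulation idea itself, training a small network to reproduce the output of an expensive computation, can be sketched in plain NumPy. The "expensive model" below is a stand-in function, and the network size and training settings are invented; this is a generic illustration, not the GRASIL implementation:

```python
import numpy as np

rng = np.random.default_rng(5)

# Stand-in for an expensive model: a smooth nonlinear 1-D function.
def expensive_model(x):
    return np.sin(3 * x) + 0.5 * x

# Training set: inputs and the expensive model's outputs.
x = rng.uniform(-1, 1, (256, 1))
y = expensive_model(x)

# One-hidden-layer network trained by plain gradient descent (tanh units).
W1 = rng.normal(0, 0.5, (1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.5, (32, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

lr = 0.05
for _ in range(2000):
    h, pred = forward(x)
    err = pred - y                       # gradient of squared error w.r.t. pred
    gW2 = h.T @ err / len(x); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)     # backpropagate through tanh
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

# The trained net now approximates the expensive model at negligible cost.
_, fit = forward(x)
mse = float(((fit - y) ** 2).mean())
target_var = float(((y - y.mean()) ** 2).mean())
```

Once trained, evaluating the network is orders of magnitude cheaper than the original computation, which is the payoff when millions of model galaxies must be processed.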



Nature Geoscience | 2018

Interpretations of the Paris climate target

Andrew Schurer; Kevin Cowtan; Ed Hawkins; Michael E. Mann; V. Scott; Simon F. B. Tett

In the 2015 UNFCCC Paris Agreement, article 2 expresses the target of “Holding the increase in global temperature to well below 2 °C above pre-industrial levels and pursuing efforts to limit the temperature increase to 1.5 °C … recognizing that this would significantly reduce the risks and impacts of climate change”1. Different interpretations of the precise meaning of the phrases ‘increase in global temperature’2 and ‘pre-industrial’3 could have large effects on mitigation requirements and corresponding social, policy and political responses. Here we suggest that levels of current global mean surface warming since pre-industrial times higher than those derived by Millar et al. could have been calculated using alternative, but equally valid, assumptions to those made by those authors.

Collaboration

Top co-authors of Andrew Schurer:

Gabi Hegerl, University of Edinburgh
Michael E. Mann, Pennsylvania State University
Malte Meinshausen, Potsdam Institute for Climate Impact Research
Ulf Büntgen, University of Cambridge