Martin P. Tingley
Pennsylvania State University
Publications
Featured research published by Martin P. Tingley.
Journal of Climate | 2010
Martin P. Tingley; Peter John Huybers
Reconstructing the spatial pattern of a climate field through time from a dataset of overlapping instrumental and climate proxy time series is a nontrivial statistical problem. The need to transform the proxy observations into estimates of the climate field, and the fact that the observed time series are not uniformly distributed in space, further complicate the analysis. Current leading approaches to this problem are based on estimating the full covariance matrix between the proxy time series and instrumental time series over a "calibration" interval and then using this covariance matrix in the context of a linear regression to predict the missing instrumental values from the proxy observations for years prior to instrumental coverage. A fundamentally different approach to this problem is formulated by specifying parametric forms for the spatial covariance and temporal evolution of the climate field, as well as "observation equations" describing the relationship between the data types and the corresponding true values of the climate field. A hierarchical Bayesian model is used to assimilate both proxy and instrumental datasets and to estimate the probability distribution of all model parameters and the climate field through time on a regular spatial grid. The output from this approach includes an estimate of the full covariance structure of the climate field and model parameters as well as diagnostics that estimate the utility of the different proxy time series. This methodology is demonstrated using an instrumental surface temperature dataset after corrupting a number of the time series to mimic proxy observations. The results are compared to those achieved using the regularized expectation–maximization algorithm, and in these experiments the Bayesian algorithm produces reconstructions with greater skill. The assumptions underlying these two methodologies and the results of applying each to simple surrogate datasets are explored in greater detail in Part II.
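The model structure described in the abstract lends itself to a compact summary. As a hedged sketch (symbols and exact functional forms are chosen here for illustration; the paper's own notation and priors may differ), the process and observation levels might be written:

```latex
\begin{align*}
  \text{Process:} \quad & T_t - \mu = \alpha\,(T_{t-1} - \mu) + \varepsilon_t,
      \qquad \varepsilon_t \sim \mathcal{N}(0, \Sigma), \\
  & \Sigma_{ij} = \sigma^2 \exp(-\phi\, d_{ij})
      \quad \text{(spatial covariance decaying with distance)}, \\
  \text{Instrumental:} \quad & I_t = H_{I,t}\, T_t + e_{I,t},
      \qquad e_{I,t} \sim \mathcal{N}(0, \tau_I^2 \mathbf{I}), \\
  \text{Proxy:} \quad & P_t = \beta_0 + \beta_1\, H_{P,t}\, T_t + e_{P,t},
      \qquad e_{P,t} \sim \mathcal{N}(0, \tau_P^2 \mathbf{I}).
\end{align*}
```

Here $T_t$ is the climate field on the grid in year $t$, $d_{ij}$ is the distance between grid points, and the $H$ matrices select the grid cells where instrumental or proxy series exist in that year; posterior sampling then yields the field and all parameters jointly.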
Nature | 2013
Martin P. Tingley; Peter John Huybers
Recently observed extreme temperatures at high northern latitudes are rare by definition, making the longer time span afforded by climate proxies important for assessing how the frequency of such extremes may be changing. Previous reconstructions of past temperature variability have demonstrated that recent warmth is anomalous relative to preceding centuries or millennia, but extreme events can be more thoroughly evaluated using a spatially resolved approach that provides an ensemble of possible temperature histories. Here, using a hierarchical Bayesian analysis of instrumental, tree-ring, ice-core and lake-sediment records, we show that the magnitude and frequency of recent warm temperature extremes at high northern latitudes are unprecedented in the past 600 years. The summers of 2005, 2007, 2010 and 2011 were warmer than those of all prior years back to 1400 (probability P > 0.95), in terms of the spatial average. The summer of 2010 was the warmest in the previous 600 years in western Russia (P > 0.99) and probably the warmest in western Greenland and the Canadian Arctic as well (P > 0.90). These and other recent extremes greatly exceed those expected from a stationary climate, but can be understood as resulting from constant space–time variability about an increased mean temperature.
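The probability statements quoted above follow naturally from the ensemble of possible temperature histories the method produces. A minimal sketch of such an exceedance-probability calculation from posterior draws (the array names and synthetic stand-in data are illustrative, not the paper's):

```python
import numpy as np

# ensemble: (n_draws, n_years) posterior draws of the spatial-mean summer
# temperature, years 1400..2011 (synthetic stand-in for illustration)
rng = np.random.default_rng(0)
years = np.arange(1400, 2012)
ensemble = rng.normal(0.0, 0.3, size=(1000, years.size))
ensemble[:, years >= 2000] += 0.8  # fake recent warming, for the demo only

def prob_warmest(ensemble, years, target_year):
    """Fraction of draws in which `target_year` exceeds every prior year."""
    target = ensemble[:, years == target_year]   # (n_draws, 1)
    earlier = ensemble[:, years < target_year]   # (n_draws, n_prior)
    return np.mean(np.all(target > earlier, axis=1))

print(prob_warmest(ensemble, years, 2010))
```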
Journal of Climate | 2010
Martin P. Tingley; Peter John Huybers
Part I presented a Bayesian algorithm for reconstructing climate anomalies in space and time (BARCAST). This method involves specifying simple parametric forms for the spatial covariance and temporal evolution of the climate field as well as "observation equations" describing the relationships between the data types and the corresponding true values of the climate field. As this Bayesian approach to reconstructing climate fields is new and different, it is worthwhile to compare it in detail to the more established regularized expectation–maximization (RegEM) algorithm, which is based on an empirical estimate of the joint data covariance matrix and a multivariate regression of the instrumental time series onto the proxy time series. The differing assumptions made by BARCAST and RegEM are detailed, and the impacts of these differences on the analysis are discussed. Key distinctions between BARCAST and RegEM include their treatment of spatial and temporal covariance, the prior information that enter…
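For orientation, a schematic single update of a regularized EM imputation in the spirit of RegEM; this is not Schneider's full algorithm (which also updates the residual covariance and chooses the regularization adaptively), and the ridge parameter `lam` and toy data are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(6)

def regem_step(X, miss, lam=0.1):
    """One regularized EM update: re-estimate the missing entries of X
    from the current empirical covariance. X: (n_years, n_series) with
    current fill-in values; miss: boolean mask of missing entries."""
    mu = X.mean(axis=0)
    Xc = X - mu
    S = Xc.T @ Xc / (X.shape[0] - 1)           # empirical covariance
    X_new = X.copy()
    for i in range(X.shape[0]):
        m, o = miss[i], ~miss[i]
        if not m.any() or not o.any():
            continue
        # ridge-regularized regression of missing on observed series
        B = np.linalg.solve(S[np.ix_(o, o)] + lam * np.eye(o.sum()),
                            S[np.ix_(o, m)])
        X_new[i, m] = mu[m] + Xc[i, o] @ B
    return X_new

# toy data: three correlated series, first series missing early on
true = rng.multivariate_normal(
    [0, 0, 0], [[1, .8, .6], [.8, 1, .7], [.6, .7, 1]], 200)
miss = np.zeros_like(true, dtype=bool)
miss[:80, 0] = True                  # a "pre-instrumental" gap
X = np.where(miss, 0.0, true)        # initial fill
for _ in range(20):                  # iterate toward convergence
    X = regem_step(X, miss)
```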
Scientific Data | 2015
Jessica E. Tierney; Martin P. Tingley
Quantitative estimates of past temperature changes are a cornerstone of paleoclimatology. For a number of marine sediment-based proxies, the accuracy and precision of past temperature reconstructions depends on a spatial calibration of modern surface sediment measurements to overlying water temperatures. Here, we present a database of 1095 surface sediment measurements of TEX86, a temperature proxy based on the relative cyclization of marine archaeal glycerol dialkyl glycerol tetraether (GDGT) lipids. The dataset is archived in a machine-readable format with geospatial information, fractional abundances of lipids (if available), and metadata. We use this new database to update surface and subsurface temperature calibration models for TEX86 and demonstrate the applicability of the TEX86 proxy to past temperature prediction. The TEX86 database confirms that surface sediment GDGT distribution has a strong relationship to temperature, which accounts for over 70% of the variance in the data. Future efforts, made possible by the data presented here, will seek to identify variables with secondary relationships to GDGT distributions, such as archaeal community composition.
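The paper's calibration models are Bayesian and spatially varying; as a much simpler illustration of the core idea (regressing temperature on the proxy in the prediction direction), an ordinary least-squares calibration on synthetic stand-in data might look like:

```python
import numpy as np

# synthetic stand-ins for the database columns: the real data pair
# core-top TEX86 values with overlying water temperatures
rng = np.random.default_rng(1)
sst = rng.uniform(-2, 30, size=500)
tex86 = 0.3 + 0.015 * sst + rng.normal(0, 0.03, size=500)

# fit SST = a + b * TEX86 for prediction of past temperatures
b, a = np.polyfit(tex86, sst, 1)
pred = a + b * tex86
r2 = 1 - np.sum((sst - pred) ** 2) / np.sum((sst - sst.mean()) ** 2)
print(f"SST ≈ {a:.1f} + {b:.1f} * TEX86, R² = {r2:.2f}")
```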
Scientific Reports | 2016
Michael E. Mann; Stefan Rahmstorf; Byron A. Steinman; Martin P. Tingley; Sonya K. Miller
2014 was nominally the warmest year on record for both the globe and northern hemisphere based on historical records spanning the past one and a half centuries [1,2]. It was the latest in a recent run of record temperatures spanning the past decade and a half. Press accounts reported odds as low as one-in-650 million that the observed run of global temperature records would be expected to occur in the absence of human-caused global warming. Press reports notwithstanding, the question of how likely observed temperature records may have been both with and without human influence is interesting in its own right. Here we attempt to address that question using a semi-empirical approach that combines the latest (CMIP5 [3]) climate model simulations with observations of global and hemispheric mean temperature. We find that individual record years and the observed runs of record-setting temperatures were extremely unlikely to have occurred in the absence of human-caused climate change, though not nearly as unlikely as press reports have suggested. These same record temperatures were, by contrast, quite likely to have occurred in the presence of anthropogenic climate forcing.
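One way to make the forced-versus-unforced comparison concrete is a Monte Carlo count of record-setting years under internal variability alone and under an added trend. The sketch below is illustrative only: the AR(1) noise model, the trend, and the record criterion are stand-ins, not the paper's semi-empirical approach, which uses CMIP5 simulations:

```python
import numpy as np

rng = np.random.default_rng(2)

def record_run_prob(trend_per_yr, n_years=150, n_sims=5000,
                    window=15, n_records=5, rho=0.5, sigma=0.1):
    """Fraction of simulations with at least `n_records` record-setting
    years among the final `window` years. AR(1) internal variability
    plus a linear trend; every parameter value here is illustrative."""
    hits = 0
    for _ in range(n_sims):
        eps = rng.normal(0, sigma, n_years)
        x = np.empty(n_years)
        x[0] = eps[0]
        for t in range(1, n_years):
            x[t] = rho * x[t - 1] + eps[t]
        x += trend_per_yr * np.arange(n_years)
        is_record = x == np.maximum.accumulate(x)
        hits += is_record[-window:].sum() >= n_records
    return hits / n_sims

print("internal variability only:", record_run_prob(0.0))
print("with a warming trend:     ", record_run_prob(0.008))
```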
Journal of Climate | 2014
Peter John Huybers; Karen A. McKinnon; Andrew Rhines; Martin P. Tingley
Variations in extreme daily temperatures are explored in relation to changes in seasonal mean temperature using 1218 high-quality U.S. temperature stations spanning 1900–2012. Extreme temperatures are amplified (or damped) by as much as ±50% relative to changes in average temperature, depending on region, season, and whether daily minimum or maximum temperature is analyzed. The majority of this regional structure in amplification is shown to follow from regional variations in temperature distributions. More specifically, there exists a close relationship between departures from normality and the degree to which extreme changes are amplified relative to the mean. To distinguish between intraseasonal and interannual contributions to nonnormality and amplification, an additional procedure, referred to as z bootstrapping, is introduced that controls for changes in the mean and variance between years. Application of z bootstrapping indicates that amplification of winter extreme variations is generally…
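The listing cuts off before z bootstrapping is fully described; the basic idea as stated, standardizing each year's daily values so that interannual changes in mean and variance are removed before resampling, might be sketched as follows (the details here are guesses, not the paper's exact recipe):

```python
import numpy as np

rng = np.random.default_rng(3)

# daily: (n_years, n_days) daily temperatures for one station and season
# (synthetic stand-in with year-to-year shifts in the mean)
daily = rng.normal(0, 3, size=(100, 90)) + rng.normal(0, 1, size=(100, 1))

# z-transform within each year: removes interannual changes in mean and
# variance, isolating the intraseasonal shape of the distribution
z = (daily - daily.mean(axis=1, keepdims=True)) \
    / daily.std(axis=1, keepdims=True)

# bootstrap over years of z-scores and recompute an extreme quantile
n_boot = 1000
q95 = np.empty(n_boot)
for b in range(n_boot):
    sample = z[rng.integers(0, z.shape[0], z.shape[0])]
    q95[b] = np.quantile(sample, 0.95)
print(np.quantile(q95, [0.05, 0.5, 0.95]))
```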
Journal of Climate | 2012
Martin P. Tingley
Climate datasets with both spatial and temporal components are often studied after removing from each time series a temporal mean calculated over a common reference interval, which is generally shorter than the overall length of the dataset. The use of a short reference interval affects the temporal properties of the variability across the records, by reducing the standard deviation within the reference interval and inflating it elsewhere. For an annually averaged version of the Climatic Research Unit's (CRU) temperature anomaly product, the mean standard deviation is 0.67°C within the 1961–90 reference interval, and 0.81°C elsewhere. The calculation of anomalies can be interpreted in terms of a two-factor analysis of variance model. Within a Bayesian inference framework, any missing values are viewed as additional parameters, and the reference interval is specified as the full length of the dataset. This Bayesian scheme is used to re-express the CRU dataset as anomalies with respect to means calculated over the entire 1850–2009 interval spanned by the dataset. The mean standard deviation is increased to 0.69°C within the original 1961–90 reference interval, and reduced to 0.76°C elsewhere. The choice of reference interval thus has a predictable and demonstrable effect on the second spatial moment time series of the CRU dataset. The spatial mean time series is in this case largely unaffected: the amplitude of spatial mean temperature change is reduced by 0.1°C when using the 1850–2009 reference interval, while the 90% uncertainty interval of (−0.03, 0.23) indicates that the reduction is not statistically significant.
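The variance effect of a short reference interval is easy to reproduce on synthetic data: each record's reference-interval mean contains sampling noise that is positively correlated with the record inside the interval and independent of it outside. A minimal demonstration (all data synthetic):

```python
import numpy as np

rng = np.random.default_rng(4)

# synthetic stand-in: 500 independent grid-cell series, 1850-2009
years = np.arange(1850, 2010)
X = rng.normal(0, 1, size=(500, years.size))
ref = (years >= 1961) & (years <= 1990)

# anomalies w.r.t. the short reference interval vs. the full record
A_short = X - X[:, ref].mean(axis=1, keepdims=True)
A_full = X - X.mean(axis=1, keepdims=True)

for name, A in [("1961-90 ref", A_short), ("1850-2009 ref", A_full)]:
    sd = A.std(axis=0)              # spatial SD at each year
    print(name, "mean SD inside ref: %.3f" % sd[ref].mean(),
          " outside: %.3f" % sd[~ref].mean())
```

With the short reference interval the spatial standard deviation comes out smaller inside 1961–90 than outside, while the full-record reference gives a nearly uniform result, matching the pattern reported in the abstract.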
Journal of Climate | 2012
Martin P. Tingley; Bo Li
In a recent paper, Bo Christiansen presents and discusses "LOC," a methodology for reconstructing past climate that is based on local regressions between climate proxy time series and instrumental time series (Christiansen 2011, hereafter C11). LOC respects two important scientific facts about proxy data that are often overlooked, namely that many proxies are likely influenced by strictly local temperature, and, to reflect causality, the proxies should be written as functions of climate, not vice versa. There are, however, several weaknesses to the LOC method: uncertainty is not propagated through the multiple stages of the analysis, the effects of observational errors in the instrumental record are not considered, and as the proxies become uninformative of climate, the variance of a reconstruction produced by LOC becomes unbounded—a result that is clearly unphysical. These shortcomings can be overcome by interpreting the LOC method in the context of recently proposed Bayesian hierarchical reconstruction methods. Section 2 reviews the basic modeling assumptions underlying LOC and details the shortcomings of this approach. To illustrate one possible solution to the shortcomings of LOC, section 3 presents a Bayesian interpretation of LOC and briefly describes the connections between LOC and two recently published Bayesian reconstruction methods. Section 4 discusses the variance of the reconstructed series in both the original and Bayesian LOC frameworks, and section 5 provides a few concluding remarks.
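The unbounded-variance point can be made explicit with the usual classical-calibration algebra (notation chosen here for illustration; C11's own symbols may differ):

```latex
\begin{align*}
  P_t &= \alpha + \beta\, T_t + \varepsilon_t,
      \qquad \varepsilon_t \sim \mathcal{N}(0, \sigma^2)
      \quad \text{(proxy as a function of local climate)}, \\
  \widehat{T}_t &= \frac{P_t - \widehat{\alpha}}{\widehat{\beta}},
      \qquad
  \operatorname{Var}\!\big(\widehat{T}_t \mid P_t\big)
      \;\approx\; \frac{\sigma^2}{\widehat{\beta}^{\,2}}
      \;\xrightarrow[\ \widehat{\beta}\to 0\ ]{}\; \infty .
\end{align*}
```

As the proxy becomes uninformative the fitted slope tends to zero and the variance of the inverted estimate diverges; in a Bayesian treatment the posterior instead collapses back toward the prior on temperature, which keeps the reconstruction variance bounded.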
The Annals of Applied Statistics | 2014
Luis Barboza; Bo Li; Martin P. Tingley; Frederi G. Viens
…the first linking the latent temperature series to three main external forcings (solar irradiance, greenhouse gas concentration, and volcanism), and the second linking the observed temperature proxy data (tree rings, sediment record, ice cores, etc.) to the unobserved temperature series. Uncertainty is captured with additive noise, and a rigorous statistical investigation of the correlation structure in the regression errors motivates the use of long memory fractional Gaussian noise models for the error terms. We use Bayesian estimation to fit the model parameters and to perform separate reconstructions of land-only and combined land-and-marine temperature anomalies. We quantify the effects of including the forcings and long memory models…
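For reference, the long-memory error model named above, fractional Gaussian noise with Hurst parameter $H$, has the standard autocovariance (a textbook definition, included here for orientation rather than taken from the paper):

```latex
\gamma(k) \;=\; \frac{\sigma^2}{2}
  \left( |k+1|^{2H} - 2\,|k|^{2H} + |k-1|^{2H} \right),
\qquad \tfrac{1}{2} < H < 1 \ \Rightarrow\ \text{long memory},
```

with $\gamma(k)$ decaying so slowly for $H > 1/2$ that the autocovariances are not summable, unlike the short-memory AR-type errors usually assumed in reconstruction regressions.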
Geophysical Research Letters | 2014
Martin P. Tingley; Alexander R. Stine; Peter John Huybers
The fidelity of inferences on volcanic cooling from tree-ring density records has recently come into question, with competing claims that temperature reconstructions based on tree-ring records underestimate cooling due to an increased likelihood of missing rings, or overestimate cooling due to reduced light availability accentuating the response. Here we test these competing hypotheses in the latitudes poleward of 45°N, using the two eruptions occurring between 1850 and 1960 with large-scale Northern Hemisphere climatic effects: Novarupta (1912) and Krakatau (1883). We find that tree-ring densities overestimate post-volcanic cooling with respect to instrumental data (P ≥ 0.99), with larger magnitudes of bias where growth is more limited by light availability (P ≥ 0.95). Using a methodology that allows for direct comparisons with instrumental data, our results confirm that high-latitude tree-ring densities record not only temperature but also variations in light availability.
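A comparison of post-eruption cooling between data types can be sketched as a simple epoch analysis; everything below (the epoch window, the synthetic series, the injected bias) is illustrative rather than the paper's methodology:

```python
import numpy as np

rng = np.random.default_rng(5)

# synthetic stand-ins: annual temperature anomalies, 1850-1960
years = np.arange(1850, 1961)
instrumental = rng.normal(0, 0.2, years.size)
treering = instrumental + rng.normal(0, 0.1, years.size)
treering[np.isin(years, [1884, 1913])] -= 0.3   # fake extra cooling

def post_eruption_cooling(series, years, eruption_yrs, lag=1):
    """Mean anomaly `lag` years after each eruption, relative to the
    mean of the five preceding years (an illustrative epoch scheme)."""
    vals = []
    for e in eruption_yrs:
        pre = series[(years >= e - 5) & (years < e)].mean()
        vals.append(series[years == e + lag][0] - pre)
    return np.mean(vals)

for name, s in [("instrumental", instrumental), ("tree-ring", treering)]:
    print(name, post_eruption_cooling(s, years, [1883, 1912]))
```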