Publication


Featured research published by Samuel S. P. Shen.


Journal of Climate | 2012

Uncertainties, Trends, and Hottest and Coldest Years of U.S. Surface Air Temperature since 1895: An Update Based on the USHCN V2 TOB Data

Samuel S. P. Shen; Christine K. Lee; Jay H. Lawrimore

This paper estimates the sampling error variances of gridded monthly U.S. Historical Climatology Network, version 2 (USHCN V2), time-of-observation-biases (TOB)-adjusted data. The analysis of mean surface air temperature (SAT) assesses uncertainties, trends, and the rankings of the hottest and coldest years for the contiguous United States in the period of 1895–2008. Data from the USHCN stations are aggregated onto a 2.5° × 3.5° latitude-longitude grid by an arithmetic mean of the stations inside a grid box. The sampling error variances of the gridded monthly data are estimated for every month and every grid box with data. The gridded data and their sampling error variances are used to calculate the contiguous U.S. averages and their trends and associated uncertainties. The sampling error variances are smaller (mostly less than 0.2°C²) over the eastern United States, where the station density is greater, and larger (with values of 1.3°C² for some grid boxes in the earlier period) over mountain and coastal areas. In the period of 1895–2008, every month from January to December has a positive linear trend. February has the largest trend of 0.162°C (10 yr)⁻¹, and September has the smallest trend at 0.020°C (10 yr)⁻¹. The three hottest (coldest) years measured by the ...
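
The gridding step described above (an arithmetic mean of stations within each 2.5° × 3.5° box, followed by a linear trend fit) can be illustrated with a minimal sketch. The function names, the toy station data, and the per-decade scaling below are illustrative assumptions, not code from the paper; the paper additionally estimates a sampling error variance for each box and propagates it into the national average, which is omitted here.

```python
# Hypothetical sketch: aggregate station temperatures onto a 2.5° x 3.5°
# grid by the arithmetic mean of stations in each box, then fit a linear
# trend expressed per decade. Not the paper's code.
import numpy as np

def grid_box_means(lats, lons, values, dlat=2.5, dlon=3.5):
    """Arithmetic mean of station values inside each lat-lon grid box."""
    boxes = {}
    for lat, lon, v in zip(lats, lons, values):
        key = (np.floor(lat / dlat), np.floor(lon / dlon))
        boxes.setdefault(key, []).append(v)
    return {key: float(np.mean(vs)) for key, vs in boxes.items()}

def decadal_trend(years, series):
    """Least-squares linear trend expressed per decade (e.g. °C per 10 yr)."""
    slope_per_year = np.polyfit(years, series, 1)[0]
    return 10.0 * slope_per_year

# Example with made-up station data
lats = np.array([32.7, 33.1, 40.2, 41.0])
lons = np.array([-117.2, -116.8, -105.3, -104.9])
temps = np.array([18.2, 17.9, 10.4, 9.8])
print(grid_box_means(lats, lons, temps))

# Example trend for a made-up annual national-average series
years = np.arange(1895, 2009)
us_avg = 10 + 0.008 * (years - 1895) + np.random.default_rng(0).normal(0, 0.4, years.size)
print(decadal_trend(years, us_avg))
```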


Theoretical and Applied Climatology | 2015

Characteristics of the Tibetan Plateau snow cover variations based on daily data during 1997-2011

Samuel S. P. Shen; Ruzhen Yao; Jazlynn Ngo; Alan M. Basist; Neil Thomas; Tandong Yao

This paper studies the characteristics of snow cover variations for the Tibetan Plateau (TP) region (25°-45°N, 65°-105°E). The TP region's daily snow cover data are a subset of the Interactive Multisensor Snow and Ice Mapping System (IMS) 24 km-by-24 km snow cover dataset for the entire Northern Hemisphere. A database of the daily snow cover for the TP region was developed for the period February 4, 1997–March 15, 2012. An animation of the TP snow cover was also made as a data product of this research. The maximum percentage snow cover (67 %) occurred on February 6, 2008 and the minimum (0.5 %) on September 1, 2009. The average snow cover is 16 %. The seasonal cycle of the monthly TP snow cover reaches its maximum in January (about 37 %) and its minimum in August (2 %). The trend of the snow cover reduction is 4.0 % per decade, with a total reduction of 5.7 % from February 4, 1997 to March 15, 2012. The Hilbert–Huang transform and Fourier spectral analyses indicate the existence of a cycle of TP snow cover having a period of 2–3 years. The histogram and higher statistical moment analyses imply that the positive skewness favors more spring snowstorms than spring droughts and that the sharp peakedness at the climatology indicates the snow cover predictability by climate normals.
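
As a rough illustration of the summary statistics quoted above (percentage snow cover, a per-decade trend, and skewness/kurtosis of the daily series), here is a minimal sketch with synthetic data; the synthetic daily series and function names are assumptions, not the authors' processing code.

```python
# Hypothetical sketch of the summary statistics described above: daily
# percentage snow cover from a binary snow mask, a per-decade trend, and
# skewness/kurtosis of the daily series.
import numpy as np
from scipy import stats

def percent_snow_cover(snow_mask):
    """snow_mask: 2-D boolean array (True = snow) for one day over the TP box."""
    return 100.0 * snow_mask.mean()

def trend_per_decade(days_since_start, daily_percent):
    """Linear trend in % per decade from a daily series."""
    slope_per_day = np.polyfit(days_since_start, daily_percent, 1)[0]
    return slope_per_day * 365.25 * 10.0

# Made-up daily series for illustration
rng = np.random.default_rng(0)
days = np.arange(5500)                      # roughly 15 years of daily values
cover = 16 + 20 * np.cos(2 * np.pi * days / 365.25) + rng.normal(0, 5, days.size)
cover = np.clip(cover, 0, 100)

print(trend_per_decade(days, cover))
print(stats.skew(cover), stats.kurtosis(cover))
```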


Advances in Atmospheric Sciences | 2014

Analysis of sampling error uncertainties and trends in maximum and minimum temperatures in China

Wei Hua; Samuel S. P. Shen; Huijun Wang

In this paper we report an analysis of sampling error uncertainties in mean maximum and minimum temperatures (Tmax and Tmin) carried out on monthly, seasonal, and annual scales, including an examination of homogenized and original data collected at 731 meteorological stations across China for the period 1951–2004. Uncertainties of the gridded data and national average, linear trends and their uncertainties, as well as the homogenization effect on uncertainties are assessed. It is shown that the sampling error variances of homogenized Tmax and Tmin, which are larger in winter than in summer, have a marked northwest-southeast gradient distribution, while the sampling error variances of the original data are found to be larger and irregular. Tmax and Tmin increase in all months of the year in the study period 1951–2004, with the largest warming and uncertainties being 0.400 ± 0.269°C (10 yr)⁻¹ and 0.578 ± 0.211°C (10 yr)⁻¹ in February, and the smallest being 0.022 ± 0.085°C (10 yr)⁻¹ and 0.104 ± 0.070°C (10 yr)⁻¹ in August. Homogenization can remove large uncertainties in the original records resulting from various non-natural changes in China.
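
The trend-and-uncertainty pairs quoted above (e.g. 0.400 ± 0.269°C per decade) have the form of a least-squares slope with a standard error. A minimal sketch of that form, using a synthetic annual series, is shown below; it is not the paper's exact estimation procedure, which also accounts for sampling error in the gridded data.

```python
# Hypothetical sketch of a trend-with-uncertainty estimate: an ordinary
# least-squares slope and its standard error, scaled to per-decade units.
import numpy as np
from scipy import stats

def trend_and_uncertainty(years, series):
    """Return (trend, standard error) in units per decade."""
    res = stats.linregress(years, series)
    return 10.0 * res.slope, 10.0 * res.stderr

# Made-up annual Tmax series, 1951-2004
years = np.arange(1951, 2005)
rng = np.random.default_rng(1)
tmax = 0.02 * (years - 1951) + rng.normal(0, 0.3, years.size)

trend, err = trend_and_uncertainty(years, tmax)
print(f"{trend:.3f} ± {err:.3f} °C per decade")
```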


Theoretical and Applied Climatology | 2016

Six temperature and precipitation regimes of the contiguous United States between 1895 and 2010: a statistical inference study

Samuel S. P. Shen; Olaf Wied; Alexander Weithmann; Tobias Regele; Barbara A. Bailey; Jay H. Lawrimore

This paper describes six different temporal climate regimes of the contiguous United States (CONUS) according to interdecadal variations of surface air temperature (SAT) and precipitation using the United States Historical Climatology Network (USHCN) monthly data (Tmax, Tmin, Tmean, and precipitation) from 1895 to 2010. Our analysis is based on the probability distribution, mean, standard deviation, skewness, kurtosis, Kolmogorov-Smirnov (KS) test, and Welch's t test. The relevant statistical parameters are computed from gridded monthly SAT and precipitation data. SAT variations lead to a classification of four regimes: 1895–1930 (cool), 1931–1960 (warm), 1961–1985 (cool), and 1986–2010 (warm), while precipitation variations lead to a classification of two regimes: 1895–1975 (dry) and 1976–2010 (wet). The KS test shows that any two of the above six regimes are statistically significantly different from each other due to clear shifts of the probability density functions. Extremes of SAT and precipitation identify the ten hottest, coldest, driest, and wettest years. Welch's t test is used to discern significant differences among these extremes. The spatial patterns of the six climate regimes and some years of extreme climate are analyzed. Although the two most recent decades are the warmest of all decades since 1895 and many of the hottest years measured by CONUS Tmin and Tmean fall within these two decades, the hottest year according to the CONUS Tmax anomalies is 1934 (1.37 °C), which is very close to the second-hottest Tmax year, 2006 (1.35 °C).
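
Both significance tests named above are standard two-sample tests. A minimal sketch, assuming synthetic anomaly samples for two regimes, shows how the Kolmogorov-Smirnov test and Welch's t test (unequal variances) would be applied with SciPy.

```python
# Hypothetical sketch of the two significance tests named above, applied
# to synthetic regime samples. The regime data here are made up.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
regime_cool = rng.normal(-0.2, 0.8, 360)   # e.g. monthly SAT anomalies, 1961-1985
regime_warm = rng.normal(0.3, 0.8, 300)    # e.g. monthly SAT anomalies, 1986-2010

ks_stat, ks_p = stats.ks_2samp(regime_cool, regime_warm)
t_stat, t_p = stats.ttest_ind(regime_cool, regime_warm, equal_var=False)  # Welch's t test

print(f"KS: D={ks_stat:.3f}, p={ks_p:.4f}")
print(f"Welch t: t={t_stat:.3f}, p={t_p:.4f}")
```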


Advances in Atmospheric Sciences | 2012

An approach to quantify the heat wave strength and price a heat derivative for risk hedging

Samuel S. P. Shen; Benedikt Kramps; Shirley X. Sun; Barbara A. Bailey

Mitigating heat stress via a derivative policy is a vital financial option for agricultural producers and other business sectors to strategically adapt to the climate change scenario. This study provides an approach to identifying heat stress events and pricing the heat stress weather derivative due to persistent days of high surface air temperature (SAT). Cooling degree days (CDD) are used as the weather index for trade. In this study, a call-option model was used as an example for calculating the price of the index. Two heat stress indices were developed to describe the severity and physical impact of heat waves. The daily Global Historical Climatology Network (GHCN-D) SAT data from 1901 to 2007 from southern California, USA, were used. A major California heat wave that occurred 20–25 October 1965 was studied. The derivative price was calculated based on the call-option model for both long-term station data and interpolated grid point data on a regular 0.1° × 0.1° latitude-longitude grid. The resulting comparison indicates that (a) the interpolated data can be used as a reliable proxy to price the CDD and (b) a normal distribution model cannot always be used to reliably calculate the CDD price. In conclusion, the data, models, and procedures described in this study have potential application in hedging agricultural and other risks.
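
A minimal sketch of the CDD index and a call-option style payout on it is given below. The 18°C base temperature, strike, and tick size are illustrative assumptions rather than values from the paper, and an actual price would be an expectation of this payout over a historical or modeled CDD distribution, discounted appropriately.

```python
# Hypothetical sketch of the cooling-degree-day (CDD) index and a simple
# call-option payout on it. Base, strike, and tick are illustrative.
import numpy as np

def cooling_degree_days(daily_mean_temps, base=18.0):
    """Sum of daily exceedances of the mean temperature over the base temperature."""
    return float(np.sum(np.maximum(np.asarray(daily_mean_temps) - base, 0.0)))

def call_option_payout(cdd_index, strike, tick=20.0, cap=None):
    """Pay `tick` currency units per CDD above the strike, optionally capped."""
    payout = tick * max(cdd_index - strike, 0.0)
    return min(payout, cap) if cap is not None else payout

# Made-up daily mean SAT (°C) for a hot spell
temps = [24.1, 27.5, 31.2, 33.8, 32.4, 28.0]
cdd = cooling_degree_days(temps)
print(cdd, call_option_payout(cdd, strike=40.0))
```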


Theoretical and Applied Climatology | 2017

Estimation of sampling error uncertainties in observed surface air temperature change in China

Wei Hua; Samuel S. P. Shen; Alexander Weithmann; Huijun Wang

This study examines the sampling error uncertainties in the monthly surface air temperature (SAT) change in China over recent decades, focusing on the uncertainties of gridded data, national averages, and linear trends. Results indicate that large sampling error variances appear in the station-sparse areas of northern and western China, with the maximum value exceeding 2.0 K², while small sampling error variances are found in the station-dense areas of southern and eastern China, with most grid values being less than 0.05 K². In general, negative temperature anomalies existed in each month prior to the 1980s, and warming began thereafter, accelerating in the early and mid-1990s. An increasing trend in the SAT series was observed for each month of the year, with the largest temperature increase and highest uncertainty of 0.51 ± 0.29 K (10 yr)⁻¹ occurring in February and the weakest trend and smallest uncertainty of 0.13 ± 0.07 K (10 yr)⁻¹ in August. The sampling error uncertainties in the national average annual mean SAT series are not sufficiently large to alter the conclusion of persistent warming in China. In addition, the sampling error uncertainties in the SAT series differ clearly from those obtained with other uncertainty estimation methods, which is a plausible reason for the inconsistent variations between our estimate and other studies during this period.
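
One standard way to turn grid-box sampling error variances into a national-average uncertainty is weighted error propagation under an independence assumption. The sketch below shows that generic formula with made-up boxes; it is not necessarily the paper's exact procedure.

```python
# Hypothetical sketch: propagate grid-box sampling error variances into a
# national-average uncertainty, assuming independent box errors and
# cosine-of-latitude area weights.
import numpy as np

def national_average_uncertainty(box_values, box_variances, box_lats):
    """Weighted mean and its sampling standard error over grid boxes."""
    w = np.cos(np.deg2rad(np.asarray(box_lats)))
    w = w / w.sum()
    mean = np.sum(w * np.asarray(box_values))
    var = np.sum(w**2 * np.asarray(box_variances))   # independence assumed
    return mean, np.sqrt(var)

# Made-up grid boxes: anomaly (K), sampling variance (K^2), latitude (°N)
vals = [0.8, 1.2, 0.4, 0.6]
variances = [0.05, 1.8, 0.03, 0.4]
lats = [25.0, 40.0, 30.0, 45.0]
print(national_average_uncertainty(vals, variances, lats))
```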


Theoretical and Applied Climatology | 2017

Spatiotemporal variations of the twentieth century Tibetan Plateau precipitation based on the monthly 2.5° reconstructed data

Samuel S. P. Shen; Gregori Clarke; Bo-Wen Shen; Tandong Yao

This paper studies the spatiotemporal variations of precipitation over the Tibetan Plateau (TP) region (25°–45° N, 65°–105° E) during the twentieth century, January 1901–December 2000. A long-term (January 1901–December 2009) TP monthly precipitation dataset with 2.5° latitude-longitude resolution is generated in this paper using the spectral optimal gridding (SOG) method. The method uses the Global Precipitation Climatology Center (GPCC) ground station data to anchor the basis of empirical orthogonal functions (EOFs) computed from the Global Precipitation Climatology Project (GPCP) data. Our gridding takes teleconnection into account and uses data from stations both within and outside of the TP region. While an increase of the annual total precipitation at an approximate rate of 2.6 mm per decade exists in the period 1971–2000, no significant increase of TP precipitation from 1901 to 2000 was found. Our rate is less than those of previous publications based only on the TP stations because our data consider the entire TP region, including desert and high-altitude areas. An analysis of extremes and spatiotemporal patterns of our data shows that our reconstructed data can properly quantify the reported disasters of flooding and droughts in India, Bangladesh, and China for the following events: flooding in 1988 and 1998 and drought in 1972. Our time-frequency analysis using the empirical mode decomposition method shows that our nonlinear trend agrees well with the linear trend in the period from 1971 to 2000. The spatiotemporal variation characteristics documented in this paper can help in understanding the influence of atmospheric circulations on TP precipitation and in validating the TP precipitation in climate models.
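
The reconstruction idea above (EOF basis functions from a complete gridded dataset, anchored to sparse station observations) can be sketched generically: compute EOFs by SVD, then fit the leading principal components to the available stations by least squares and reconstruct the full grid. The sketch below uses synthetic data and is a generic EOF reconstruction, not the paper's spectral optimal gridding implementation.

```python
# Hypothetical sketch of an EOF-based reconstruction: EOFs from a complete
# training field (standing in for GPCP) and a least-squares fit of sparse
# "station" anomalies onto the leading EOFs to fill the grid.
import numpy as np

rng = np.random.default_rng(3)
n_time, n_grid, n_modes, n_obs = 120, 200, 5, 30

# Complete training field (time x grid), e.g. satellite-era precipitation anomalies
field = rng.normal(size=(n_time, n_grid))
_, _, vt = np.linalg.svd(field - field.mean(axis=0), full_matrices=False)
eofs = vt[:n_modes]                      # (n_modes x n_grid) spatial patterns

# Sparse observations at a few grid points for one target month
obs_idx = rng.choice(n_grid, size=n_obs, replace=False)
true_pcs = rng.normal(size=n_modes)
obs = true_pcs @ eofs[:, obs_idx] + rng.normal(0, 0.1, n_obs)

# Least-squares fit of PCs to the sparse observations, then reconstruct the grid
pcs, *_ = np.linalg.lstsq(eofs[:, obs_idx].T, obs, rcond=None)
reconstructed = pcs @ eofs               # full gridded anomaly estimate
print(np.corrcoef(reconstructed, true_pcs @ eofs)[0, 1])
```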


Advances in Adaptive Data Analysis | 2015

Hilbert–Huang Transform Approach to Lorenz Signal Separation

Gregori J. Clarke; Samuel S. P. Shen

This study uses the Hilbert–Huang transform (HHT), a signal analysis method for nonlinear and non-stationary processes, to separate signals of varying frequencies in a nonlinear system governed by the Lorenz equations. Similar to the Fourier series expansion, HHT decomposes a data time series into a sum of intrinsic mode functions (IMFs) using empirical mode decomposition (EMD). Unlike an infinite number of Fourier series terms, the EMD always yields a finite number of IMFs, whose sum is equal to the original time series exactly. Using the HHT approach, the properties of the Lorenz attractor are interpreted in a time–frequency frame. This frame shows that: (i) the attractor is symmetric for z (i.e. invariant for z), even though the signs on x and y are changed; (ii) the attractor is sensitive to initial conditions even by a small perturbation, measured by the divergence of the trajectories over time; (iii) the Lorenz system goes through “windows” of chaos and periodicity; and (iv) at times, a system can b...
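
As an illustration of this setting, the sketch below integrates the Lorenz equations and computes an instantaneous amplitude and frequency of the x component via the Hilbert transform, the quantities shown in a Hilbert spectrum. The EMD/sifting step of the full HHT is omitted; a package such as PyEMD would typically supply the IMFs. The classic parameter values and the time span are assumptions, not necessarily those used in the paper.

```python
# Hypothetical sketch: integrate the Lorenz equations and compute an
# instantaneous amplitude and frequency via the Hilbert transform.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.signal import hilbert

def lorenz(t, state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

t_eval = np.linspace(0, 40, 8000)
sol = solve_ivp(lorenz, (0, 40), [1.0, 1.0, 1.0], t_eval=t_eval, rtol=1e-8)
x = sol.y[0]

# Hilbert transform of the (demeaned) x component: instantaneous amplitude
# and frequency, the quantities displayed in a Hilbert spectrum.
analytic = hilbert(x - x.mean())
amplitude = np.abs(analytic)
phase = np.unwrap(np.angle(analytic))
dt = t_eval[1] - t_eval[0]
inst_freq = np.gradient(phase, dt) / (2 * np.pi)
print(amplitude[:3], inst_freq[:3])
```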


Advances in Adaptive Data Analysis | 2013

EOF-MSE ADAPTIVE METHOD TO ASSESS AN ACID DEPOSITION MONITORING NETWORK OVER ALBERTA, CANADA

Samuel S. P. Shen; Markus Bantle; Aaron S. Donahue; Raymond Wong; Christine K. Lee

This study provides an adaptive data analysis method that assesses Alberta's acid deposition monitoring network of 9 stations and the relative importance of each station. The method is based on the assessment of the mean square error (MSE) of sampling expressed in terms of empirical orthogonal functions (EOFs). The annual potential acid input (PAI) data of the 9 stations over Alberta, Canada are used in this study. The patterns of the EOFs and PCs (principal components) are analyzed to reflect the spatial-temporal distribution properties of the PAI. The definition and minimization of the MSE are the basis for our assessment method. The mean PAI field in the period of 1993–2006 and the PAI fields of individual years demonstrate a strong spatial inhomogeneity of the PAI field over Alberta. The PAI level is high in the Red Deer–Calgary–Kananaskis corridor. Our optimal analysis indicates that the 9-station network is, in general, adequate for monitoring the overall PAI in Alberta. The network results in a small root-mean-square-error/standard-deviation ratio (5.6%), which demonstrates the reasonable effectiveness of the network. In the period of 14 years (1993–2006), there were only three years (1993, 1998, and 2002) during which the PAI values were higher than the monitoring load of 0.17 keq H⁺ ha⁻¹ yr⁻¹ at three locations: Red Deer, Calgary, and Kananaskis. According to a station's contribution to the reduction of sampling error, the descending order of importance for the 9 stations is as follows: Beaverlodge, Fort Chipewyan, Suffield, Red Deer, Cold Lake, Kananaskis, Calgary, Fort Vermilion, and Fort McMurray.
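
A much simpler stand-in for the station-importance ranking described above is a leave-one-out comparison: drop one station at a time and measure how much a network-mean estimate of the regional PAI degrades. The sketch below implements that stand-in with made-up series; it is not the paper's EOF-MSE formulation.

```python
# Hypothetical leave-one-out ranking of stations by their contribution to
# the accuracy of a network-mean regional estimate. Not the paper's method.
import numpy as np

def rank_stations(station_series, regional_reference):
    """station_series: dict name -> annual series; regional_reference: annual series."""
    names = list(station_series)
    all_data = np.array([station_series[n] for n in names])
    errors = {}
    for i, name in enumerate(names):
        reduced_mean = np.delete(all_data, i, axis=0).mean(axis=0)
        errors[name] = np.sqrt(np.mean((reduced_mean - regional_reference) ** 2))
    # Larger error without the station -> more important station
    return sorted(errors, key=errors.get, reverse=True)

# Made-up annual PAI series for three stations and a reference regional mean
rng = np.random.default_rng(4)
ref = rng.normal(0.1, 0.03, 14)
series = {name: ref + rng.normal(0, 0.02, 14) for name in ["A", "B", "C"]}
print(rank_stations(series, ref))
```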


Theoretical and Applied Climatology | 2009

Factor analysis for El Niño signals in sea surface temperature and precipitation

Christine K. Lee; Samuel S. P. Shen; Barbara A. Bailey; Gerald R. North

Collaboration


Dive into Samuel S. P. Shen's collaborations.

Top Co-Authors

Barbara A. Bailey, San Diego State University
Christine K. Lee, San Diego State University
Jay H. Lawrimore, National Oceanic and Atmospheric Administration
Huijun Wang, Chinese Academy of Sciences
Tandong Yao, Chinese Academy of Sciences
Wei Hua, Chengdu University of Information Technology
Andreas J. Rupp, San Diego State University
B. Scott Strachan, San Diego State University
Benedikt Kramps, San Diego State University