
Publications

Featured research published by Tucker McElroy.


Econometric Reviews | 2012

A Review of Some Modern Approaches to the Problem of Trend Extraction

Theodore Alexandrov; Silvia Bianconcini; Estela Bee Dagum; Peter Maass; Tucker McElroy

This article presents a review of some modern approaches to trend extraction for one-dimensional time series, which is one of the major tasks of time series analysis. The trend of a time series is usually defined as a smooth additive component which contains information about the time series' global change, and we discuss this and other definitions of the trend. We do not aim to review all the novel approaches, but rather to observe the problem from different viewpoints and from different areas of expertise. The article contributes to understanding the concept of a trend and the problem of its extraction. We present an overview of advantages and disadvantages of the approaches under consideration, which are: the model-based approach (MBA), nonparametric linear filtering, singular spectrum analysis (SSA), and wavelets. The MBA assumes the specification of a stochastic time series model, which is usually either an autoregressive integrated moving average (ARIMA) model or a state space model. The nonparametric filtering methods do not require specification of a model and are popular because of their simplicity in application. We discuss the Henderson, LOESS, and Hodrick-Prescott filters and their versions derived by exploiting the Reproducing Kernel Hilbert Space methodology. In addition to these prominent approaches, we consider SSA and wavelet methods. SSA is widespread in the geosciences; its algorithm is similar to that of principal components analysis, but SSA is applied to time series. Wavelet methods are the de facto standard for denoising in signal processing, and recent work has revealed their potential in trend analysis.
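As a minimal illustration of the nonparametric linear filtering family surveyed above, the sketch below extracts a trend with a centered moving average. The function name, window length, and edge padding are illustrative assumptions, not a construction taken from the review; Henderson and LOESS filters refine this idea with tapered, data-adaptive weights.

```python
import numpy as np

def moving_average_trend(y, window=11):
    """Estimate a trend with a centered moving average -- the simplest
    nonparametric linear filter. `window` should be odd so the filter
    is symmetric; edges are handled by replicating the boundary values."""
    y = np.asarray(y, dtype=float)
    kernel = np.ones(window) / window
    pad = window // 2
    ypad = np.pad(y, pad, mode="edge")     # crude boundary treatment
    return np.convolve(ypad, kernel, mode="valid")

# toy series: linear trend plus noise
rng = np.random.default_rng(0)
t = np.arange(100)
y = 0.5 * t + rng.normal(0, 2, size=100)
trend = moving_average_trend(y, window=11)
```

With an odd window of length `w`, padding by `w // 2` on each side keeps the output the same length as the input.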


Econometric Theory | 2008

MATRIX FORMULAS FOR NONSTATIONARY ARIMA SIGNAL EXTRACTION

Tucker McElroy

The paper provides general matrix formulas for minimum mean squared error signal extraction for a finitely sampled time series whose signal and noise components are nonstationary autoregressive integrated moving average processes. These formulas are quite practical; in addition to being simple to implement on a computer, they make it possible to easily derive important general properties of the signal extraction filters. We also extend these formulas to estimates of future values of the unobserved signal, and we show how this result combines signal extraction and forecasting.
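The stationary special case of minimum mean squared error signal extraction can be sketched in a few lines of NumPy; the paper's contribution is extending such matrix formulas to nonstationary ARIMA components, which this toy example does not attempt. The function name and the AR(1)-signal-plus-white-noise setup are illustrative assumptions.

```python
import numpy as np

def mmse_signal_extraction(y, cov_signal, cov_noise):
    """Finite-sample minimum-MSE estimate of a signal observed in noise,
    y = s + n, with known covariance matrices (stationary sketch):
        s_hat = Sigma_S @ (Sigma_S + Sigma_N)^{-1} @ y
    """
    F = cov_signal + cov_noise
    return cov_signal @ np.linalg.solve(F, y)

# toy example: AR(1)-type signal covariance plus white noise
n = 50
idx = np.arange(n)
cov_signal = 0.8 ** np.abs(idx[:, None] - idx[None, :])  # Toeplitz AR(1) cov
cov_noise = 0.5 * np.eye(n)

rng = np.random.default_rng(1)
s = rng.multivariate_normal(np.zeros(n), cov_signal)
y = s + rng.multivariate_normal(np.zeros(n), cov_noise)
s_hat = mmse_signal_extraction(y, cov_signal, cov_noise)
```

The extracted signal should be closer to the true signal than the raw observation is, since the formula shrinks toward the signal's covariance structure.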


Journal of Time Series Analysis | 2012

Subsampling inference for the mean of heavy-tailed long-memory time series

Agnieszka Jach; Tucker McElroy; Dimitris N. Politis

In this article, we revisit a time series model introduced by McElroy and Politis (2007a) and generalize it in several ways to encompass a wider class of stationary, nonlinear, heavy-tailed time series with long memory. The joint asymptotic distribution for the sample mean and sample variance under the extended model is derived; the associated convergence rates are found to depend crucially on the tail thickness and the long-memory parameter. A self-normalized sample mean that concurrently captures the tail and memory behaviour is defined. Its asymptotic distribution is approximated by subsampling without knowledge of the tail and/or memory parameters; a result of independent interest regarding subsampling consistency for certain long-range dependent processes is provided. The subsampling-based confidence intervals for the process mean are shown to have good empirical coverage rates in a simulation study. The influence of block size on the coverage and the performance of a data-driven rule for block size selection are assessed. The methodology is further applied to the series of packet counts from Ethernet traffic traces.
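The basic subsampling recipe behind such confidence intervals can be sketched as follows. This is the textbook version with a sqrt(n) normalization, appropriate for weakly dependent, light-tailed data; the paper's self-normalization is what extends the idea to heavy tails and long memory. All names and the block-size choice are illustrative assumptions.

```python
import numpy as np

def subsampling_ci(x, block_size, level=0.95):
    """Equal-tailed confidence interval for the process mean via
    subsampling: compute sqrt(b) * (block mean - overall mean) over all
    overlapping blocks, take its quantiles, and invert at rate sqrt(n)."""
    x = np.asarray(x, dtype=float)
    n, b = len(x), block_size
    xbar = x.mean()
    block_means = np.array([x[i:i + b].mean() for i in range(n - b + 1)])
    stats = np.sqrt(b) * (block_means - xbar)
    lo_q, hi_q = np.quantile(stats, [(1 - level) / 2, (1 + level) / 2])
    # invert: xbar - hi_q/sqrt(n) <= mu <= xbar - lo_q/sqrt(n)
    return xbar - hi_q / np.sqrt(n), xbar - lo_q / np.sqrt(n)

rng = np.random.default_rng(2)
x = rng.normal(loc=3.0, scale=1.0, size=2000)   # i.i.d. toy data
lo, hi = subsampling_ci(x, block_size=50)
```

The key practical issue noted in the abstract, the choice of block size `b`, is simply fixed by hand here.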


Econometrics Journal | 2008

Exact Formulas for the Hodrick-Prescott Filter

Tucker McElroy

The Hodrick-Prescott (HP) filter is widely used in economics to estimate trends and cycles from time series data. For certain applications, such as deriving implied trend and cycle models and obtaining filter weights, it is desirable to express the frequency response of the HP filter as the spectral density of an ARMA model; in other words, to accomplish the spectral factorization of the HP filter. This paper presents an exact approach to this problem, which makes it possible to provide exact algebraic formulas for the HP filter coefficients in terms of the HP filter's signal-to-noise ratio.
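For context, the HP filter also has a well-known penalized least-squares representation, which gives a direct way to compute the trend in finite samples. This is standard background, not the paper's spectral-factorization formulas; the function name and the smoothing parameter value are illustrative.

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    """Hodrick-Prescott trend via its penalized least-squares form:
    minimize ||y - t||^2 + lam * ||D2 t||^2, where D2 takes second
    differences; the solution is t = (I + lam * D2'D2)^{-1} y."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    D2 = np.diff(np.eye(n), n=2, axis=0)   # (n-2) x n second-difference matrix
    A = np.eye(n) + lam * (D2.T @ D2)
    trend = np.linalg.solve(A, y)
    return trend, y - trend                # trend and cycle components

t = np.arange(120, dtype=float)
y = 0.1 * t + np.sin(2 * np.pi * t / 12)   # linear trend plus a period-12 cycle
trend, cycle = hp_filter(y, lam=1600.0)
```

The penalty parameter `lam` (1600 is the conventional quarterly choice) is the signal-to-noise ratio that the paper's exact formulas are expressed in terms of.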


Bayesian Analysis | 2009

A Bayesian Approach to Estimating the Long Memory Parameter

Scott H. Holan; Tucker McElroy; Sounak Chakraborty

We develop a Bayesian procedure for analyzing stationary long-range dependent processes. Specifically, we consider the fractional exponential model (FEXP) to estimate the memory parameter of a stationary long-memory Gaussian time series. Further, the method we propose is hierarchical and integrates over all possible models, thus reducing underestimation of uncertainty at the model-selection stage. Additionally, we establish Bayesian consistency of the memory parameter under mild conditions on the data process. Finally, the suggested procedure is investigated on simulated and real data.
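As background, a FEXP process is specified through its spectral density, which factors a long-memory pole at frequency zero from a short-memory part with log-spectrum given by a cosine expansion. The sketch below uses one common parameterization; sign and normalization conventions vary across the literature, and this is not the paper's estimation procedure.

```python
import numpy as np

def fexp_spectrum(lam, d, theta):
    """Spectral density of a FEXP (fractional exponential) process,
    f(lam) = |2 sin(lam/2)|^(-2d) * exp(sum_j theta_j cos(j*lam)),
    where d is the long-memory parameter and theta collects the
    short-memory coefficients (illustrative parameterization)."""
    lam = np.asarray(lam, dtype=float)
    short = np.exp(sum(th * np.cos((j + 1) * lam) for j, th in enumerate(theta)))
    return np.abs(2 * np.sin(lam / 2)) ** (-2 * d) * short

lam = np.linspace(0.1, 3.0, 200)
f = fexp_spectrum(lam, d=0.3, theta=[0.5, -0.2])
```

For d > 0 the density diverges as the frequency approaches zero, which is the spectral signature of long memory that the Bayesian procedure targets.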


Econometric Theory | 2002

ROBUST INFERENCE FOR THE MEAN IN THE PRESENCE OF SERIAL CORRELATION AND HEAVY-TAILED DISTRIBUTIONS

Tucker McElroy; Dimitris N. Politis

The problem of statistical inference for the mean of a time series with possibly heavy tails is considered. We first show that the self-normalized sample mean has a well-defined asymptotic distribution. Subsampling theory is then used to develop asymptotically correct confidence intervals for the mean without knowledge (or explicit estimation) of either the dependence characteristics or the tail index. Using a symmetrization technique, we also construct a distribution estimator that combines robustness and accuracy: it is higher-order accurate in the regular case, while remaining consistent in the heavy-tailed case. Some finite-sample simulations confirm the practicality of the proposed methods.
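The core idea of self-normalization is that dividing the centered sample sum by a data-driven scale makes the unknown (possibly infinite-variance) scale cancel. The sketch below is an illustrative form of such a statistic, not necessarily the paper's exact definition.

```python
import numpy as np

def self_normalized_mean_stat(x, mu0=0.0):
    """Self-normalized statistic for testing mean = mu0: the centered
    sample sum divided by the root of the centered sum of squares.
    Rescaling the data leaves the statistic unchanged, which is why a
    nondegenerate limit exists even with infinite variance."""
    x = np.asarray(x, dtype=float)
    return (x - mu0).sum() / np.sqrt(((x - mu0) ** 2).sum())

rng = np.random.default_rng(5)
x = rng.standard_t(df=1.5, size=1000)   # heavy-tailed toy data (infinite variance)
s1 = self_normalized_mean_stat(x)
```

By the Cauchy-Schwarz inequality the statistic is bounded by sqrt(n) in absolute value, and it is exactly invariant to multiplying the centered data by a constant.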


Annals of Statistics | 2007

COMPUTER-INTENSIVE RATE ESTIMATION, DIVERGING STATISTICS AND SCANNING

Tucker McElroy; Dimitris N. Politis

A general rate estimation method is proposed that is based on studying the in-sample evolution of appropriately chosen diverging/converging statistics. The proposed rate estimators are based on simple least squares arguments, and are shown to be accurate in a very general setting without requiring the choice of a tuning parameter. The notion of scanning is introduced with the purpose of extracting useful subsamples of the data series; the proposed rate estimation method is applied to different scans, and the resulting estimators are then combined to improve accuracy. Applications to heavy-tail index estimation as well as to the problem of estimating the long memory parameter are discussed; a small simulation study complements our theoretical results.
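The in-sample-evolution idea can be sketched concretely: if a statistic computed on the first k observations behaves like C * k^beta, then regressing its logarithm on log k recovers beta by least squares. This is an illustrative version of the approach (with an arbitrary starting index rather than the paper's scanning scheme); all names are assumptions.

```python
import numpy as np

def estimate_rate(x, stat):
    """Estimate the divergence rate beta of a statistic S_k ~ C * k^beta
    by least-squares regression of log S_k on log k, following the
    statistic's evolution as the sample grows."""
    n = len(x)
    ks = np.arange(10, n + 1)                 # skip very small k for stability
    s = np.array([stat(x[:k]) for k in ks])
    slope, _intercept = np.polyfit(np.log(ks), np.log(s), 1)
    return slope

rng = np.random.default_rng(3)
x = rng.normal(size=5000)
# the sum of squares of k i.i.d. terms grows at rate k (beta = 1)
beta = estimate_rate(x, stat=lambda v: (v ** 2).sum())
```

No tuning parameter beyond the starting index is needed, which is the feature the abstract emphasizes.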


Journal of Multivariate Analysis | 2009

A local spectral approach for assessing time series model misspecification

Tucker McElroy; Scott H. Holan

We consider band-limited frequency-domain goodness-of-fit testing for stationary time series, without smoothing or tapering the periodogram, while taking into account the effects of parameter uncertainty (from maximum-likelihood estimation). We are principally interested in modeling short econometric time series, typically with 100 to 150 observations, for which data-driven bandwidth selection procedures for kernel-smoothed spectral density estimates are unlikely to have adequate levels. Our mathematical results take parameter uncertainty directly into account, allowing us to obtain adequate level properties at small sample sizes. The main theorems provide very general results involving joint normality for linear functionals of powers of the periodogram, while accounting for parameter uncertainty, which can be used to determine the level and power of a wide array of statistics. We discuss several applications, such as spectral peak testing and testing for the inclusion of an Unobserved Component, and illustrate our methods on a time series from the Energy Information Administration.
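The raw ingredient of such band-limited tests is the unsmoothed, untapered periodogram at the Fourier frequencies. The sketch below computes it and a simple band-limited mean; the paper's actual statistics are linear functionals of powers of the periodogram with parameter uncertainty accounted for, which this example does not reproduce. The normalization convention is an assumption.

```python
import numpy as np

def periodogram(x):
    """Raw periodogram of a demeaned series at the Fourier frequencies
    2*pi*j/n, j = 1..floor(n/2), normalized so that for white noise with
    variance sigma^2 the ordinates average about sigma^2 / (2*pi)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    dft = np.fft.rfft(x - x.mean())
    I = (np.abs(dft) ** 2) / (2 * np.pi * n)
    freqs = 2 * np.pi * np.arange(len(I)) / n
    return freqs[1:], I[1:]                   # drop frequency zero

rng = np.random.default_rng(4)
x = rng.normal(size=512)                      # white noise: flat spectrum 1/(2*pi)
freqs, I = periodogram(x)
# a band-limited linear functional: the mean ordinate over low frequencies
band = freqs < np.pi / 4
stat = I[band].mean()
```

Restricting to a band, rather than smoothing across all frequencies, is what avoids the bandwidth-selection problem the abstract highlights for short series.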


Communications in Statistics-theory and Methods | 2008

Statistical Properties of Model-Based Signal Extraction Diagnostic Tests

Tucker McElroy

A model-based diagnostic test for signal extraction was first described in Maravall (2003), and this basic idea was modified and studied in Findley et al. (2004). This paper improves on the latter work in two ways: central limit theorems for the diagnostics are developed, and two hypothesis-testing paradigms for practical use are explicitly described. A further modified diagnostic provides an interpretation of one-sided rejection of the null hypothesis, yielding general notions of “over-modeling” and “under-modeling.” The new diagnostics are demonstrated on two U.S. Census Bureau time series exhibiting seasonality.


Journal of Time Series Analysis | 2012

Subsampling inference for the autocovariances and autocorrelations of long-memory heavy-tailed linear time series

Tucker McElroy; Agnieszka Jach

We provide a self-normalization for the sample autocovariances and autocorrelations of a linear, long-memory time series with innovations that have either finite fourth moment or are heavy-tailed with tail index α between 2 and 4. The asymptotic convergence rates depend on the interplay of the memory parameter d and the tail index α, and consequently lead to three different limit distributions; for the sample autocorrelation the limit distribution only depends on d. We introduce a self-normalized sample autocovariance statistic, which is computable without knowledge of α or d (or their relationship), and which converges to a non-degenerate distribution. We also treat self-normalization of the autocorrelations. The sampling distributions can then be approximated non-parametrically by subsampling, as the corresponding asymptotic distribution is still parameter-dependent. The subsampling-based confidence intervals for the process autocovariances and autocorrelations are shown to have satisfactory empirical coverage rates in a simulation study. The impact of subsampling block size on the coverage is assessed. The methodology is further applied to the log-squared returns of Merck stock.

Collaboration


Dive into Tucker McElroy's collaborations.

Top co-authors:

David F. Findley
United States Census Bureau

Marc Wildi
University of St. Gallen

Anindya Roy
University of Maryland

Osbert Pang
United States Census Bureau

Chris Blakely
United States Census Bureau