Estela Bee Dagum
University of Bologna
Publications
Featured research published by Estela Bee Dagum.
Econometric Reviews | 2012
Theodore Alexandrov; Silvia Bianconcini; Estela Bee Dagum; Peter Maass; Tucker McElroy
This article presents a review of some modern approaches to trend extraction for one-dimensional time series, which is one of the major tasks of time series analysis. The trend of a time series is usually defined as a smooth additive component which contains information about the global change of the time series, and we discuss this and other definitions of the trend. We do not aim to review all the novel approaches, but rather to observe the problem from different viewpoints and from different areas of expertise. The article contributes to understanding the concept of a trend and the problem of its extraction. We present an overview of advantages and disadvantages of the approaches under consideration, which are: the model-based approach (MBA), nonparametric linear filtering, singular spectrum analysis (SSA), and wavelets. The MBA assumes the specification of a stochastic time series model, which is usually either an autoregressive integrated moving average (ARIMA) model or a state space model. The nonparametric filtering methods do not require specification of a model and are popular because of their simplicity in application. We discuss the Henderson, LOESS, and Hodrick–Prescott filters and their versions derived by exploiting the Reproducing Kernel Hilbert Space methodology. In addition to these prominent approaches, we consider SSA and wavelet methods. SSA is widespread in the geosciences; its algorithm is similar to that of principal components analysis, but SSA is applied to time series. Wavelet methods are the de facto standard for denoising in signal processing, and recent work has revealed their potential in trend analysis.
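As a minimal illustration of the nonparametric filtering idea mentioned in the abstract, the Hodrick–Prescott trend can be computed directly as the solution of a penalized least-squares problem. This is a numpy sketch of the textbook formulation, not any of the authors' implementations; the function name `hp_trend` and the default penalty `lamb=1600` (the conventional quarterly value) are our choices:

```python
import numpy as np

def hp_trend(y, lamb=1600.0):
    """Hodrick-Prescott trend: solve (I + lamb * D'D) tau = y,
    where D is the (n-2) x n second-difference matrix."""
    n = len(y)
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i], D[i, i + 1], D[i, i + 2] = 1.0, -2.0, 1.0
    return np.linalg.solve(np.eye(n) + lamb * D.T @ D, y)

# A linear series is its own HP trend: its second differences are
# zero, so the penalty term vanishes at tau = y.
t = np.arange(20, dtype=float)
assert np.allclose(hp_trend(t), t)
```

The dense solve is fine for short series; production code would exploit the banded structure of `D'D`.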
The Statistician | 1978
Estela Bee Dagum
The majority of seasonal adjustment methods, officially adopted by government statistical agencies, belong to the category of techniques based on linear smoothing filters, usually known as moving averages of length 2m+1, say. These methods have often been criticized because they lack an explicit model concerning the decomposition of the original series and because their estimates, for observations in the most recent years, do not have the same degree of reliability as those of central observations.
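A symmetric (2m+1)-term moving average of the kind the abstract refers to can be sketched in a few lines; note that only the central n−2m observations can be smoothed with the full symmetric weights, which is exactly why estimates for the most recent years are less reliable. Equal weights are used here purely for illustration:

```python
import numpy as np

def centered_ma(y, m):
    """Symmetric (2m+1)-term moving average with equal weights.
    'valid' mode keeps only the n - 2m central observations, where
    the full symmetric filter can be applied."""
    w = np.full(2 * m + 1, 1.0 / (2 * m + 1))
    return np.convolve(y, w, mode="valid")

y = np.arange(10.0)          # a linear series
s = centered_ma(y, m=2)      # 5-term moving average
# A linear series passes through an equal-weight symmetric filter
# unchanged on the central span; the 2m end points are lost.
assert np.allclose(s, y[2:-2])
```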
International Statistical Review | 1994
Pierre A. Cholette; Estela Bee Dagum
Summary The Denton method is widely used by statistical agencies to benchmark time series (i.e. to adjust them to annual benchmarks). This method does not take into account: (a) the presence of bias, and (b) the presence of autocorrelation and heteroscedasticity in the survey errors. This paper introduces a regression benchmarking method which incorporates (a) and (b) and calculates the relative efficiency with respect to the Denton method, under the assumption of various types of ARMA processes for the survey errors. The results are illustrated with the Canadian Retail Trade series.
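To make the notion of benchmarking concrete, here is a crude pro-rata adjustment that scales each year's sub-annual values so they sum to the annual benchmark. This is emphatically not the Denton method (which minimizes the distortion of period-to-period movement) nor the paper's regression method; it only illustrates what "adjusting a series to annual benchmarks" means. The function name `prorate` is our own:

```python
import numpy as np

def prorate(sub, bench, per=4):
    """Pro-rata benchmarking: scale each year's sub-annual values so
    they sum to the annual benchmark. Unlike the Denton method, this
    ignores movement preservation and creates a step between years."""
    sub = np.asarray(sub, float).reshape(-1, per)
    factors = np.asarray(bench, float) / sub.sum(axis=1)
    return (sub * factors[:, None]).ravel()

q = [10., 10., 10., 10., 12., 12., 12., 12.]   # quarterly survey data
a = [44., 52.]                                  # annual benchmarks
b = prorate(q, a)
assert np.allclose(b.reshape(-1, 4).sum(axis=1), a)
```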
Journal of Business & Economic Statistics | 2008
Estela Bee Dagum; Silvia Bianconcini
The Henderson smoother has been traditionally applied for trend-cycle estimation in the context of nonparametric seasonal adjustment software officially adopted by statistical agencies. This study introduces a Henderson third-order kernel representation by means of the reproducing kernel Hilbert space (RKHS) methodology. Two density functions and corresponding orthonormal polynomials have been calculated. Both are shown to give excellent representations for short- and medium-length filters. Theoretical and empirical comparisons of the Henderson third-order kernel asymmetric filters are made with the classical ones. The former are shown to be superior in terms of signal passing, noise suppression, and revision size.
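The symmetric Henderson weights discussed in the abstract have a standard closed-form expression; the sketch below computes them for an odd filter length 2p+1 with n = p + 2, and checks the well-known 13-term central weight 0.24006. This is the classical formula, not the paper's RKHS kernel representation:

```python
import numpy as np

def henderson_weights(length):
    """Symmetric Henderson filter weights for an odd filter length
    2p+1, from the standard closed-form expression with n = p + 2."""
    p = length // 2
    n = p + 2
    j = np.arange(-p, p + 1)
    num = 315 * ((n - 1) ** 2 - j ** 2) * (n ** 2 - j ** 2) \
        * ((n + 1) ** 2 - j ** 2) * (3 * n ** 2 - 16 - 11 * j ** 2)
    den = 8 * n * (n ** 2 - 1) * (4 * n ** 2 - 1) \
        * (4 * n ** 2 - 9) * (4 * n ** 2 - 25)
    return num / den

w = henderson_weights(13)
assert np.isclose(w.sum(), 1.0)               # weights sum to one
assert np.isclose(w[6], 0.24006, atol=1e-4)   # published central weight
```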
Journal of Business & Economic Statistics | 1987
Estela Bee Dagum; Normand Laniel
One of the main purposes of the seasonal adjustment of economic time series is to provide information on current economic conditions, particularly to determine the state of the cycle at which the economy stands. Since seasonal adjustment means removing seasonal variations, thus leaving a seasonally adjusted series consisting of trend cycle together with the irregular fluctuations, it is often very difficult to detect cyclical turning points for series strongly contaminated with irregulars. In such cases, it may be preferable to smooth the seasonally adjusted series using trend-cycle filters, which suppress as much as possible the irregulars without affecting the cyclical component. It is inherent, however, in any moving average procedure that the first and last n points of an original series cannot be smoothed with the same symmetric filters applied to middle values. The current and most recent years of data are smoothed by asymmetric filters, which change for each point in time. Hence, as new information...
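The end-point problem described above can be illustrated with a naive asymmetric scheme: truncate the symmetric weights to the available observations and renormalize, so that every point, including the most recent ones, receives an estimate. This truncate-and-renormalize rule is only a stand-in; it is not the asymmetric filters studied by the authors:

```python
import numpy as np

def smooth_with_ends(y, w):
    """Apply symmetric weights w (odd length) to central points, and
    truncated-and-renormalized asymmetric weights near the ends --
    a naive stand-in for filters that change at each end point."""
    m = len(w) // 2
    n = len(y)
    out = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - m), min(n, i + m + 1)
        wi = w[m - (i - lo): m + (hi - i)]
        out[i] = np.dot(wi / wi.sum(), y[lo:hi])
    return out

w = np.full(5, 0.2)                   # 5-term symmetric filter
y = np.arange(8.0)
s = smooth_with_ends(y, w)
assert len(s) == len(y)               # every point is smoothed
assert np.allclose(s[2:-2], y[2:-2])  # central points: symmetric filter
```

As new observations arrive, the effective weights at a given point change, which is precisely why recent estimates are revised.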
Journal of the American Statistical Association | 1982
Estela Bee Dagum
Abstract Recent data seasonally adjusted by moving average methods are subject to revisions due to differences in the properties of the linear filters for the same seasonal adjustment when later data become available. This article introduces a measure of the total revision associated with the concurrent and forecasting seasonal filters and applies it to the Census Method II-X-11 variant and the X-11-ARIMA method. To consider the fact that the spectrum of many economic indicators tends to have higher peaks at the lower seasonal frequencies than at the higher, the revision measures are calculated both over all the seasonal frequencies and over selected seasonal frequency intervals.
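A time-domain caricature of the revision idea: compare the concurrent estimate (one-sided weights, data up to time t only) with the final estimate (full symmetric filter, once later data are available) and summarize the differences. The article's actual measure is computed in the frequency domain over seasonal frequency intervals; the filter, renormalization rule, and mean-squared summary below are all our simplifications:

```python
import numpy as np

w = np.full(5, 0.2)      # stand-in symmetric 5-term filter
m = len(w) // 2

def concurrent(y, t):
    """Estimate at time t using data up to t only:
    one-sided, renormalized weights."""
    wi = w[: m + 1]
    return np.dot(wi / wi.sum(), y[t - m: t + 1])

def final(y, t):
    """Estimate at time t with the full symmetric filter."""
    return np.dot(w, y[t - m: t + m + 1])

rng = np.random.default_rng(0)
y = np.cumsum(rng.standard_normal(100))
ts = np.arange(m, 100 - m)
rev = np.array([final(y, t) - concurrent(y, t) for t in ts])
mse_revision = np.mean(rev ** 2)   # a simple total-revision summary
assert mse_revision > 0.0
```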
Econometric Reviews | 2008
Elena Rusticelli; Richard A. Ashley; Estela Bee Dagum; Douglas M. Patterson
Nonconstancy of the bispectrum of a time series has been taken as a measure of non-Gaussianity and nonlinear serial dependence in a stochastic process by Subba Rao and Gabr (1980) and by Hinich (1982), leading to Hinich's statistical test of the null hypothesis of a linear generating mechanism for a time series. Hinich's test has the advantage of focusing directly on nonlinear serial dependence—in contrast to subsequent approaches, which actually test for serial dependence of any kind (nonlinear or linear) on data which have been pre-whitened. The Hinich test tends to have low power, however, and (in common with most statistical procedures in the frequency domain) requires the specification of a smoothing or window-width parameter. In this article, we develop a modification of the Hinich bispectral test which substantially ameliorates both of these problems by the simple expedient of maximizing the test statistic over the feasible values of the smoothing parameter. Monte Carlo simulation results are presented indicating that the new test is well sized and has substantially larger power than the original Hinich test against a number of relevant alternatives; the simulations also indicate that the new test preserves the Hinich test's robustness to misspecifications in the identification of a pre-whitening model.
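The "maximize over the smoothing parameter" device generalizes beyond the bispectrum. The sketch below shows the skeleton with a toy smoothed-periodogram statistic standing in for the bispectral statistic; both `toy_stat` and the window grid are our inventions, and a real application must recalibrate the null distribution (e.g. by simulation) because maximization changes it:

```python
import numpy as np

def maximized_test(x, stat, windows):
    """Maximize a frequency-domain test statistic over feasible
    smoothing-parameter values, the device used by the modified
    test described above."""
    return max(stat(x, M) for M in windows)

def toy_stat(x, M):
    # Smoothed-periodogram peak-to-mean ratio; purely illustrative,
    # NOT the Hinich bispectral statistic.
    p = np.abs(np.fft.rfft(x)) ** 2
    sm = np.convolve(p, np.ones(M) / M, mode="same")
    return sm.max() / sm.mean()

rng = np.random.default_rng(1)
x = rng.standard_normal(256)
T = maximized_test(x, toy_stat, windows=range(3, 32, 4))
assert T >= toy_stat(x, 3)   # maximizing can only raise the statistic
```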
Linear Algebra and its Applications | 2004
Estela Bee Dagum; Alessandra Luati
Abstract The main purpose of this paper is to introduce a linear transformation, called t, and to derive its algebraic properties by means of permutation matrices that represent it. To demonstrate the importance of the t-transformation for the estimation of latent variables in time series decomposition, we obtain a general expression for smoothing matrices characterized by symmetric and asymmetric weighting systems. We show that the submatrix of the symmetric weights (to be applied to central observations) is t-invariant whereas the submatrices of the asymmetric weights (to be applied to initial and final observations) are the t-transform of each other. By virtue of this relation, the properties of the t-transformation provide useful information on the smoothing of time series data. Finally, we illustrate the role of the t-transformation on the weighting systems of several smoothers often applied for trend-cycle estimation, such as the locally weighted regression smoother (loess), the cubic smoothing spline, the Gaussian kernel and the 13-term trend-cycle Henderson filter.
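Assuming the t-transformation acts like the time-reversal (exchange) permutation suggested by the abstract, the invariance it describes can be seen on a toy smoothing matrix: conjugating by the anti-identity leaves the matrix unchanged, the central symmetric rows are fixed, and the initial-weight rows map onto the final-weight rows. The 3-term smoother below is our illustration, not the paper's construction:

```python
import numpy as np

# 5x5 smoothing matrix of a 3-term equal-weight filter with
# truncated-and-renormalized end weights (illustrative smoother).
S = np.array([
    [1/2, 1/2, 0,   0,   0  ],
    [1/3, 1/3, 1/3, 0,   0  ],
    [0,   1/3, 1/3, 1/3, 0  ],
    [0,   0,   1/3, 1/3, 1/3],
    [0,   0,   0,   1/2, 1/2],
])

# Exchange (anti-identity) permutation: reverses the time ordering.
E = np.fliplr(np.eye(5))

# Conjugation by E reverses rows and columns, so the first-row
# (initial) weights land on the last-row (final) weights and the
# central symmetric rows are invariant: E S E = S.
assert np.allclose(E @ S @ E, S)
assert np.allclose((E @ S @ E)[0], S[0])
```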
Journal of the American Statistical Association | 1997
Zhao-Guo Chen; Pierre A. Cholette; Estela Bee Dagum
Abstract This article introduces a nonparametric method to estimate the covariance matrix for the stationary part of the signal (hidden in data), to enable benchmarking via signal extraction. Some discussions and simulations are carried out to compare the proposed benchmarking method to the regression method developed by Cholette and Dagum and the signal extraction method developed by Hillmer and Trabelsi, which assumes autoregressive integrated moving average (ARIMA) models for the signal. The results show that the nonparametric method is feasible, robust, and almost as efficient as the signal extraction method when the true model for the signal is known.
Journal of Econometrics | 1993
Estela Bee Dagum; Benoit Quenneville
Abstract This paper describes a general state space approach for the modelling of the unobserved components of a time series: trend-cycle, seasonality, trading-day variations, and irregulars, together with the calculation of the mean square errors of the estimated unobserved components. The unobserved components models are presented in a state space form. The Kalman filter and fixed interval smoother are applied to the observed series to obtain the estimates of the unobserved components and their corresponding variances. Implementation problems related to the estimation of the initial conditions and the initial values for the variances of both the observation noise and the noise processes of the unobserved components are solved using the estimates of the unobserved components from X-11-ARIMA [Dagum (1980)]. The estimation of the signal-to-noise ratio is made by maximum likelihood using the method of scoring. The MLE of the noise variance is obtained analytically, conditional on the estimates of the signal-to-noise ratios. Statistical tests to distinguish between alternative models are also provided.
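A minimal instance of the state space machinery is the local-level model, a one-component stand-in for the richer trend/seasonal/trading-day decomposition of the paper. The Kalman recursions below are standard; the variance values are chosen to match the simulated data, whereas the paper estimates such signal-to-noise ratios by maximum likelihood:

```python
import numpy as np

def kalman_local_level(y, sig_eps=1.0, sig_eta=0.01):
    """Kalman filter for the local-level model
       y_t = mu_t + eps_t,   mu_t = mu_{t-1} + eta_t,
    with observation-noise variance sig_eps and level-noise
    variance sig_eta. Returns the filtered level estimates."""
    n = len(y)
    mu, P = y[0], 1e6            # diffuse-style initialization
    out = np.empty(n)
    for t in range(n):
        P = P + sig_eta           # predict
        K = P / (P + sig_eps)     # Kalman gain
        mu = mu + K * (y[t] - mu) # update
        P = (1 - K) * P
        out[t] = mu
    return out

rng = np.random.default_rng(2)
level = np.cumsum(0.1 * rng.standard_normal(200))   # true hidden level
y = level + rng.standard_normal(200)                # noisy observations
mu_hat = kalman_local_level(y)
# Filtering reduces the noise: the estimate tracks the hidden level
# better than the raw observations do.
assert np.mean((mu_hat - level) ** 2) < np.mean((y - level) ** 2)
```

A fixed-interval smoother would run a second, backward pass to use the full sample at every date, as the paper does.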