Statistical Papers | 2019

Robust second-order least-squares estimation for regression models with autoregressive errors

 
 

Abstract


Rosadi and Peiris (Comput Stat 29:931–943, 2014) applied the second-order least squares estimator (SLS), proposed in Wang and Leblanc (Ann Inst Stat Math 60:883–900, 2008), to regression models with autoregressive errors. In the case of autocorrelated errors, the SLS was shown to estimate the model parameters well and with small bias. For weakly correlated data, the standard error (SE) of the SLS lies between the SE of the ordinary least squares (OLS) estimator and that of the generalized least squares estimator; for strongly correlated data, however, the SLS has a higher SE than the OLS estimator. For a regression model with iid errors, Chen, Tsao and Zhou (Stat Pap 53:371–386, 2012) proposed a method to improve the robustness of the SLS against X-outliers. In this paper, we consider a new robust second-order least squares estimator (RSLS) that extends the study of Chen et al. (2012) to the case of regression with autoregressive errors, where the data may be contaminated with all types of outliers (X-, y-, and innovation outliers). Besides the regression coefficients, we also propose a robust method to estimate the parameters of the autoregressive errors and the variance of the errors. We evaluate the performance of the RSLS by means of simulation studies covering both a linear and a nonlinear regression model. The results show that the RSLS performs very well. We also provide guidelines for using the RSLS in practice and present a real-data example.

Volume 60
Pages 105-122
DOI 10.1007/s00362-016-0829-9
Language English
Journal Statistical Papers
