Statistical Papers, Volume 60, Issue 1, pp 105–122

Robust second-order least-squares estimation for regression models with autoregressive errors

  • D. Rosadi
  • P. Filzmoser
Regular Article


Rosadi and Peiris (Comput Stat 29:931–943, 2014) applied the second-order least squares estimator (SLS), proposed by Wang and Leblanc (Ann Inst Stat Math 60:883–900, 2008), to regression models with autoregressive errors. They showed that, in the presence of autocorrelated errors, the SLS estimates the model parameters well and gives small bias. For weakly correlated data, the standard error (SE) of the SLS lies between those of the ordinary least squares (OLS) and generalized least squares estimators; for strongly correlated data, however, the SLS has a higher SE than the OLS estimator. For regression models with iid errors, Chen, Tsao and Zhou (Stat Pap 53:371–386, 2012) proposed a method to improve the robustness of the SLS against X-outliers. In this paper, we propose a new robust second-order least squares estimator (RSLS) that extends the approach of Chen et al. (2012) to regression with autoregressive errors, where the data may be contaminated with all types of outliers (X-, y-, and innovation outliers). Besides the regression coefficients, we also propose robust estimators for the parameters of the autoregressive errors and the variance of the errors. We evaluate the performance of the RSLS by means of simulation studies covering both a linear and a nonlinear regression model. The results show that the RSLS performs very well. We also provide guidelines for using the RSLS in practice and present a real-data example.


Keywords

Robust second-order least squares · Regression model · Autocorrelated errors · Ordinary least squares · Generalized least squares

Mathematics Subject Classification

62J05 · 62F35



Acknowledgements

The financial support for D. Rosadi from OeAD - ASEA-UNINET to initiate this project and from Hibah KLN - DIKTI 2016 to revise and finish the report is gratefully acknowledged. This work was done while D. Rosadi was visiting the Department of Statistics and Probability Theory, Vienna University of Technology, Austria. The authors thank the anonymous referee and the editor of this journal for their constructive comments and useful suggestions, which improved the quality and readability of this manuscript.


References

  1. Chen X, Tsao M, Zhou J (2012) Robust second-order least-squares estimator for regression models. Stat Pap 53(2):371–386
  2. Filzmoser P, Maronna R, Werner M (2008) Outlier identification in high dimensions. Comput Stat Data Anal 52:1694–1711
  3. Gujarati D (2003) Basic econometrics, 4th edn. McGraw-Hill, Singapore
  4. Maronna RA, Martin RD, Yohai VJ (2006) Robust statistics: theory and methods. Wiley, Chichester
  5. Prais SJ, Winsten CB (1954) Trend estimators and serial correlation. Cowles Commission discussion paper, statistics no. 383
  6. R Core Team (2016) R: a language and environment for statistical computing. R Foundation for Statistical Computing, Vienna
  7. Rocke D (1996) Robustness properties of S-estimators of multivariate location and shape in high dimension. Ann Stat 24(3):1327–1345
  8. Rosadi D, Peiris S (2014) Second-order least-squares estimation for regression with autocorrelated error. Comput Stat 29:931–943
  9. Rousseeuw PJ, Leroy AM (1987) Robust regression and outlier detection. Wiley, New York
  10. Rousseeuw PJ, van Driessen K (1999) A fast algorithm for the minimum covariance determinant estimator. Technometrics 41:212–223
  11. Wang L, Leblanc A (2008) Second-order nonlinear least squares estimation. Ann Inst Stat Math 60:883–900
  12. Yohai VJ (1987) High breakdown-point and high efficiency robust estimates for regression. Ann Stat 15:642–656

Copyright information

© Springer-Verlag Berlin Heidelberg 2016

Authors and Affiliations

  1. Department of Mathematics, Statistics and Computing Research Group, Gadjah Mada University, Yogyakarta, Indonesia
  2. Department of Statistics and Probability Theory, Vienna University of Technology, Vienna, Austria
