Computational Statistics, Volume 32, Issue 4, pp 1375–1393

An effective method to reduce the computational complexity of composite quantile regression

Original Paper

Abstract

In this article, we aim to reduce the computational complexity of the recently proposed composite quantile regression (CQR). We propose a new regression method, called infinitely composite quantile regression (ICQR), that avoids having to choose the number of uniform quantile positions. Unlike CQR, which combines a finite number of quantile positions, the proposed ICQR method combines infinitely many quantile positions over a continuum. We show that the ICQR criterion can be readily transformed into a linear programming problem. Furthermore, the computing time of the ICQR estimate is far less than that of CQR, though slightly larger than that of ordinary quantile regression. The oracle properties of the penalized ICQR are also established. Simulations are conducted to compare the different estimators, and a real data analysis illustrates their performance.
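The abstract notes that the ICQR criterion reduces to a linear program; the paper's own formulation is not reproduced here, but the analogous and well-known LP formulation of ordinary quantile regression (Koenker and Bassett 1978) illustrates the general idea. The sketch below is an assumption-laden illustration, not the authors' implementation: the helper name `quantile_regression_lp` and the use of SciPy's HiGHS solver are choices made for this example only.

```python
import numpy as np
from scipy.optimize import linprog

def quantile_regression_lp(X, y, tau=0.5):
    """Fit linear quantile regression at quantile `tau` via the standard LP:
        min  tau * sum(u) + (1 - tau) * sum(v)
        s.t. y = X beta + u - v,  u >= 0, v >= 0,  beta free,
    where u and v are the positive and negative parts of the residuals.
    (Illustrative helper; not the ICQR method of the paper.)"""
    n, p = X.shape
    # Decision vector: [beta (p entries, free), u (n entries), v (n entries)]
    c = np.concatenate([np.zeros(p), tau * np.ones(n), (1 - tau) * np.ones(n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])  # X beta + u - v = y
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

# Usage: median regression on noiseless data recovers the line exactly,
# since the optimal objective value is zero there.
x = np.linspace(0.0, 1.0, 20)
X = np.column_stack([np.ones_like(x), x])
y = 1.0 + 2.0 * x
beta = quantile_regression_lp(X, y, tau=0.5)
```

CQR would solve a similar program jointly over many quantile levels, which multiplies the number of LP variables; the paper's contribution is a formulation whose size does not grow with the number of quantile positions.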

Keywords

Quantile regression · Composite quantile regression · Computational complexity · Linear programming · Dual problem


Acknowledgements

The work was partially supported by the Major Research Projects of Philosophy and Social Science of the Chinese Ministry of Education (No. 15JZD015), the National Natural Science Foundation of China (No. 11271368), the Major Program of the Beijing Philosophy and Social Science Foundation of China (No. 15ZDA17), the Specialized Research Fund for the Doctoral Program of Higher Education of China, Ministry of Education (Grant No. 20130004110007), the Key Program of the National Philosophy and Social Science Foundation (No. 13AZD064), the Major Project of the Humanities and Social Science Foundation of the Ministry of Education (No. 15JJD910001), the Fundamental Research Funds for the Central Universities, and the Research Funds of Renmin University of China (No. 15XNL008).


Copyright information

© Springer-Verlag GmbH Germany 2017

Authors and Affiliations

  1. College of Science, Guangdong Ocean University, Zhanjiang, China
  2. Center for Applied Statistics, School of Statistics, Renmin University of China, Beijing, China
  3. School of Statistics, Lanzhou University of Finance and Economics, Lanzhou, China
  4. School of Statistics and Information, Xinjiang University of Finance and Economics, Ürümqi, China
