Advanced algorithms for penalized quantile and composite quantile regression

  • Original paper, published in Computational Statistics

Abstract

In this paper, we discuss a family of robust, high-dimensional regression models for quantile and composite quantile regression, both with and without an adaptive lasso penalty for variable selection. We reformulate these quantile regression problems and obtain estimators by applying the alternating direction method of multipliers (ADMM), majorization-minimization (MM), and coordinate descent (CD) algorithms. Our new approaches address the lack of publicly available methods for (composite) quantile regression, especially for high-dimensional data, both with and without regularization. Through simulation studies, we demonstrate the need for different algorithms applicable to a variety of data settings, which we implement in the cqrReg package for R. For comparison, we also introduce the widely used interior point (IP) formulation and test our methods against the IP algorithms in the existing quantreg package. Our simulation studies show that each of our methods, particularly MM and CD, excels in a different setting, such as with large or high-dimensional data sets, respectively, and outperforms the methods currently implemented in quantreg. The ADMM approach offers particular promise for future development given its amenability to parallelization and scalability.
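For concreteness, here is a minimal sketch, in standard notation rather than the paper's own, of the estimation problems the abstract refers to. Writing the check loss as \(\rho_\tau(u) = u(\tau - I(u < 0))\), adaptive-lasso-penalized quantile regression at level \(\tau\) solves

\[
\min_{\beta_0,\,\beta} \; \sum_{i=1}^{n} \rho_\tau\left(y_i - \beta_0 - x_i^\top \beta\right) + \lambda \sum_{j=1}^{p} w_j\,|\beta_j|,
\]

while composite quantile regression combines \(K\) quantile levels \(\tau_1 < \cdots < \tau_K\) through level-specific intercepts and a common slope vector:

\[
\min_{b_1,\dots,b_K,\,\beta} \; \sum_{k=1}^{K} \sum_{i=1}^{n} \rho_{\tau_k}\left(y_i - b_k - x_i^\top \beta\right) + \lambda \sum_{j=1}^{p} w_j\,|\beta_j|.
\]

Here \(\lambda \ge 0\) is a tuning parameter and the adaptive-lasso weights \(w_j\) (for example, \(w_j = 1/|\tilde{\beta}_j|^\gamma\) for a preliminary estimate \(\tilde{\beta}\) and \(\gamma > 0\)) follow Zou (2006); setting \(\lambda = 0\) recovers the unpenalized problems, and the ADMM, MM, and CD algorithms are alternative ways of minimizing these nonsmooth, nondifferentiable objectives.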

References

  • Boyd S, Parikh N, Chu E, Eckstein J (2011) Distributed optimization and statistical learning via the alternating direction method of multipliers. Found Trends Mach Learn 3(1):1–122

  • Chen C, Wei Y (2005) Computational issues for quantile regression. Sankhyā: Indian J Stat 67(2):399–417

  • Dempster A, Laird N, Rubin D (1977) Maximum likelihood from incomplete data via the EM algorithm. J R Stat Soc: Ser B (Methodol) 39(1):1–38

  • Eddelbuettel D, François R (2011) Rcpp: seamless R and C++ integration. J Stat Softw 40(8):1–18

  • Eddelbuettel D, Sanderson C (2014) RcppArmadillo: accelerating R with high-performance C++ linear algebra. Comput Stat Data Anal 71:1054–1063

  • Fan J, Li R (2001) Variable selection via nonconcave penalized likelihood and its oracle properties. J Am Stat Assoc 96(456):1348–1360

  • Friedman J, Hastie T, Höfling H, Tibshirani R (2007) Pathwise coordinate optimization. Ann Appl Stat 1(2):302–332

  • Friedman J, Hastie T, Tibshirani R (2010) Regularization paths for generalized linear models via coordinate descent. J Stat Softw 33(1):1–22

  • Gabay D, Mercier B (1976) A dual algorithm for the solution of nonlinear variational problems via finite element approximation. Comp Math Appl 2(1):17–40

  • Gao J, Kong L (2015) cqrReg: Quantile, composite quantile regression and regularized versions. https://CRAN.R-project.org/package=cqrReg, R package version 1.2. Accessed 2017

  • Gu Y, Fan J, Kong L, Ma S, Zou H (2018) ADMM for high-dimensional sparse penalized quantile regression. Technometrics 60(3):319–331

  • He Q, Kong L, Wang Y, Wang S, Chan T, Holland E (2016) Regularized quantile regression under heterogeneous sparsity with application to quantitative genetic traits. Comput Stat Data Anal 95:222–239

  • Hestenes M (1969) Multiplier and gradient methods. J Optim Theory Appl 4(5):303–320

  • Hunter D, Lange K (2000) Quantile regression via an MM algorithm. J Comput Graph Stat 9(1):60–77

  • Hunter D, Lange K (2004) A tutorial on MM algorithms. Am Stat 58(1):30–37

  • Hunter D, Li R (2005) Variable selection using MM algorithms. Ann Stat 33(4):1617–1642

  • Kai B, Li R, Zou H (2010) Local composite quantile regression smoothing: an efficient and safe alternative to local polynomial regression. J R Stat Soc: Ser B (Stat Methodol) 72(1):49–69

  • Koenker R (2005) Quantile regression. Cambridge University Press, Cambridge

  • Koenker R (2017) quantreg: Quantile regression. https://CRAN.R-project.org/package=quantreg, R package version 5.33. Accessed 2017

  • Koenker R, Bassett G (1978) Regression quantiles. Econometrica 46(1):33–50

  • Koenker R, Chernozhukov V, He X, Peng L (2018) Handbook of quantile regression. CRC Press, Boca Raton

  • Kong L, Shu H, Heo G, He QC (2015) Estimation for bivariate quantile varying coefficient model. arXiv:1511.02552

  • Li D, Li R (2016) Local composite quantile regression smoothing for Harris recurrent Markov processes. J Econom 194(1):44–56

  • Lin Z, Chen M, Ma Y (2010) The augmented Lagrange multiplier method for exact recovery of corrupted low-rank matrices. arXiv:1109.0367

  • Luo ZQ, Tseng P (1992) On the convergence of the coordinate descent method for convex differentiable minimization. J Optim Theory Appl 72(1):7–35

  • Mehrotra S (1992) On the implementation of a primal-dual interior point method. SIAM J Optim 2(4):575–601

  • Ortega J, Rheinboldt W (1970) Iterative solution of nonlinear equations in several variables. Academic Press, New York and London

  • Tibshirani R (1996) Regression shrinkage and selection via the lasso. J R Stat Soc: Ser B (Methodol) 58(1):267–288

  • Tseng P (2001) Convergence of a block coordinate descent method for nondifferentiable minimization. J Optim Theory Appl 109(3):475–494

  • Vidaurre D, Bielza C, Larrañaga P (2013) A survey of L1 regression. Int Stat Rev 81(3):361–387

  • Wu T, Lange K (2008) Coordinate descent algorithms for lasso penalized regression. Ann Appl Stat 2(1):224–244

  • Wu Y, Liu Y (2009) Variable selection in quantile regression. Stat Sin 19(2):801–817

  • Xu Q, Deng K, Jiang C, Sun F, Huang X (2017) Composite quantile regression neural network with applications. Expert Syst Appl 76:129–139

  • Yu L, Lin N (2017) ADMM for penalized quantile regression in big data. Int Stat Rev 85(3):494–518

  • Zhang L, Yu D, Mizera I, Jiang B, Kong L (2017) Sparse wavelet estimation in quantile regression with multiple functional predictors. arXiv:1706.02353

  • Zou H (2006) The adaptive lasso and its oracle properties. J Am Stat Assoc 101(476):1418–1429

  • Zou H, Yuan M (2008) Composite quantile regression and the oracle model selection theory. Ann Stat 36(3):1108–1126

Acknowledgements

Jueyu Gao acknowledges the supervision of Drs. Linglong Kong and Edit Gombay during his graduate studies. The authors have no conflicts of interest to declare.

Author information

Corresponding author

Correspondence to Linglong Kong.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Drs. Linglong Kong, Bei Jiang, and Di Niu are supported in part by the Natural Sciences and Engineering Research Council of Canada (NSERC).

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (pdf 211 KB)

About this article

Cite this article

Pietrosanu, M., Gao, J., Kong, L. et al. Advanced algorithms for penalized quantile and composite quantile regression. Comput Stat 36, 333–346 (2021). https://doi.org/10.1007/s00180-020-01010-1
