Abstract
In the literature, the idea of kernel machines has been introduced to quantile regression, resulting in the kernel quantile regression (KQR) model, which can fit nonlinear models flexibly. However, the formulation of KQR leads to a quadratic program that is computationally expensive to solve. This paper proposes a fast training algorithm for KQR based on the majorization-minimization (MM) approach, in which each iteration derives an upper bound on the objective function that is easier to minimize. The proposed approach is easy to implement and requires no special computing package beyond basic linear algebra operations. Numerical studies on simulated and real-world datasets show that, compared to the original quadratic-programming-based KQR, the proposed approach achieves essentially the same prediction accuracy with substantially higher training-time efficiency.
Notes
In principle, the parameters C and \(\sigma \) are selected by a model selection criterion such as cross-validation. However, cross-validation would be too time-consuming here, since each run of QP-KQR itself takes a long time, as shown in our results. This is another reason we fix the parameters.
References
Anand, P., Rastogi, R., & Chandra, S. (2019a). A \(\nu \)-support vector quantile regression model with automatic accuracy control. arXiv:1910.09168.
Anand, P., Rastogi, R., & Chandra, S. (2019b). A new asymmetric \(\epsilon \)-insensitive pinball loss function based support vector quantile regression model. arXiv:1908.06923.
Bang, S., Eo, S.-H., Jhun, M., & Cho, H. J. (2017). Composite kernel quantile regression. Communications in Statistics–Simulation and Computation, 46(3), 2228–2240.
Boyd, S., Parikh, N., Chu, E., & Eckstein, J. (2011). Distributed optimization and statistical learning via the alternating direction method of multipliers. Foundations and Trends in Machine Learning, 3(1), 1–122.
Chang, C. C., & Lin, C. J. (2011). LIBSVM: a library for support vector machines. ACM Transactions on Intelligent Systems and Technology, 2(3), 27:1-27:27.
Crambes, C., Gannoun, A., & Henchiri, Y. (2013). Support vector machine quantile regression approach for functional data: Simulation and application studies. Journal of Multivariate Analysis, 121, 50–68.
Gunn, S.R. (1997). Support vector machines for classification and regression. Technical Report, Image Speech and Intelligent Systems Research Group, University of Southampton, available at http://users.ecs.soton.ac.uk/srg/publications/pdf/SVM.pdf.
Huang, X., Shi, L., & Suykens, J. (2014). Support vector machine classifier with pinball loss. IEEE Transactions on Pattern Analysis and Machine Intelligence, 36, 984–997.
Hunter, D. R., & Lange, K. (2000). Quantile regression via an MM algorithm. Journal of Computational and Graphical Statistics, 9(1), 60–77.
Hunter, D. R., & Lange, K. (2004). A tutorial on MM algorithms. The American Statistician, 58(1), 30–37.
Hwang, C., & Shim, J. (2005). A simple quantile regression via support vector machine. Lecture Notes in Computer Science, 3610(2005), 512–520.
Hwang, C., & Shim, J. (2010). Support vector quantile regression with weighted quadratic loss function. Communications of the Korean Statistical Society, 17(2), 183–191.
Joachims, T. (1999). Making large-scale SVM learning practical. In B. Schölkopf, C. Burges, & A. Smola (Eds.), Advances in Kernel Methods–Support Vector Learning. Cambridge: MIT Press.
Kimeldorf, G. S., & Wahba, G. (1971). Some results on Tchebycheffian spline functions. Journal of Mathematical Analysis and Applications, 33(1), 82–95.
Koenker, R. (2005). Quantile Regression. New York: Cambridge University Press.
Koenker, R., & Bassett, G. (1978). Regression quantiles. Econometrica, 46, 33–50.
Li, Y., Liu, Y., & Zhu, J. (2007). Quantile regression in reproducing kernel Hilbert spaces. Journal of the American Statistical Association, 102, 255–268.
Nguyen, H. D. (2017). An introduction to MM algorithms for machine learning and statistical estimation. WIREs Data Mining and Knowledge Discovery, 7(2), e1198.
Osuna, E., Freund, R., & Girosi, F. (1997a). An improved training algorithm for support vector machines. In Proceedings of the IEEE Workshop on Neural Networks for Signal Processing (pp. 276–285).
Osuna, E., Freund, R., & Girosi, F. (1997b). Training support vector machines: An application to face detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition.
Pietrosanu, M., Gao, J., Kong, L., Jiang, B., & Niu, D. (2021). Advanced algorithms for penalized quantile and composite quantile regression. Computational Statistics, 36, 333–346.
Platt, J. (1998). Fast training of support vector machines using sequential minimal optimization. In B. Schölkopf, C. Burges, & A. Smola (Eds.), Advances in Kernel Methods—Support Vector Learning. Cambridge: MIT Press.
Schölkopf, B., & Smola, A. (2002). Learning with Kernels. Cambridge, MA: MIT Press.
Schölkopf, B., Smola, A., Williamson, R. C., & Bartlett, P. L. (2000). New support vector algorithms. Neural Computation, 12, 1207–1245.
Shim, J., Hwang, C., & Seok, K. (2016). Support vector quantile regression with varying coefficients. Computational Statistics, 31(3), 1015–1030.
Shim, J., Seok, K., & Hwang, C. (2017). Monotone support vector quantile regression. Communications in Statistics—Theory and Methods, 46(10), 5180–5193.
Shim, J., Seok, K. H., Hwang, C., & Cho, D. (2011). Support vector quantile regression using asymmetric \(\epsilon \)-insensitive loss function. Communications of the Korean Statistical Society, 18(2), 165–170.
Smola, A. J., & Schölkopf, B. (2004). A tutorial on support vector regression. Statistics and Computing, 14(3), 199–222.
Sohn, I., Kim, S., Hwang, C., & Lee, J. W. (2008). New normalization methods using support vector machine quantile regression approach in microarray analysis. Computational Statistics & Data Analysis, 52(8), 4104–4115.
Sohn, I., Kim, S., Hwang, C., Lee, J. W., & Shim, J. (2008). Support vector machine quantile regression for detecting differentially expressed genes in microarray analysis. Methods of Information in Medicine, 47(5), 459–467.
Sun, Y., Babu, P., & Palomar, D. P. (2017). Majorization-minimization algorithms in signal processing, communications, and machine learning. IEEE Transactions on Signal Processing, 65(3), 794–816.
Takeuchi, I., Le, Q. V., Sears, T. D., & Smola, A. J. (2006). Nonparametric quantile estimation. Journal of Machine Learning Research, 7, 1231–1264.
Vapnik, V. (1998). Statistical Learning Theory. New York: Wiley.
Xu, Q., Zhang, J., Jiang, C., Huang, X., & He, Y. (2015). Weighted quantile regression via support vector machine. Expert Systems with Applications, 42(13), 5441–5451.
Yang, Y., Zhang, T., & Zou, H. (2018). Flexible expectile regression in reproducing kernel Hilbert space. Technometrics, 60(1), 26–35.
Yuan, M. (2006). GACV for quantile smoothing splines. Computational Statistics & Data Analysis, 50(3), 813–829.
Zou, H., & Yuan, M. (2008). Composite quantile regression and the oracle model selection theory. Annals of Statistics, 36(3), 1108–1126.
Acknowledgements
The author would like to extend his sincere gratitude to the anonymous reviewers for their constructive suggestions and comments, which have greatly helped improve the quality of this paper. This work was supported by a Summer Faculty Fellowship from Missouri State University.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary Information
About this article
Cite this article
Zheng, S. Fast quantile regression in reproducing kernel Hilbert space. J. Korean Stat. Soc. 51, 568–588 (2022). https://doi.org/10.1007/s42952-021-00154-0
Keywords
- Kernel quantile regression
- Quadratic programming
- MM-algorithm