Abstract
To improve the generalization performance of support vector regression (SVR), we propose a novel model combination method for SVR based on the regularization path. First, we construct the initial candidate model set from the regularization path, whose inherent piecewise linearity makes the construction easy and effective. Then, we carefully select the models for combination from the initial model set using an improved Occam's Window method and an input-dependent strategy. Finally, we combine the selected models via Bayesian model averaging. Experimental results on benchmark data sets show that our combination method has a significant advantage over model selection methods based on generalized cross validation (GCV) and the Bayesian information criterion (BIC). The results also verify that the improved Occam's Window method and the input-dependent strategy enhance the predictive performance of the combination model.
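The selection-and-combination pipeline the abstract outlines can be illustrated with a minimal sketch. This is not the paper's implementation: the candidate models, their marginal likelihoods (`evidences`), and the window factor `c` are all hypothetical stand-ins for the SVR models that would be taken from the regularization path.

```python
import numpy as np

def occams_window_select(models, evidences, c=20.0):
    """Classic Occam's Window rule: keep models whose (approximate)
    posterior evidence is within a factor `c` of the best model's."""
    evidences = np.asarray(evidences, dtype=float)
    keep = evidences >= evidences.max() / c
    return [m for m, k in zip(models, keep) if k], evidences[keep]

def bma_predict(models, evidences, x):
    """Bayesian model averaging: weight each model's prediction by its
    normalized posterior evidence."""
    w = np.asarray(evidences, dtype=float)
    w = w / w.sum()
    preds = np.array([m(x) for m in models])
    return float(np.dot(w, preds))

# Toy usage: three hypothetical candidate predictors (simple linear maps
# standing in for SVR models along the regularization path) with assumed
# evidences. The third model falls outside the window and is discarded.
models = [lambda x: 2.0 * x, lambda x: 1.9 * x + 0.1, lambda x: -5.0 * x]
evidences = [1.0, 0.8, 0.001]
kept, kept_ev = occams_window_select(models, evidences, c=20.0)
print(len(kept))                     # 2
print(bma_predict(kept, kept_ev, 1.0))  # 2.0
```

The input-dependent strategy described in the paper would additionally vary the selected subset with the query point `x`; this sketch applies a single global window for simplicity.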
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Wang, M., Liao, S. (2012). Model Combination for Support Vector Regression via Regularization Path. In: Anthony, P., Ishizuka, M., Lukose, D. (eds) PRICAI 2012: Trends in Artificial Intelligence. Lecture Notes in Computer Science, vol 7458. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-32695-0_57
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-32694-3
Online ISBN: 978-3-642-32695-0