
Robust kernel ensemble regression in diversified kernel space with shared parameters

Published in Applied Intelligence.

Abstract

Kernel regression is an effective non-parametric regression method; however, such methods face the problem of choosing an appropriate kernel and its parameters. In this paper, we propose a robust kernel ensemble regression (RKER) model built in diversified multiple Reproducing Kernel Hilbert Spaces (RKHSs). Motivated by multi-view data processing, we treat each kernel representation as one view of the data and apply this multi-view modeling idea to the kernel regression scenario. RKER uses an ensemble strategy to combine multiple individual regressors into one, where each kernel regressor is associated with a weight that is learned directly from its view of the data without manual intervention. The kernel and parameter selection problem of traditional kernel regression is thus overcome by finding the best kernel combination across diversified multiple solution spaces. With this multi-view modeling, RKER achieves superior overall regression performance and is more robust to parameter selection. Furthermore, the parameters in the multiple RKHSs are learned with both individual-specific and shared structures. Experimental results on the Abalone and Facebook datasets demonstrate that the proposed RKER model outperforms state-of-the-art regression and ensemble methods such as Random Forest, Gradient Boosting Regressor, and eXtreme Gradient Boosting.
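The core idea described above can be illustrated with a minimal sketch: train several kernel ridge regressors, each in a different RKHS (one "view" per kernel choice), then learn non-negative combination weights for the ensemble from held-out data. This is an assumption-laden stand-in, not the authors' RKER formulation: the paper learns the weights and the view-specific/shared parameters jointly, whereas the sketch below uses scikit-learn's `KernelRidge` and a simple least-squares fit of the weights for illustration only.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

# Toy 1-D regression data: noisy sine wave.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
X_tr, y_tr = X[:150], y[:150]
X_val, y_val = X[150:], y[150:]

# One "view" per kernel: RBF kernels at several widths plus a polynomial kernel.
# Kernel choices and hyperparameters here are arbitrary, for illustration.
regressors = [KernelRidge(kernel="rbf", gamma=g, alpha=0.1) for g in (0.1, 1.0, 10.0)]
regressors.append(KernelRidge(kernel="polynomial", degree=3, alpha=0.1))
for r in regressors:
    r.fit(X_tr, y_tr)

# Learn non-negative, normalized combination weights on held-out data
# (least squares, then clipping) -- a crude proxy for RKER's learned
# per-view weights, which require no manual intervention.
P = np.column_stack([r.predict(X_val) for r in regressors])
w, *_ = np.linalg.lstsq(P, y_val, rcond=None)
w = np.clip(w, 0.0, None)
w /= w.sum()

def ensemble_predict(X_new):
    """Weighted combination of the individual kernel regressors."""
    return np.column_stack([r.predict(X_new) for r in regressors]) @ w
```

Because the weights are estimated from data rather than fixed by hand, a poorly chosen kernel simply receives a small weight, which is the intuition behind the robustness-to-parameter-selection claim.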



Acknowledgements

This research was funded in part by the Primary Research & Development Plan of Jiangsu Province (BE2018627).

Author information

Corresponding authors

Correspondence to Xiang-Jun Shen or Yu-bao Cui.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Liu, Zf., Chen, L., Mehta, S. et al. Robust kernel ensemble regression in diversified kernel space with shared parameters. Appl Intell 53, 1051–1067 (2023). https://doi.org/10.1007/s10489-022-03492-6

