Abstract
This chapter is not intended to replace the introductory literature on kernel methods or Fisher kernels; there are already excellent textbooks and tutorials on the topic by Schölkopf and Smola (Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. MIT Press (2002) [1]), Shawe-Taylor and Cristianini (Kernel Methods for Pattern Analysis. Cambridge University Press (2004) [2]), and Kung (Kernel Methods and Machine Learning. Cambridge University Press (2014) [3]). Rather than presenting formal theory and proofs, this chapter briefly traces the evolution of kernel methods and the heuristics and techniques that have helped them develop over the years to address the challenges faced by today's machine learning practitioners and applied scientists.
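As a minimal illustration (not drawn from the chapter itself), the central idea behind kernel methods is the kernel trick: an algorithm that only needs inner products between data points can work in a rich implicit feature space by evaluating a kernel function instead of an explicit feature map. A sketch using the Gaussian (RBF) kernel, with an arbitrary choice of bandwidth parameter:

```python
import math

def rbf_kernel(x, y, gamma=0.5):
    # k(x, y) = exp(-gamma * ||x - y||^2): an inner product in an
    # implicit (infinite-dimensional) feature space, evaluated without
    # ever constructing that feature space explicitly.
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

# Three toy 2-D points; gamma=0.5 is an illustrative choice.
points = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0)]

# The Gram matrix of pairwise kernel values is all a kernel machine
# (e.g. an SVM) needs; it is symmetric with ones on the diagonal.
gram = [[rbf_kernel(p, q) for q in points] for p in points]
```

Many of the methods cited below (multiple kernel learning, Nyström approximation, random features) can be read as different answers to the question of how to choose, combine, or cheaply approximate this Gram matrix at scale.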
References
Schölkopf, B., Smola, A.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. MIT Press, USA (2002)
Shawe-Taylor, J., Cristianini, N.: Kernel Methods for Pattern Analysis. Cambridge University Press, New York (2004)
Kung, S.Y.: Kernel Methods and Machine Learning. Cambridge University Press, New York (2014)
Minsky, M., Papert, S.: Perceptrons: An Introduction to Computational Geometry. MIT Press, USA (2017)
Bach, F., Lanckriet, G., Jordan, M.: Multiple kernel learning, conic duality, and the SMO algorithm. In: Proceedings of the Twenty-First International Conference on Machine Learning (2004)
Sonnenburg, S., Rätsch, G., Schäfer, C.: A general and efficient multiple kernel learning algorithm. In: Advances in Neural Information Processing Systems, pp. 1273–1280 (2006)
Sonnenburg, S., Rätsch, G., Schäfer, C., et al.: Large scale multiple kernel learning. J. Mach. Learn. Res. 7, 1531–1565 (2006)
Xu, Z., Jin, R., King, I., Lyu, M.: An extended level method for efficient multiple kernel learning. In: Advances in Neural Information Processing Systems, pp. 1825–1832 (2009)
Rakotomamonjy, A., Bach, F., Canu, S., et al.: More efficiency in multiple kernel learning. In: Proceedings of the 24th International Conference on Machine Learning, pp. 775–782. ACM (2007)
Gehler, P., Nowozin, S.: Infinite kernel learning. Technical report, TR-178 (2008)
Gönen, M., Alpaydin, E.: Localized multiple kernel learning. In: Proceedings of the 25th International Conference on Machine Learning, pp. 352–359. ACM (2008)
Varma, M., Babu, R.: More generality in efficient multiple kernel learning. In: Proceedings of the 26th Annual International Conference on Machine Learning, pp. 1065–1072. ACM (2009)
Cho, Y., Saul, L.: Kernel methods for deep learning. In: Advances in Neural Information Processing Systems, pp. 342–350 (2009)
Zhuang, J., Tsang, I., et al.: Two-layer multiple kernel learning. In: Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, pp. 909–917 (2011)
Lu, Z., May, A., Liu, K., et al.: How to scale up kernel methods to be as good as deep neural nets (2014). arXiv preprint arXiv:1411.4000
Wu, J., Zheng, W., Lai, J.: Approximate kernel competitive learning. Neural Netw. 63, 117–132 (2015)
Pourkamali, F., Becker, S.: Randomized clustered Nyström for large-scale kernel machines (2016). arXiv preprint arXiv:1612.06470
Rahimi, A., Recht, B.: Random features for large-scale kernel machines. In: Advances in Neural Information Processing Systems, pp. 1177–1184 (2008)
Rudi, A., Carratino, L., Rosasco, L.: FALKON: An optimal large scale kernel method. In: Advances in Neural Information Processing Systems, pp. 3891–3901 (2017)
Franc, V., Sonnenburg, S.: Optimized cutting plane algorithm for large-scale risk minimization. J. Mach. Learn. Res. 2157–2192 (2009)
Hsieh, C., Chang, K., Lin, C., et al.: A dual coordinate descent method for large-scale linear SVM. In: Proceedings of the 25th International Conference on Machine Learning, pp. 408–415. ACM (2008)
Copyright information
© 2018 The Author(s), under exclusive licence to Springer Nature Switzerland AG
Cite this chapter
Azim, T., Ahmed, S. (2018). Kernel Based Learning: A Pragmatic Approach in the Face of New Challenges. In: Composing Fisher Kernels from Deep Neural Models. SpringerBriefs in Computer Science. Springer, Cham. https://doi.org/10.1007/978-3-319-98524-4_1
DOI: https://doi.org/10.1007/978-3-319-98524-4_1
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-98523-7
Online ISBN: 978-3-319-98524-4
eBook Packages: Computer Science, Computer Science (R0)