
Open Source Knowledge Base for Machine Learning Practitioners

A chapter in: Composing Fisher Kernels from Deep Neural Models

Part of the book series: SpringerBriefs in Computer Science

Abstract

In this chapter, we provide references to some of the most useful resources that can give practitioners a quick start in learning and implementing a variety of deep learning models, kernel functions, Fisher vector encodings, and feature condensation techniques. Users can benefit not only from the open source code, but also from a rich collection of benchmark data sets and tutorials that provide all the details needed to gain hands-on experience with the techniques discussed in this book. We share a comparative analysis of the resources in tabular form so that users can pick tools suited to their programming expertise, software/hardware dependencies, and productivity goals.
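
To ground the Fisher vector encodings that this chapter's resources implement, the following minimal sketch computes a plain (unnormalized) Fisher vector under a diagonal-covariance Gaussian mixture model. It is an illustrative reconstruction, not code from any toolbox listed below: the helper name fisher_vector is ours, and scikit-learn's GaussianMixture is assumed as a convenient stand-in for the GMM fitting offered by libraries such as VLFeat [46] or Yael [47].

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fisher_vector(X, gmm):
    """Plain Fisher vector of descriptors X (N x D) under a diagonal GMM.

    Concatenates the normalized gradients of the log-likelihood with
    respect to the component means and standard deviations.
    """
    N = X.shape[0]
    gamma = gmm.predict_proba(X)                 # (N, K) soft assignments
    mu, var, w = gmm.means_, gmm.covariances_, gmm.weights_
    diff = (X[:, None, :] - mu[None, :, :]) / np.sqrt(var)[None, :, :]  # (N, K, D)
    # Gradient w.r.t. component means, scaled by the usual Fisher
    # information approximation 1 / sqrt(w_k)
    g_mu = np.einsum('nk,nkd->kd', gamma, diff) / (N * np.sqrt(w)[:, None])
    # Gradient w.r.t. component standard deviations
    g_sd = np.einsum('nk,nkd->kd', gamma, diff**2 - 1.0) / (N * np.sqrt(2.0 * w)[:, None])
    return np.concatenate([g_mu.ravel(), g_sd.ravel()])

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                    # 200 toy 8-D descriptors
gmm = GaussianMixture(n_components=4, covariance_type='diag',
                      random_state=0).fit(X)
fv = fisher_vector(X, gmm)
print(fv.shape)                                  # (64,) = 2 * K * D
```

In practice the encoding is usually followed by power and L2 normalization (the "improved" Fisher vector) before being fed to a linear classifier.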


References

  1. LeCun, Y.: The MNIST database of handwritten digits (1998). http://yann.lecun.com/exdb/mnist/

  2. LeCun, Y.: USPS dataset. http://www.cad.zju.edu.cn/home/dengcai/Data/MLData.html

  3. Netzer, Y., Wang, T., Coates, A., et al.: Reading digits in natural images with unsupervised feature learning. In: NIPS Workshop on Deep Learning and Unsupervised Feature Learning, vol. 2011 (2011). http://ufldl.stanford.edu/housenumbers/

  4. Nene, S., Nayar, S., Murase, H.: Columbia object image library (COIL-20) (1996). http://www.cs.columbia.edu/CAVE/software/softlib/coil-20.php

  5. Nene, S., Nayar, S., Murase, H.: Columbia object image library (COIL-100). http://www.cs.columbia.edu/CAVE/software/softlib/coil-100.php

  6. Coates, A., Lee, H., Ng, A.: An analysis of single-layer networks in unsupervised feature learning. In: Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, pp. 215–223 (2011). https://cs.stanford.edu/~acoates/stl10/

  7. Krizhevsky, A., Nair, V., Hinton, G.: The CIFAR dataset (2014). https://www.cs.toronto.edu/~kriz/cifar.html

  8. Fei-Fei, L., Fergus, R., Perona, P.: Learning generative visual models from few training examples: an incremental Bayesian approach tested on 101 object categories. Comput. Vision Image Underst. 106, 59–70 (2007). http://www.vision.caltech.edu/Image_Datasets/Caltech101/

  9. Griffin, G., Holub, A., Perona, P.: Caltech-256 object category dataset (2007). http://www.vision.caltech.edu/Image_Datasets/Caltech256/

  10. Marlin, B., Swersky, K., et al.: Inductive principles for restricted Boltzmann machine learning. In: Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, pp. 509–516 (2010). https://people.cs.umass.edu/~marlin/data.shtml

  11. Everingham, M., Gool, L., Williams, C., et al.: The PASCAL visual object classes (VOC) challenge. Int. J. Comput. Vision 88, 303–338 (2010). http://host.robots.ox.ac.uk/pascal/VOC/

  12. Deng, J., Dong, W., Socher, R., et al.: ImageNet: a large-scale hierarchical image database. In: IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 248–255. IEEE (2009). http://www.image-net.org/

  13. Computer Laboratory, University of Cambridge: The ORL database of faces. http://www.cl.cam.ac.uk/research/dtg/attarchive/facedatabase.html

  14. Graham, D., Allinson, N., et al.: Characterising virtual eigensignatures for general purpose face recognition. In: Face Recognition, pp. 446–456. Springer (1998). https://cs.nyu.edu/~roweis/data.html

  15. Martinez, A., Benavente, R.: The AR face database, 1998. Comput. Vision Cent. Technical Report 3 (2007). http://www2.ece.ohio-state.edu/~aleix/ARdatabase

  16. Peer, P.: CVL face database. Computer Vision Lab, Faculty of Computer and Information Science, University of Ljubljana, Slovenia (2005). http://www.lrv.fri.uni-lj.si/facedb.html

  17. Gross, R., Matthews, I., Cohn, J., et al.: The CMU multi-pose, illumination, and expression (Multi-PIE) face database. CMU Robotics Institute. TR-07-08, Technical Report (2007). http://www.cs.cmu.edu/afs/cs/project/PIE/MultiPie/Multi-Pie/Home.html

  18. Goh, R., Liu, L., Liu, X.: The CMU face in action (FIA) database. In: International Workshop on Analysis and Modeling of Faces and Gestures, pp. 255–263. Springer (2005). https://www.flintbox.com/public/project/5486/

  19. Phillips, P., Wechsler, H., Huang, J., Rauss, P.: The FERET database and evaluation procedure for face-recognition algorithms. Image Vision Comput. 16, 295–306 (1998). https://www.nist.gov/itl/iad/image-group/color-feret-database

  20. Georghiades, A., Belhumeur, P., Kriegman, D.: The Yale face database. http://vision.ucsd.edu/datasets/yale_face_dataset_original/yalefaces.zip

  21. Georghiades, A., Belhumeur, P., Kriegman, D.: From few to many: illumination cone models for face recognition under variable lighting and pose. IEEE Trans. Pattern Anal. Mach. Intell. 23, 643–660 (2001). http://vision.ucsd.edu/~iskwak/ExtYaleDatabase/ExtYaleB.html

  22. Lang, K.: The 20 newsgroups dataset. http://qwone.com/~jason/20Newsgroups/

  23. Lewis, D.: Reuters-21578 dataset. http://www.daviddlewis.com/resources/testcollections/reuters21578/

  24. Maas, A., Daly, R., Pham, P., et al.: Learning word vectors for sentiment analysis. In: Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies, Portland, Oregon, USA, pp. 142–150 (2011). http://www.aclweb.org/anthology/P11-1015

  25. Fan, R., Chang, K., Hsieh, C., Wang, X., Lin, C.: LIBLINEAR: a library for large linear classification. J. Mach. Learn. Res. 9, 1871–1874 (2008)


  26. Joachims, T.: SVMlight: support vector machine. 19(4) (1999). http://svmlight.joachims.org/

  27. Djuric, N., Lan, L., Vucetic, S.: BudgetedSVM: a toolbox for scalable SVM approximations. J. Mach. Learn. Res., 3813–3817 (2013). https://sourceforge.net/p/budgetedsvm/code/ci/master/tree/matlab/

  28. Mangasarian, O., Wild, E.: Proximal support vector machine classifiers. In: Proceedings KDD-2001: Knowledge Discovery and Data Mining, pp. 77–86 (2001). http://research.cs.wisc.edu/dmi/svm/psvm/

  29. Hsieh, C., Si, S., Dhillon, I.: A Divide-and-conquer solver for kernel support vector machines. In: International Conference on Machine Learning (2014). http://www.cs.utexas.edu/~cjhsieh/dcsvm/

  30. Suykens, J., Pelckmans, K.: Least squares support vector machines. Neural Process. Lett., 293–300 (1999). https://www.esat.kuleuven.be/sista/lssvmlab/

  31. Rakotomamonjy, A., Canu, S.: SVM and kernel methods MATLAB toolbox (2008). http://asi.insa-rouen.fr/enseignants/~arakoto/toolbox/

  32. Franc, V., Hlavac, V.: Statistical pattern recognition toolbox for MATLAB. Prague, Czech: Center for Machine Perception, Czech Technical University (2004). https://cmp.felk.cvut.cz/cmp/software/stprtool/

  33. Weston, J., Elisseeff, A., Bak, G.: Spider SVM toolbox (2006). http://people.kyb.tuebingen.mpg.de/spider/

  34. Hsu, C.W., Lin, C.J.: BSVM-2.06 (2009). https://www.csie.ntu.edu.tw/~cjlin/bsvm/

  35. Rüping, S.: mySVM - a support vector machine (2004). http://www-ai.cs.uni-dortmund.de/SOFTWARE/MYSVM/index.html

  36. Bottou, L., Bordes, A., Ertekin, S.: LASVM (2009). http://leon.bottou.org/projects/lasvm#introduction

  37. Daumé III, H.: SVMsequel documentation. http://legacydirs.umiacs.umd.edu/~hal/SVMsequel/

  38. Collobert, R., Bengio, S.: SVMTorch: support vector machines for large-scale regression problems. J. Mach. Learn. Res. (2001). http://bengio.abracadoudou.com/SVMTorch.html

  39. Chang, C., Lin, C.: LIBSVM: a library for support vector machines. ACM Trans. Intell. Syst. Technol., 27:1–27:27 (2011). http://www.csie.ntu.edu.tw/~cjlin/libsvm

  40. Wen, Z., Shi, J., He, B., et al.: ThunderSVM: a fast SVM library on GPUs and CPUs. https://github.com/zeyiwen/thundersvm

  41. Carpenter, A.: CUSVM: a CUDA implementation of support vector classification and regression, pp. 1–9 (2009). http://patternsonascreen.net/cuSVM.html

  42. Cotter, A., Srebro, N., Keshet, J.: A GPU-tailored approach for training kernelized SVMs. In: Proceedings of the 17th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 805–813 (2011). http://ttic.uchicago.edu/~cotter/projects/gtsvm/

  43. Serafini, T., Zanni, L., Zanghirati, G.: Parallel GPDT: a parallel gradient projection-based decomposition technique for support vector machines (2004). http://dm.unife.it/gpdt/

  44. Lopes, N., Ribeiro, B.: GPUMLib: a new library to combine machine learning algorithms with graphics processing units. In: 2010 10th International Conference on Hybrid Intelligent Systems (HIS), pp. 229–232 (2010). https://sourceforge.net/projects/gpumlib/?source=typ_redirect

  45. Wang, Z., Chu, T., Choate, L., et al.: Rgtsvm: support vector machines on a GPU in R. ArXiv Preprint ArXiv:1706.05544 (2017). https://github.com/Danko-Lab/Rgtsvm

  46. Vedaldi, A., Fulkerson, B.: VLFeat: an open and portable library of computer vision algorithms. In: Proceedings of the 18th ACM International Conference on Multimedia, pp. 1469–1472 (2010). http://www.vlfeat.org/install-matlab.html

  47. Jegou, H., Douze, M.: The yael library. In: Proceedings of the 22nd ACM International Conference on Multimedia, pp. 687–690 (2014). https://gforge.inria.fr/projects/yael/

  48. Maaten, L.: Fisher kernel learning. https://lvdmaaten.github.io/fisher/Fisher_Kernel_Learning.html

  49. Kolacek, J., Zelinka, J.: Kernel smoothing in MATLAB: theory and practice of kernel smoothing (2012). http://www.math.muni.cz/english/science-and-research/developed-software/232-matlab-toolbox.html

  50. Sonnenburg, S., Rätsch, G., Henschel, S., et al.: The SHOGUN machine learning toolbox. J. Mach. Learn. Res. 11, 1799–1802 (2010)

  51. Allauzen, C., Mohri, M., Rostamizadeh, A.: Openkernel library (2007). http://www.openkernel.org/twiki/bin/view/Kernel/WebHome

  52. Orabona, F.: DOGMA: A MATLAB toolbox for online learning (2009). http://dogma.sourceforge.net

  53. Sun, Z., Ampornpunt, N., Varma, M., Vishwanathan, S.: Multiple kernel learning and the SMO algorithm. In: Advances in Neural Information Processing Systems (2010). http://manikvarma.org/code/SMO-MKL/download.html

  54. Gonen, M., Alpaydin, E.: Multiple kernel learning algorithms. J. Mach. Learn. Res. (2011). https://users.ics.aalto.fi/gonen/jmlr11.php

  55. Tsai, M.H.: LIBLINEAR MKL: a fast multiple kernel learning L1/L2-loss SVM solver in MATLAB. https://www.csie.ntu.edu.tw/~b95028/software/liblinear-mkl/

  56. Varma, M., Babu, R.: More generality in efficient multiple kernel learning. In: Proceedings of the 26th Annual International Conference on Machine Learning, pp. 1065–1072 (2009). http://manikvarma.org/code/GMKL/download.html

  57. Gonen, M., Alpaydin, E.: Localized algorithms for multiple kernel learning. Pattern Recognit. (2013). https://users.ics.aalto.fi/gonen/icpr10.php

  58. Strobl, E., Visweswaran, S.: Deep multiple kernel learning. In: 2013 12th International Conference on Machine Learning and Applications (ICMLA) (2013). https://github.com/ericstrobl/deepMKL

  59. Shawe-Taylor, J.: Kernel methods for pattern analysis (2004). https://www.kernel-methods.net/matlab_tools

  60. Chen, M.: Pattern recognition and machine learning toolbox. MATLAB Central File Exchange (2016). https://www.mathworks.com/matlabcentral/fileexchange/55826-pattern-recognition-and-machine-learning-toolbox

  61. Salakhutdinov, R., Hinton, G.: Deep Boltzmann machines. In: Proceedings of the International Conference on Artificial Intelligence and Statistics, vol. 5, pp. 448–455 (2009). http://www.cs.toronto.edu/~rsalakhu/DBM.html

  62. Rasmusbergpalm: Restricted Boltzmann Machine. https://code.google.com/archive/p/matrbm/

  63. Salakhutdinov, R., Hinton, G.: Restricted Boltzmann machines for collaborative filtering. In: Proceedings of the 24th International Conference on Machine Learning, pp. 791–798 (2007). http://www.cs.toronto.edu/~rsalakhu/rbm_ais.html/

  64. Gallamine, W.: Deep belief network. https://github.com/gallamine/DBN

  65. Demuth, H., Beale, M.: Neural Network Toolbox for Use with MATLAB - User's Guide, version 3.0 (1993). https://www.mathworks.com/help/nnet/getting-started-with-neural-network-toolbox.html

  66. Srivastava, N.: DeepNet: a library of deep learning algorithms. http://www.cs.toronto.edu/~nitish/deepnet

  67. Krizhevsky, A.: Cuda-convnet: High-performance C++/Cuda implementation of convolutional neural networks (2012). https://code.google.com/archive/p/cuda-convnet2/

  68. Abadi, M., Barham, P., et al.: TensorFlow: a system for large-scale machine learning. In: OSDI, vol. 16, pp. 265–283 (2016). https://www.tensorflow.org/

  69. Collobert, R., Kavukcuoglu, K., Farabet, C.: Torch. In: Workshop on Machine Learning Open Source Software, NIPS, vol. 113 (2008). http://torch.ch/

  70. Seide, F.: Keynote: the computer science behind the Microsoft cognitive toolkit: an open source large-scale deep learning toolkit for windows and linux. In: IEEE/ACM International Symposium on Code Generation and Optimization (CGO), pp. xi–xi (2017). https://www.microsoft.com/en-us/cognitive-toolkit/

  71. Bergstra, J., Bastien, F., et al.: Theano: deep learning on GPUs with Python. In: NIPS 2011, Big Learning Workshop, Granada, Spain, vol. 3, pp. 1–48 (2011). http://deeplearning.net/software/theano/

  72. Jia, Y., Shelhamer, E., et al.: Caffe: convolutional architecture for fast feature embedding. In: Proceedings of the 22nd ACM International Conference on Multimedia, pp. 675–678 (2014) http://caffe.berkeleyvision.org/

  73. Chollet, F.: Keras (2015). https://keras.io/

  74. Chen, T., Li, M., Li, Y., et al.: Mxnet: A flexible and efficient machine learning library for heterogeneous distributed systems. ArXiv Preprint ArXiv:1512.01274 (2015). https://mxnet.apache.org/

  75. Gibson, A., Nicholson, C., Patterson, J.: Deeplearning4j: open-source distributed deep learning for the JVM. Apache Softw. Found. License 2 (2016). https://deeplearning4j.org/

  76. Tokui, S., Oono, K., Hido, S.: Chainer: a next-generation open source framework for deep learning. In: Proceedings of Workshop on Machine Learning Systems in The Twenty-Ninth Annual Conference on Neural Information Processing Systems (NIPS) (2015). https://chainer.org/

  77. Nervana Systems: Neon. https://neon.nervanasys.com/index.html/

  78. Ye, C., Zhao, C., Yang, Y., Fermüller, C.: LightNet: a versatile, standalone MATLAB-based environment for deep learning. In: Proceedings of the 2016 ACM on Multimedia Conference, pp. 1156–1159 (2016). https://github.com/yechengxi/LightNet

  79. Chin, B., Lee, K., Wang, S., et al.: SINGA: a distributed deep learning platform. In: Proceedings of the 23rd ACM International Conference on Multimedia. pp. 685–688 (2015). https://singa.incubator.apache.org/en/index.html

  80. Yan, K.: Feature selection toolbox. https://www.mathworks.com/matlabcentral/fileexchange/56723-yan-prtools

  81. Duin, R.P.W.: Prtools Version 3.0: a matlab toolbox for pattern recognition. In: Proceedings of the SPIE (2000). http://prtools.org/software/

  82. Somol, P., Vacha, P., Mikes, S., et al.: Introduction to feature selection toolbox 3–the C++ library for subset search, data modeling and classification. Research Report for Institute of Information Theory and Automation, Academy of Sciences of the Czech Republic (2010). http://fst.utia.cz/?fst3

  83. Sugiyama Lab, University of Tokyo: Maximum likelihood feature selection (MLFS). http://www.ms.k.u-tokyo.ac.jp/software.html#MLFS

  84. Kanamori, T., Sugiyama, M.: A least-squares approach to direct importance estimation. J. Mach. Learn. Res. 10, 1391–1445 (2009). http://www.ms.k.u-tokyo.ac.jp/software.html#LSFS

  85. Jitkrittum, W., Sugiyama, M.: Feature selection Via L1-penalized squared-loss mutual information. IEICE Trans. Inf. Syst. 96, 1513–1524 (2013). http://wittawat.com/pages/l1lsmi.html

  86. Roffo, G.: Feature selection library (MATLAB Toolbox). ArXiv Preprint ArXiv:1607.01327 (2016). https://www.mathworks.com/matlabcentral/fileexchange/56937-feature-selection-library

  87. Maaten, L.: Matlab toolbox for dimensionality reduction (2007). https://lvdmaaten.github.io/drtoolbox

  88. Salakhutdinov, R., Hinton, G.: Reducing the dimensionality of data with neural networks. Science 313, 504–507 (2006). http://www.cs.toronto.edu/~hinton/MatlabForSciencePaper.html

  89. Maaten, L.: Learning a parametric embedding by preserving local structure. In: Artificial Intelligence and Statistics, pp. 384–391 (2009). https://lvdmaaten.github.io/tsne/

  90. He, X., Cai, D., et al.: Neighborhood preserving embedding. In: Tenth IEEE International Conference on Computer Vision, vol. 2, pp. 1208–1213 (2005). http://www.cad.zju.edu.cn/home/dengcai/Data/DimensionReduction.html

  91. Cai, D., He, X., Zhou, K., Han, J., Bao, H.: Locality sensitive discriminant analysis. In: International Joint Conference on Artificial Intelligence (2007). http://www.cad.zju.edu.cn/home/dengcai/Data/DimensionReduction.html

  92. Cai, D., He, X., Han, J.: Semi-supervised discriminant analysis. In: Proceedings of International Conference on Computer Vision (2007). http://www.cad.zju.edu.cn/home/dengcai/Data/DimensionReduction.html

  93. He, X., Cai, D., Han, J.: Learning a maximum margin subspace for image retrieval. IEEE Trans. Knowl. Data Eng. 20 (2008). http://www.cad.zju.edu.cn/home/dengcai/Data/DimensionReduction.html

  94. Suzuki, T., Sugiyama, M.: Sufficient dimension reduction via squared-loss mutual information estimation, pp. 804–811 (2010). http://www.ms.k.u-tokyo.ac.jp/software.html#LSDR

  95. Sugiyama, M., Ide, T., et al.: Semi-supervised local fisher discriminant analysis for dimensionality reduction. Mach. Learn. 78, 35 (2010). http://www.ms.k.u-tokyo.ac.jp/software.html#SELF

  96. Sugiyama, M.: Local fisher discriminant analysis for supervised dimensionality reduction. In: Proceedings of the 23rd International Conference on Machine Learning, pp. 905–912. ACM (2006). http://www.ms.k.u-tokyo.ac.jp/software.html#LFDA

  97. Li, W.: Learning to hash. https://cs.nju.edu.cn/lwj/L2H.html

  98. Jegou, H., Douze, M., Schmid, C.: Product quantization for nearest neighbor search. IEEE Trans. Pattern Anal. Mach. Intell. 33(1), 117–128 (2011). http://people.rennes.inria.fr/Herve.Jegou/projects/ann.html
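
The SVM packages referenced above (entries 25-45) share essentially the same train/score workflow. As a hedged illustration of that workflow, the sketch below uses scikit-learn, whose SVC class wraps LIBSVM [39], on the library's bundled 8x8 digit images, a small built-in stand-in for MNIST [1] or USPS [2]; the hyperparameters here are illustrative assumptions, not recommendations from the book.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Bundled 8x8 grayscale digit images: 1797 samples, 10 classes.
X, y = load_digits(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

# RBF-kernel SVM; SVC delegates training to the LIBSVM solver.
clf = SVC(kernel='rbf', C=10.0, gamma='scale').fit(Xtr, ytr)
acc = clf.score(Xte, yte)
print(f"test accuracy: {acc:.3f}")
```

The same fit/score pattern carries over, with syntax changes, to LIBLINEAR [25], ThunderSVM [40], and the MATLAB toolboxes listed above.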


Author information

Correspondence to Tayyaba Azim.


Copyright information

© 2018 The Author(s), under exclusive licence to Springer Nature Switzerland AG

About this chapter


Cite this chapter

Azim, T., Ahmed, S. (2018). Open Source Knowledge Base for Machine Learning Practitioners. In: Composing Fisher Kernels from Deep Neural Models. SpringerBriefs in Computer Science. Springer, Cham. https://doi.org/10.1007/978-3-319-98524-4_5


  • DOI: https://doi.org/10.1007/978-3-319-98524-4_5


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-98523-7

  • Online ISBN: 978-3-319-98524-4

  • eBook Packages: Computer Science (R0)
