AEKOC+: Kernel Ridge Regression-Based Auto-Encoder for One-Class Classification Using Privileged Information

Published in Cognitive Computation.


In recent years, non-iterative kernel learning approaches have received considerable attention from researchers, and kernel ridge regression (KRR) is one of them. Recently, a KRR-based auto-encoder was developed for the one-class classification (OCC) task and named AEKOC. OCC is generally used for outlier or novelty detection: just as the brain can detect outliers after learning from normal samples alone, OCC trains a model on normal samples only, and the trained model can then be used for outlier detection. In this paper, AEKOC is enabled to utilize privileged information, which is generally ignored by AEKOC and other traditional machine learning techniques but is usually present in human learning. For this purpose, we combine the learning using privileged information (LUPI) framework with AEKOC and propose a classifier referred to as AEKOC+. Privileged information is available only during training, not during testing; AEKOC is therefore unable to utilize it when building the model, whereas AEKOC+ handles it efficiently through the inclusion of the LUPI framework. Experiments conducted on the MNIST dataset and on various datasets from the UCI machine learning repository demonstrate the superiority of AEKOC+ over AEKOC. Our formulation shows that AEKOC does not utilize the privileged features in learning; the formulation of AEKOC+, in contrast, lets it learn from the privileged features differently from the other available features and improves the generalization performance of AEKOC. Moreover, AEKOC+ also outperforms two LUPI-framework-based one-class classifiers, OCSVM+ and SSVDD+.
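The core idea described above can be illustrated with a minimal sketch: train a kernel ridge regression auto-encoder to reconstruct the normal-only training data in closed form, then flag test points whose reconstruction error exceeds a training-set quantile. The class name, RBF kernel choice, quantile-based threshold rule, and all hyper-parameter defaults below are illustrative assumptions, not the paper's exact AEKOC/AEKOC+ formulation (in particular, this sketch does not include the LUPI term).

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.1):
    """Gaussian (RBF) kernel matrix between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class KernelAEOneClass:
    """Sketch of a KRR-based auto-encoder one-class classifier in the
    spirit of AEKOC (names and defaults are assumptions, not the paper's)."""

    def __init__(self, C=1.0, gamma=0.1, quantile=0.95):
        self.C, self.gamma, self.quantile = C, gamma, quantile

    def fit(self, X):
        self.X_ = X
        n = X.shape[0]
        K = rbf_kernel(X, X, self.gamma)
        # Closed-form KRR solution with ridge term I/C: beta = (K + I/C)^{-1} X.
        # The auto-encoder regression target is the input X itself.
        self.beta_ = np.linalg.solve(K + np.eye(n) / self.C, X)
        err = np.linalg.norm(K @ self.beta_ - X, axis=1)
        # Accept the `quantile` fraction of training points as normal.
        self.threshold_ = np.quantile(err, self.quantile)
        return self

    def predict(self, Xt):
        Kt = rbf_kernel(Xt, self.X_, self.gamma)
        err = np.linalg.norm(Kt @ self.beta_ - Xt, axis=1)
        return np.where(err <= self.threshold_, 1, -1)  # 1 = normal, -1 = outlier
```

A far-away test point has near-zero kernel similarity to every training sample, so its reconstruction collapses toward zero and its reconstruction error is large, which is what drives the outlier decision.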



  1. Some researchers [45, 46] have followed the name kernel extreme learning machine (KELM) [49,50,51], while others have used the name KRR [42, 47] instead. We do not wish to enter the debate over naming conventions: since the final solutions of KELM and KRR are identical, we follow the traditional name KRR. To avoid confusion, however, we keep both names in this paper, i.e., the KRR/KELM-based auto-encoder for OCC is referred to as AEKOC/AAKELM.




  1. Wang S, Chen S, Chen T, Shi X. Learning with privileged information for multi-label classification. Pattern Recogn 2018;81:60–70.

  2. Chevalier M, Thome N, Hénaff G, Cord M. Classifying low-resolution images by integrating privileged information in deep CNNs. Pattern Recogn Lett 2018;116:29–35.

  3. Lambert J, Sener O, Savarese S. Deep learning under privileged information using heteroscedastic dropout. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; 2018. p. 8886–8895.

  4. Burnaev E, Smolyakov D. One-class SVM with privileged information and its application to malware detection. IEEE International Conference on Data Mining Workshops, ICDM workshops; 2016. p. 12–15.

  5. Motiian S, Piccirilli M, Adjeroh DA, Doretto G. Information bottleneck learning using privileged information for visual recognition. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; 2016. p. 1496–1505.

  6. Zhang W. Support vector data description using privileged information. Electron Lett 2015;51(14):1075–1076.

  7. Vapnik V, Izmailov R. Learning using privileged information: similarity control and knowledge transfer. J Mach Learn Res 2015;16:2023–2049.

  8. Lapin M, Hein M, Schiele B. Learning using privileged information: SVM+ and weighted SVM. Neural Netw 2014;53:95–108.

  9. Zhu W, Zhong P. A new one-class SVM based on hidden information. Knowl-Based Syst 2014;60:35–43.

  10. Feyereisl J, Aickelin U. Privileged information for data clustering. Inf Sci 2012;194:4–23.

  11. Vapnik V, Vashist A. A new learning paradigm: learning using privileged information. Neural Netw 2009; 22(5-6):544–557.

  12. Xu X, Li W, Xu D. Distance metric learning using privileged information for face verification and person re-identification. IEEE Trans Neural Netw Learn Sys 2015;26(12):3150–3162.

  13. Li W, Niu L, Xu D. Exploiting privileged information from web data for image categorization. European Conference on Computer Vision. Springer; 2014. p. 437–452.

  14. Niu L, Li W, Xu D. Exploiting privileged information from web data for action and event recognition. Int J Comput Vis 2016;118(2):130–150.

  15. Meng F, Qi Z, Tian Y, Niu L. Pedestrian detection based on the privileged information. Neural Comput & Applic 2018;29(12):1485–1494.

  16. Moya MM, Koch MW, Hostetler LD. 1993. One-class classifier networks for target recognition applications. Technical report, Sandia National Labs., Albuquerque, NM (United States).

  17. Khan SS, Madden MG. A survey of recent trends in one class classification. Irish Conference on Artificial Intelligence and Cognitive Science. Springer; 2009. p. 188–197.

  18. Pimentel MA, Clifton DA, Clifton L, Tarassenko L. A review of novelty detection. Signal Process 2014;99:215–249.

  19. Xu Y, Liu C. A rough margin-based one class support vector machine. Neural Comput & Applic 2013;22(6): 1077–1084.

  20. Hamidzadeh J, Moradi M. Improved one-class classification using filled function. Appl Intell. 2018:1–17.

  21. Gepperth ART, Hecht T, Gogate M. A generative learning approach to sensor fusion and change detection. Cognitive Computation 2016;8(5):806–817.

  22. Justo R, Alcaide JM, Torres MI, Walker M. Detection of sarcasm and nastiness: new resources for Spanish language. Cognitive Computation. 2018:1–17.

  23. Anbar M, Abdullah R, Al-Tamimi BN, Hussain A. A machine learning approach to detect router advertisement flooding attacks in next-generation IPv6 networks. Cognitive Computation 2018;10(2):201–214.

  24. Tax DMJ. One-class classification; concept-learning in the absence of counter-examples. ASCI dissertation series. 2001:65.

  25. Janakiraman VM, Nielsen D. Anomaly detection in aviation data using extreme learning machines. 2016 International Joint Conference on Neural Networks (IJCNN). IEEE; 2016 . p. 1993–2000.

  26. Yan W. One-class extreme learning machines for gas turbine combustor anomaly detection. 2016 International Joint Conference on Neural Networks (IJCNN). IEEE; 2016. p. 2909–2914.

  27. Gautam C, Tiwari A. Localized multiple kernel support vector data description. 2018 IEEE International Conference on Data Mining Workshops (ICDMW). IEEE; 2018. p. 1514–1521.

  28. Gautam C, Balaji R, Sudharsan K, Tiwari A, Ahuja K. Localized multiple kernel learning for anomaly detection: one-class classification. Knowl-Based Syst 2019;165:241–252.

  29. Cai W, Zheng J, Pan W, Lin J, Li L, Chen L, Peng X, Ming Z. Neighborhood-enhanced transfer learning for one-class collaborative filtering. Neurocomputing 2019;341:80–87.

  30. Zhou W, Li J, Zhou Y, Memon MH. Bayesian pairwise learning to rank via one-class collaborative filtering. Neurocomputing 2019;367:176–187.

  31. Krawczyk B, Triguero I, García S, Woźniak M, Herrera F. Instance reduction for one-class classification. Knowl Inf Syst 2019;59(3):601–628.

  32. Manevitz LM, Yousef M. One-class SVMs for document classification. J Mach Learn Res 2001;2(Dec): 139–154.

  33. Yin S, Zhu X, Jing C. Fault detection based on a robust one class support vector machine. Neurocomputing 2014;145:263–268.

  34. Mourão-Miranda J, Hardoon DR, Hahn T, Marquand AF, Williams SCR, Shawe-Taylor J, Brammer M. Patient classification as an outlier detection problem: an application of the one- class support vector machine. Neuroimage 2011;58(3):793–804.

  35. Mygdalis V, Iosifidis A, Tefas A, Pitas I. One class classification applied in facial image analysis. 2016 IEEE International Conference on Image Processing (ICIP). IEEE; 2016. p. 1644–1648.

  36. Kozerawski J, Turk M. Clear: Cumulative learning for one-shot one-class image recognition. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; 2018. p. 3446–3455.

  37. Tax DMJ, Duin RPW. Support vector data description. Mach Learn 2004;54(1):45–66.

  38. Schölkopf B, Williamson RC, Smola AJ, Shawe-Taylor J, Platt JC. Support vector method for novelty detection. NIPS; 1999. p. 582–588.

  39. Tax DMJ, Duin RPW. Support vector domain description. Pattern Recogn Lett 1999;20(11):1191–1199.

  40. Saunders C, Gammerman A, Vovk V. Ridge regression learning algorithm in dual variables. Proceedings of the Fifteenth International Conference on Machine Learning, ICML ’98. San Francisco: Morgan Kaufmann Publishers Inc; 1998. p. 515–521.

  41. Wornyo DK, Shen X-J, Dong Y, Wang L, Huang S-C. Co-regularized kernel ensemble regression. World Wide Web. 2018:1–18.

  42. Zhang L, Suganthan PN. Benchmarking ensemble classifiers with novel co-trained kernel ridge regression and random vector functional link ensembles [research frontier]. IEEE Comput Intell Mag 2017;12(4):61–72.

  43. He J, Ding L, Jiang L, Ma L. Kernel ridge regression classification. 2014 International Joint Conference on Neural Networks (IJCNN). IEEE; 2014. p. 2263–2267.

  44. Wu P-Y, Fang C-C, Chang JM, Kung S-Y. Cost-effective kernel ridge regression implementation for keystroke-based active authentication system. IEEE Transactions on Cybernetics 2017;47(11):3916–3927.

  45. Leng Q, Qi H, Miao J, Zhu W, Su G. One-class classification with extreme learning machine. Math Probl Eng. 2014:1–11.

  46. Gautam C, Tiwari A, Leng Q. On the construction of extreme learning machine for online and offline one-class classification—an expanded toolbox. Neurocomputing 2017;261:126–143.

  47. Gautam C, Tiwari A, Iosifidis A. Minimum variance-embedded multi-layer kernel ridge regression for one-class classification. 2018 IEEE Symposium Series on Computational Intelligence (SSCI). IEEE; 2018. p. 389–396.

  48. Gautam C, Tiwari A, Suresh S, Ahuja K. Adaptive online learning with regularized kernel for one-class classification. IEEE Transactions on Systems, Man, and Cybernetics: Systems. 2019:1–16.

  49. Huang G-B, Zhu Q-Y, Siew C-K. Extreme learning machine: theory and applications. Neurocomputing 2006;70(1-3):489–501.

  50. Huang G-B, Zhou H, Ding X, Zhang R. Extreme learning machine for regression and multiclass classification. IEEE Transactions on Systems, Man, and Cybernetics Part B (Cybernetics) 2011;42(2):513–529.

  51. Huang G-B. An insight into extreme learning machines: random neurons, random features and kernels. Cognitive Computation 2014;6(3):376–390.

  52. Argyriou A, Micchelli CA, Pontil M. When is there a representer theorem? Vector versus matrix regularizers. J Mach Learn Res 2009;10(Nov):2507–2529.

  53. TU Delft one-class dataset repository. Last accessed 16 July 2019.

  54. Zhu M. 2004. Recall, precision and average precision. Department of Statistics and Actuarial Science, University of Waterloo, Waterloo, 2:30.

  55. Manning CD, Raghavan P, Schütze H. 2008. Introduction to information retrieval, chapter 13.

  56. Li W, Dai D, Tan M, Xu D, Gool LV. Fast algorithms for linear and kernel SVM+. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; 2016. p. 2258–2266.

  57. Demšar J. Statistical comparisons of classifiers over multiple data sets. J Mach Learn Res 2006;7(Jan):1–30.

  58. Ieracitano C, Mammone N, Bramanti A, Hussain A, Morabito FC. A convolutional neural network approach for classification of dementia stages based on 2D-spectral representation of EEG recordings. Neurocomputing 2019;323:96–107.

  59. Gao F, Huang T, Sun J, Wang J, Hussain A, Yang E. A new algorithm for SAR image target recognition based on an improved deep convolutional neural network. Cognitive Computation. 2018: 1–16.

  60. Ma Y, Peng H, Khan T, Cambria E, Hussain A. Sentic LSTM: a hybrid network for targeted aspect-based sentiment analysis. Cognitive Computation 2018;10(4):639–650.

Acknowledgments


This research is supported by the Department of Electronics and Information Technology (DeitY, Govt. of India) under the Visvesvaraya PhD Scheme for Electronics & IT. This work is also supported by a Science and Engineering Research Board (SERB) funded research project, Government of India, under the Early Career Research Award Scheme (Grant No. ECR/2017/000053), and by the Council of Scientific & Industrial Research (CSIR), New Delhi, India, under the Extra Mural Research (EMR) Scheme (Grant No. 22(0751)/17/EMR-II). We gratefully acknowledge the Indian Institute of Technology Indore for providing facilities and support.

Author information


Corresponding author

Correspondence to Chandan Gautam.

Ethics declarations

Conflict of Interest

The authors declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Gautam, C., Tiwari, A. & Tanveer, M. AEKOC+: Kernel Ridge Regression-Based Auto-Encoder for One-Class Classification Using Privileged Information. Cogn Comput 12, 412–425 (2020).
