
Cognitive Computation, Volume 10, Issue 6, pp 922–936

Super-Graph Classification Based on Composite Subgraph Features and Extreme Learning Machine

  • Jun Pang
  • Yuhai Zhao
  • Jia Xu
  • Yu Gu
  • Ge Yu
Article

Abstract

A multi-graph is modeled as a bag of graphs, and the mutual relationships among those graphs can be exploited to improve the accuracy of multi-graph classification. To the best of our knowledge, however, no prior work has utilized these mutual relationships. In this paper, we propose a novel super-graph model \(SG=(MG,AG)\), where MG denotes a multi-graph and AG is a graph (called the abstract-graph) that describes the mutual relationships among the graphs contained in MG. Super-graph classification is challenging because of the complex structure of the super-graph model; in particular, it is hard to directly select discriminative subgraphs, i.e., subgraph features, from super-graphs. A subgraph g of a graph G is a graph that is isomorphic to one of the substructures of G, and subgraph features are subgraphs that can be used to distinguish super-graphs with different class labels. Moreover, practical applications require a super-graph classification algorithm with high precision. We therefore introduce the concept of composite subgraph features together with an algorithm for selecting them, and on this basis propose a framework for super-graph classification. We first design a two-step approach that selects k composite subgraph features from the subgraphs of the super-graphs’ abstract-graphs and multi-graphs. Then, using the composite features and the subgraph-feature representation, each super-graph SG is transformed into a k-dimensional 0-1 vector: if SG contains a substructure isomorphic to its i-th composite feature, the i-th component of the vector is set to 1 (1 ≤ i ≤ k); otherwise, it is set to 0. On the derived k-dimensional vectors, an existing classification algorithm, such as naive Bayes or a support vector machine (SVM), can be trained to predict the class labels of unseen super-graphs. Specifically, we adapt the extreme learning machine (ELM) algorithm to further improve the accuracy of super-graph classification. In summary, we propose the super-graph model and study the super-graph classification problem, derive the concept of composite subgraph features selected by our two-step method, and build a super-graph classification framework (SGC) on the mined composite subgraph features, with ELM used to further improve classification accuracy. Extensive experiments on real-world image datasets show that our ELM-based algorithm is more accurate than the baseline algorithms.
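
The following minimal sketch (not the authors' implementation) illustrates the two steps described above: mapping a super-graph SG = (MG, AG) to a k-dimensional 0-1 vector over composite subgraph features, and classifying the resulting vectors with a basic ELM. It assumes networkx for subgraph-isomorphism tests, and it treats a composite feature as a pair of an abstract-graph subgraph and a member-graph subgraph; that pairing, and all function and class names, are illustrative assumptions, since the paper's exact composite-feature definition and its two-step selection procedure are not reproduced here.

# Sketch only: assumed composite-feature form, not the paper's definition.
import numpy as np
import networkx as nx
from networkx.algorithms import isomorphism


def contains(host, pattern):
    """Return True if `host` has a substructure isomorphic to `pattern`."""
    return isomorphism.GraphMatcher(host, pattern).subgraph_is_isomorphic()


def supergraph_to_vector(MG, AG, composite_features):
    """Map a super-graph (MG, AG) to a 0-1 vector of length k.

    MG: list of nx.Graph (the bag of graphs)
    AG: nx.Graph describing relationships among the graphs in MG
    composite_features: list of (ag_subgraph, mg_subgraph) pairs (assumed form)
    """
    vec = np.zeros(len(composite_features))
    for i, (ag_sub, mg_sub) in enumerate(composite_features):
        # Component i is 1 iff the abstract-graph part matches AG and the
        # multi-graph part matches at least one graph in the bag MG.
        if contains(AG, ag_sub) and any(contains(g, mg_sub) for g in MG):
            vec[i] = 1.0
    return vec


class SimpleELM:
    """Basic single-hidden-layer ELM: random input weights, analytic output weights."""

    def __init__(self, n_hidden=100, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        # Sigmoid activations of the random hidden layer.
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit(self, X, y):
        X, y = np.asarray(X, float), np.asarray(y)
        self.classes_ = np.unique(y)
        T = (y[:, None] == self.classes_[None, :]).astype(float)  # one-hot targets
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        # Output weights via the Moore-Penrose pseudoinverse of the hidden output.
        self.beta = np.linalg.pinv(self._hidden(X)) @ T
        return self

    def predict(self, X):
        scores = self._hidden(np.asarray(X, float)) @ self.beta
        return self.classes_[np.argmax(scores, axis=1)]

In this sketch, the vectors produced by supergraph_to_vector for a training set would be stacked into a matrix X and passed to SimpleELM().fit(X, y), mirroring the final classification step of the framework.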

Keywords

Multi-graph · Super-graph · Classification · Extreme learning machine

Notes

Funding

This work is partially supported by the National Natural Science Foundation of China (No. 61702381, No. 61772124, No. 61872070), the Hubei Natural Science Foundation (No. 2017CFB196), the Scientific Research Foundation of Wuhan University of Science and Technology (2017xz015), and the Fundamental Research Funds for the Central Universities (150402002, 171605001). Jia Xu is supported by the Key Projects of Higher Education Undergraduate Teaching Reform Project in Guangxi (No. 2017JGZ103) and the Scientific Research Foundation of Guangxi University (No. XGZ141182).

Compliance with Ethical Standards

Conflict of Interest

The authors declare that they have no conflict of interest.

Ethical Approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Informed Consent

Informed consent was obtained from all individual participants included in the study.


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. College of Computer Science and Technology, Wuhan University of Science and Technology, Wuhan, China
  2. Hubei Province Key Laboratory of Intelligent Information Processing and Real-time Industrial System, Wuhan, China
  3. Key Laboratory of Rich-media Knowledge Organization and Service of Digital Publishing Content, SAPPRFT, Beijing, China
  4. School of Computer Science and Engineering, Northeastern University, Shenyang, China
  5. School of Computer, Electronics and Information, Guangxi University, Nanning, China
