
Local Feature Selection by Formal Concept Analysis for Multi-class Classification

  • Madori Ikeda
  • Akihiro Yamamoto
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8643)

Abstract

In this paper, we propose a multi-class classification algorithm intended for data sets that grow frequently. The algorithm performs lazy learning based on formal concept analysis. We designed it so that it achieves locality in predicting the classes of test data and performs feature selection at the same time. From a given data set, consisting of a set of training data and a set of test data, the algorithm generates a single formal concept lattice. Every formal concept in the lattice represents a cluster of data induced by some feature selection. To classify each test datum, plausible clusters are selected and combined into a set of neighbors for that datum. Our algorithm can construct sets of neighbors for test data that are never generated by other algorithms, e.g., the \(k\)-nearest neighbor algorithm and decision tree classifiers. We compare our algorithm with these algorithms in experiments on UCI data sets and show that ours is comparable to them in terms of correctness.
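The pipeline the abstract describes can be made concrete with a small sketch: build the concept intents of a binary object-attribute context by closing the object intents under intersection, then, for a test datum, pool the extents of the most specific concepts whose intents the datum satisfies and vote over the labels of the pooled objects. This is a minimal illustration only, assuming a binary context; the function names and the "largest applicable intent" selection heuristic are assumptions made for the sketch, not the authors' exact cluster-selection criterion.

```python
from itertools import combinations
from collections import Counter

def all_intents(context):
    """Enumerate concept intents of a formal context (object -> attribute set)
    by closing the set of object intents under pairwise intersection."""
    intents = {frozenset(attrs) for attrs in context.values()}
    changed = True
    while changed:
        changed = False
        for a, b in combinations(list(intents), 2):
            c = a & b
            if c not in intents:
                intents.add(c)
                changed = True
    return intents

def classify(train, labels, test_attrs):
    """Lazy FCA-style classification sketch: among concepts whose intent is
    satisfied by the test datum, keep the most specific ones (largest intent,
    an illustrative heuristic), pool their extents as neighbors, and vote."""
    applicable = [i for i in all_intents(train) if i <= test_attrs]
    if not applicable:
        return None
    best = max(len(i) for i in applicable)
    neighbors = set()
    for intent in applicable:
        if len(intent) == best:
            neighbors |= {o for o, attrs in train.items()
                          if intent <= frozenset(attrs)}
    return Counter(labels[o] for o in neighbors).most_common(1)[0][0]

# Toy context: four training objects over three binary attributes.
train = {"o1": {"a", "b"}, "o2": {"a", "c"}, "o3": {"b", "c"}, "o4": {"a", "b", "c"}}
labels = {"o1": "x", "o2": "y", "o3": "y", "o4": "x"}
print(classify(train, labels, frozenset({"a", "b"})))  # neighbors {o1, o4} -> "x"
```

Note how the neighbor set {o1, o4} arises from the concept with intent {a, b}, a cluster that a fixed-\(k\) nearest-neighbor query or a single decision-tree partition would not necessarily produce.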

Keywords

Lazy learning · Multi-class classification · Formal concept analysis · Feature selection

References

  1. Bache, K., Lichman, M.: UCI Machine Learning Repository. University of California, School of Information and Computer Science, Irvine, CA (2013). http://archive.ics.uci.edu/ml
  2. Dasarathy, B.V.: Nearest Neighbor (NN) Norms: NN Pattern Classification Techniques. IEEE Computer Society Press, Los Alamitos (1991)
  3. Choi, V., Huang, Y.: Faster algorithms for constructing a Galois lattice, enumerating all maximal bipartite cliques and closed frequent sets. In: SIAM Conference on Discrete Mathematics (2006)
  4. Davey, B.A., Priestley, H.A.: Introduction to Lattices and Order. Cambridge University Press, Cambridge (2002)
  5. Ganter, B., Wille, R.: Formal Concept Analysis: Mathematical Foundations. Springer-Verlag New York Inc., Secaucus (1999)
  6. Kaytoue, M., Kuznetsov, S.O., Napoli, A., Duplessis, S.: Mining gene expression data with pattern structures in formal concept analysis. J. Inf. Sci. 181(10), 1989–2001 (2011)
  7. Kira, K., Rendell, L.A.: The feature selection problem: traditional methods and a new algorithm. In: Proceedings of AAAI'92, pp. 124–134 (1992)
  8. Makino, K., Uno, T.: New algorithms for enumerating all maximal cliques. In: Hagerup, T., Katajainen, J. (eds.) SWAT 2004. LNCS, vol. 3111, pp. 260–272. Springer, Heidelberg (2004)
  9. Pedregosa, F., et al.: Scikit-learn: Machine Learning in Python. JMLR 12, 2825–2830 (2011)
  10. Quinlan, J.R.: Induction of decision trees. Mach. Learn. 1(1), 81–106 (1986)
  11. Soldano, H., Ventos, V., Champesme, M., Forge, D.: Incremental construction of alpha lattices and association rules. In: Setchi, R., Jordanov, I., Howlett, R.J., Jain, L.C. (eds.) KES 2010, Part II. LNCS, vol. 6277, pp. 351–360. Springer, Heidelberg (2010)
  12. Uno, T., Kiyomi, M., Arimura, H.: LCM ver. 3: collaboration of array, bitmap and prefix tree for frequent itemset mining. In: Proceedings of the 1st International Workshop on Open Source Data Mining: Frequent Pattern Mining Implementations, pp. 77–86. ACM (2005)
  13. Valtchev, P., Missaoui, R.: Building concept (Galois) lattices from parts: generalizing the incremental methods. In: Proceedings of ICCS'01, pp. 290–303 (2001)

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  1. Graduate School of Informatics, Kyoto University, Kyoto, Japan
