Classifier-Independent Visualization of Supervised Data Structures Using a Graph

  • Hiroshi Tenmoto
  • Yasukuni Mori
  • Mineichi Kudo
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3138)

Abstract

Supervised data structures in high-dimensional feature spaces are displayed as graphs. The structure is first analyzed by fitting normal mixture distributions to the data. The nodes of the graph correspond to the mean vectors of the mixture components, and their two-dimensional locations are determined by Sammon's nonlinear mapping. The thickness of each edge expresses the separability between the corresponding pair of component distributions, measured by the Kullback-Leibler divergence. Experimental results confirmed that the proposed method can illustrate in which regions, and to what extent, it is difficult to classify samples correctly. Such visual information can be utilized to improve the feature set.
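
For concreteness, the pipeline described above can be sketched in Python. This is a minimal illustration under stated assumptions, not the authors' implementation: scikit-learn's GaussianMixture stands in for the mixture estimation (the number of components per class, k, is fixed here rather than selected from the data), metric MDS stands in for Sammon's nonlinear mapping, and the divergence-to-thickness conversion is an assumed convention (low divergence, i.e. poor separability, drawn as a thick edge).

    import numpy as np
    from sklearn.mixture import GaussianMixture
    from sklearn.manifold import MDS  # stand-in for Sammon's nonlinear mapping

    def gauss_kl(m0, S0, m1, S1):
        """Closed-form KL(N(m0,S0) || N(m1,S1)) between multivariate Gaussians."""
        d = m0.shape[0]
        S1_inv = np.linalg.inv(S1)
        diff = m1 - m0
        return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - d
                      + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

    def class_structure_graph(X, y, k=2):
        """Fit a k-component normal mixture per class; return 2-D node
        positions, pairwise edge thicknesses, and each node's class label."""
        means, covs, labels = [], [], []
        for c in np.unique(y):
            gm = GaussianMixture(n_components=k, covariance_type="full",
                                 random_state=0).fit(X[y == c])
            means.extend(gm.means_)
            covs.extend(gm.covariances_)
            labels += [c] * k
        means = np.asarray(means)
        n = len(means)
        # Symmetrized KL divergence between every pair of mixture components.
        div = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                div[i, j] = div[j, i] = 0.5 * (
                    gauss_kl(means[i], covs[i], means[j], covs[j]) +
                    gauss_kl(means[j], covs[j], means[i], covs[i]))
        # Project the component means to 2-D (metric MDS here; Sammon's
        # mapping in the paper).
        pos = MDS(n_components=2, random_state=0).fit_transform(means)
        # Assumed convention: overlapping components get thick edges.
        thickness = 1.0 / (1.0 + div)
        return pos, thickness, labels

Plotting the nodes at pos, colored by labels, with edges scaled by thickness then yields the kind of graph the paper describes: thick edges flag regions where class components overlap and classification is hard.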

Keywords

Feature Selection Method, High-Dimensional Feature Space, Confusion Matrix, Component Distribution, Projection Pursuit

References

  1. Sammon, J.W.: A Nonlinear Mapping for Data Structure Analysis. IEEE Transactions on Computers 18, 401–409 (1969)
  2. Friedman, J.H., Tukey, J.W.: A Projection Pursuit Algorithm for Exploratory Data Analysis. IEEE Transactions on Computers 23, 881–889 (1974)
  3. Aladjem, M.: Multiclass Discriminant Mappings. Signal Processing 35, 1–18 (1994)
  4. Mao, J., Jain, A.K.: Artificial Neural Networks for Feature Extraction and Multivariate Data Projection. IEEE Transactions on Neural Networks 6, 296–317 (1995)
  5. Jain, A.K., Duin, R.P.W., Mao, J.: Statistical Pattern Recognition: A Review. IEEE Transactions on Pattern Analysis and Machine Intelligence 22, 4–37 (2000)
  6. Mori, Y., Kudo, M., Toyama, J., Shimbo, M.: Comparison of Low-Dimensional Mapping Techniques Based on Discriminatory Information. In: Proceedings of the Second International ICSC Symposium on Advances in Intelligent Data Analysis, CD-ROM Paper #1724-166 (2001)
  7. Tenmoto, H., Mori, Y., Kudo, M.: Visualization of Class Structures Using Piecewise Linear Classifiers. In: Proceedings of Logic Applied to Technology (LAPTEC), 104–111 (2002)
  8. Taylor, P.C., Hand, D.J.: Finding 'Superclassifications' with an Acceptable Misclassification Rate. Journal of Applied Statistics 26, 579–590 (1999)
  9. Kudo, M., Torii, Y., Mori, Y., Shimbo, M.: Approximation of Class Regions by Quasi Convex Hulls. Pattern Recognition Letters 19, 777–786 (1998)
  10. Dempster, A.P., Laird, N.M., Rubin, D.B.: Maximum Likelihood from Incomplete Data via the EM Algorithm. Journal of the Royal Statistical Society, Series B 39, 1–38 (1977)
  11. Rissanen, J.: A Universal Prior for Integers and Estimation by Minimum Description Length. Annals of Statistics 11, 416–431 (1983)
  12. Park, Y., Sklansky, J.: Automated Design of Multiple-Class Piecewise Linear Classifiers. Journal of Classification 6, 195–222 (1989)
  13. Ueda, N.: EM Algorithm with Split and Merge Operations for Mixture Models (Invited). Transactions of IEICE E83-D, 2047–2785 (2000)

Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Hiroshi Tenmoto¹
  • Yasukuni Mori²
  • Mineichi Kudo³
  1. Kushiro National College of Technology, Kushiro, Hokkaido, Japan
  2. Chiba University, Chiba, Japan
  3. Hokkaido University, Sapporo, Japan