Postsupervised Hard c-Means Classifier

  • Hidetomo Ichihashi
  • Katsuhiro Honda
  • Akira Notsu
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4259)

Abstract

Miyamoto et al. derived hard clustering algorithms by defuzzifying a generalized entropy-based fuzzy c-means in which covariance matrices are introduced as decision variables. We apply this hard c-means (HCM) clustering algorithm in a postsupervised classifier and improve the resubstitution error rate by choosing the best clustering result from among the local minima of the objective function. Because the classifier is prototype based, its error rate can easily be improved by increasing the number of clusters, at the cost of computer memory and CPU time. With the HCM classifier, however, both the resubstitution error rate and the data set compression ratio are improved on several benchmark data sets using only a small number of clusters per class.
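To make the approach concrete, the following is a minimal sketch (in Python with NumPy, not the authors' implementation) of a postsupervised prototype classifier in this spirit: each class is clustered separately by a hard c-means that carries per-cluster covariance matrices (Mahalanobis distances), several random restarts stand in for the different local minima, and the restart whose prototypes give the lowest resubstitution error on the training data is kept. The plain covariance re-estimation used here is an assumption standing in for the defuzzified entropy-based update of Miyamoto et al.; all function and parameter names are illustrative.

```python
import numpy as np


def hard_c_means_mahalanobis(X, c, n_iter=50, reg=1e-3, seed=None):
    """Hard c-means in which each cluster carries its own covariance matrix,
    so samples are assigned by Mahalanobis rather than Euclidean distance."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    centers = X[rng.choice(n, size=c, replace=False)].copy()
    covs = np.stack([np.eye(d) for _ in range(c)])
    labels = np.full(n, -1)
    for _ in range(n_iter):
        # Assignment step: nearest center under each cluster's Mahalanobis metric.
        dists = np.empty((n, c))
        for k in range(c):
            diff = X - centers[k]
            inv = np.linalg.inv(covs[k])
            dists[:, k] = np.einsum('ij,jk,ik->i', diff, inv, diff)
        new_labels = dists.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break  # converged to a local minimum
        labels = new_labels
        # Update step: recompute the center and covariance of each hard cluster.
        for k in range(c):
            pts = X[labels == k]
            if len(pts) == 0:
                continue  # leave an empty cluster untouched
            centers[k] = pts.mean(axis=0)
            if len(pts) > 1:
                # Regularized so the matrix stays invertible (illustrative choice).
                covs[k] = np.cov(pts, rowvar=False) + reg * np.eye(d)
    return centers, covs


def predict(X, prototypes):
    """Classify each sample by the class label of its nearest prototype."""
    dists, labels = [], []
    for centers, covs, label in prototypes:
        for mu, cov in zip(centers, covs):
            diff = X - mu
            dists.append(np.einsum('ij,jk,ik->i', diff, np.linalg.inv(cov), diff))
            labels.append(label)
    return np.asarray(labels)[np.stack(dists, axis=1).argmin(axis=1)]


def fit_postsupervised(X, y, c=2, restarts=10, seed=0):
    """Cluster each class separately; keep the restart (local minimum) whose
    prototypes give the lowest resubstitution error on the training data."""
    best_err, best_prototypes = np.inf, None
    for r in range(restarts):
        prototypes = []
        for label in np.unique(y):
            centers, covs = hard_c_means_mahalanobis(X[y == label], c, seed=seed + r)
            prototypes.append((centers, covs, label))
        err = np.mean(predict(X, prototypes) != y)  # resubstitution error rate
        if err < best_err:
            best_err, best_prototypes = err, prototypes
    return best_prototypes
```

A hypothetical use, assuming NumPy arrays `X_train`, `y_train`, `X_test`: `prototypes = fit_postsupervised(X_train, y_train, c=3)` followed by `y_pred = predict(X_test, prototypes)`. Increasing `c` typically lowers the resubstitution error at the cost of more stored prototypes, mirroring the trade-off noted in the abstract.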

Keywords

Cluster Center; Mahalanobis Distance; Learning Vector Quantization; Hard Clustering; Iteratively Reweighted Least Squares

References

  1. Bezdek, J.C.: Pattern Recognition with Fuzzy Objective Function Algorithms. Plenum Press (1981)
  2. Duda, R.O., Hart, P.E.: Pattern Classification and Scene Analysis. Wiley, New York (1973)
  3. Gustafson, E.E., Kessel, W.C.: Fuzzy clustering with a fuzzy covariance matrix. In: IEEE CDC, San Diego, California, pp. 761–766 (1979)
  4. Holland, P.W., Welsch, R.E.: Robust Regression Using Iteratively Reweighted Least-Squares. Communications in Statistics A6(9), 813–827 (1977)
  5. Huber, P.J.: Robust Statistics, 1st edn. Wiley, New York (1981)
  6. Ichihashi, H., Miyagishi, K., Honda, K.: Fuzzy c-Means Clustering with Regularization by K-L Information. In: Proc. of 10th IEEE International Conference on Fuzzy Systems, Melbourne, Australia, vol. 3, pp. 924–927 (2001)
  7. Ichihashi, H., Honda, K.: Fuzzy c-Means Classifier for Incomplete Data Sets with Outliers and Missing Values. In: Proc. of the International Conference on Computational Intelligence for Modelling, Control and Automation, Vienna, Austria, pp. 457–564 (2005)
  8. Ichihashi, H., Honda, K., Hattori, T.: Regularized Discriminant in the Setting of Fuzzy c-Means Classifier. In: Proc. of the IEEE World Congress on Computational Intelligence, Vancouver, Canada (2006)
  9. Ichihashi, H., Honda, K., Matsuura, F.: ROC Analysis of FCM Classifier with Cauchy Weight. In: Proc. of the 3rd International Conference on Soft Computing and Intelligent Systems, Tokyo, Japan (2006)
  10. Krishnapuram, R., Keller, J.: A Possibilistic Approach to Clustering. IEEE Transactions on Fuzzy Systems 1, 98–110 (1993)
  11. Liu, Z.Q., Miyamoto, S. (eds.): Soft Computing and Human-Centered Machines. Springer, Heidelberg (2000)
  12. Miyamoto, S., Yasukochi, T., Inokuchi, R.: A Family of Fuzzy and Defuzzified c-Means Algorithms. In: Proc. of the International Conference on Computational Intelligence for Modelling, Control and Automation, Vienna, Austria, pp. 170–176 (2005)
  13. Miyamoto, S., Umayahara, K.: Fuzzy Clustering by Quadratic Regularization. In: Proc. of FUZZ-IEEE 1998, Anchorage, Alaska, pp. 1394–1399 (1998)
  14. Miyamoto, S., Suizu, D., Takata, O.: Methods of Fuzzy c-Means and Possibilistic Clustering Using a Quadratic Term. Scientiae Mathematicae Japonicae 60(2), 217–233 (2004)
  15. Press, W.H., Teukolsky, S.A., Vetterling, W.T., Flannery, B.P.: Numerical Recipes in C: The Art of Scientific Computing, 2nd edn. Cambridge University Press, Cambridge (1999)
  16. Rose, K.: Deterministic Annealing for Clustering, Compression, Classification, Regression, and Related Optimization Problems. Proc. of the IEEE 86(11), 2210–2239 (1998)
  17. Tipping, M.E., Bishop, C.M.: Mixtures of Probabilistic Principal Component Analysers. Neural Computation 11, 443–482 (1999)
  18. Veenman, C.J., Reinders, M.J.T.: The Nearest Sub-class Classifier: A Compromise Between the Nearest Mean and Nearest Neighbor Classifier. IEEE Transactions on PAMI 27(9), 1417–1429 (2005)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Hidetomo Ichihashi (1)
  • Katsuhiro Honda (1)
  • Akira Notsu (1)
  1. Graduate School of Engineering, Osaka Prefecture University, Sakai, Osaka, Japan
