
Distance Function Learning in Error-Correcting Output Coding Framework

  • Dijun Luo
  • Rong Xiong
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4233)

Abstract

This paper presents a novel framework of error-correcting output coding (ECOC) that addresses the problem of multi-class classification. By weighting the output space of each independently trained base classifier, the distance function used for decoding is adapted so that samples become more discriminative. A criterion defined over Extended Pair Samples (EPS) is proposed to train the weights of the output space. The key properties of ECOC still hold in the new framework: any base classifier, as well as any distance function, remains applicable. We first conduct empirical studies on UCI datasets to verify the presented framework with four frequently used coding matrices, and then apply it in the RoboCup domain to enhance the performance of agent control. Experimental results show that our supervised learned decoding scheme significantly improves classification accuracy and, after learning from experience, improves the agents' ball control in a soccer game.
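
To make the decoding idea in the abstract concrete, below is a minimal Python sketch of distance-based ECOC decoding with per-dimension weights. It assumes a coding matrix with ±1 entries and real-valued base-classifier outputs; the function name weighted_ecoc_decode, the weight vector w, and the weighted L1 distance are illustrative assumptions, not the paper's EPS training procedure, which learns the weights from labeled data.

    import numpy as np

    # Hedged sketch of weighted ECOC decoding (not the authors' exact algorithm).
    # Assumptions: coding_matrix has shape (n_classes, n_classifiers) with +/-1
    # entries, outputs holds the real-valued predictions of the base classifiers,
    # and weights is a learned per-dimension weight vector.

    def weighted_ecoc_decode(outputs, coding_matrix, weights):
        """Return the class whose codeword is closest to the outputs
        under a weighted L1 (Hamming-style) distance."""
        dists = np.abs(coding_matrix - outputs[None, :]) @ weights
        return int(np.argmin(dists))

    # Toy usage: 4 classes, 6 binary classifiers.
    M = np.sign(np.random.randn(4, 6))      # random +/-1 coding matrix
    f_x = np.tanh(np.random.randn(6))       # stand-in for base-classifier margins
    w = np.ones(6) / 6.0                    # uniform weights = plain distance decoding
    print(weighted_ecoc_decode(f_x, M, w))

With uniform weights this reduces to standard distance decoding; the framework described in the abstract instead learns non-uniform weights so that the codewords of different classes are better separated.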

Keywords

Distance Function · Output Code · Soccer Game · Robot Soccer · Optimal Decode



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Dijun Luo ¹
  • Rong Xiong ¹
  1. National Lab of Industrial Control Technology, Zhejiang University, China
