Least Squares and Estimation Measures via Error Correcting Output Code

  • Reza Ghaderi
  • Terry Windeatt
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2096)


It is known that the Error Correcting Output Code (ECOC) technique can improve generalisation for problems involving more than two classes. ECOC classifies a pattern by calculating its distance to each class label (codeword). However, in some applications other kinds of information, such as individual class probabilities, can be useful. Least Squares (LS) is an alternative combination strategy to the standard distance-based measure used in ECOC, but the effect of code specifications, such as code size or the distance between labels, has not been investigated in the LS-ECOC framework. In this paper we consider constraints on the choice of code matrix and express the relationship between final variance and local variance. Experiments on artificial and real data demonstrate that classification performance with LS can be comparable to that of the original distance-based approach.
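The two decoding strategies contrasted in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the code matrix, classifier outputs, and function names are invented for the example, and the base classifiers are replaced by a fixed vector of soft outputs.

```python
import numpy as np

# Hypothetical 3-class problem with a 5-bit code matrix.
# Each row is the codeword (class label) for one class; each column
# defines the binary target for one base classifier.
C = np.array([
    [1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0],
    [0, 0, 0, 1, 1],
], dtype=float)

def decode_distance(outputs, code):
    """Standard ECOC decoding: assign the pattern to the class whose
    codeword is closest (L1 distance) to the base-classifier outputs."""
    d = np.abs(code - outputs).sum(axis=1)
    return int(np.argmin(d))

def decode_least_squares(outputs, code):
    """LS-ECOC decoding: solve code.T @ p ~= outputs for class scores p,
    then assign the class with the largest score. Unlike the distance
    rule, p can be interpreted as (unnormalised) per-class evidence."""
    p, *_ = np.linalg.lstsq(code.T, outputs, rcond=None)
    return int(np.argmax(p))

# Noisy soft outputs lying near the first codeword [1, 0, 1, 0, 1].
outputs = np.array([0.9, 0.1, 0.8, 0.2, 0.7])
print(decode_distance(outputs, C), decode_least_squares(outputs, C))
```

Both rules agree here; the point of LS-ECOC is that the least-squares solution yields per-class scores usable as probability-like estimates, whereas distance decoding returns only a label.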


Keywords: Hidden Node, Test Pattern, Code Word, Target Class, Random Code




References

  1. E. Alpaydin and E. Mayoraz. Learning error-correcting output codes from data. In Proceedings of ICANN'99, Edinburgh, U.K., September 1999.
  2. A. Berger. Error-correcting output coding for text classification. In Proceedings of IJCAI'99, Stockholm, Sweden, 1999.
  3. C.L. Blake and C.J. Merz. UCI repository of machine learning databases. University of California, Irvine, Dept. of Information and Computer Sciences, 1998.
  4. T.G. Dietterich and G. Bakiri. Error-correcting output codes: A general method for improving multiclass inductive learning programs. In Proceedings of the Ninth National Conference on Artificial Intelligence (AAAI-91), pages 572–577. AAAI Press, 1991.
  5. T.G. Dietterich and G. Bakiri. Solving multi-class learning problems via error-correcting output codes. Journal of Artificial Intelligence Research, 2:263–286, 1995.
  6. R. Ghaderi and T. Windeatt. Circular ECOC: a theoretical and experimental analysis. In International Conference on Pattern Recognition (ICPR2000), pages 203–206, Barcelona, Spain, September 2000.
  7. R. Ghaderi and T. Windeatt. Viewpoints of error correcting output coding in classification task. In The 7th Electrical and Electronic Engineering Seminar of Iranian Students in Europe, Manchester, U.K., May 2000.
  8. G. James. Majority Vote Classifiers: Theory and Applications. PhD thesis, Dept. of Statistics, Stanford University, May 1998.
  9. G. James and T. Hastie. The error coding method and PICTs. Journal of Computational and Graphical Statistics, 7:377–387, 1998.
  10. E.B. Kong and T.G. Dietterich. Error-correcting output coding corrects bias and variance. In 12th International Conference on Machine Learning, pages 313–321, San Francisco, 1995. Morgan Kaufmann.
  11. E.B. Kong and T.G. Dietterich. Probability estimation via error-correcting output coding. In International Conference on Artificial Intelligence and Soft Computing, Banff, Canada, 1997.
  12. F. Leisch and K. Hornik. Combining neural networks voting classifiers and error correcting output codes. In I. Frolla and A. Plakove, editors, MEASUREMENT 97, pages 266–269, Smolenice, Slovakia, May 1997.
  13. F. Masulli and G. Valentini. Effectiveness of error correcting output codes in multiclass learning problems. In J. Kittler and F. Roli, editors, Multiple Classifier Systems, MCS2000, pages 107–116, Cagliari, Italy, 2000. Springer Lecture Notes in Computer Science.
  14. W.W. Peterson and E.J. Weldon Jr. Error-Correcting Codes. MIT Press, Cambridge, MA, 1972.
  15. R.E. Schapire. Using output codes to boost multiclass learning problems. In 14th International Conference on Machine Learning, pages 313–321. Morgan Kaufmann, 1997.
  16. T. Windeatt and R. Ghaderi. Binary codes for multi-class decision combining. In 14th Annual International Conference of the Society of Photo-Optical Instrumentation Engineers (SPIE), volume 4051, pages 23–34, Florida, USA, April 2000.
  17. T. Windeatt and R. Ghaderi. Multi-class learning and ECOC sensitivity. Electronics Letters, 36(19), September 2000.
  18. T. Windeatt and R. Ghaderi. Binary labelling and decision level fusion. Information Fusion, to be published, 2001.

Copyright information

© Springer-Verlag Berlin Heidelberg 2001

Authors and Affiliations

  • Reza Ghaderi (1)
  • Terry Windeatt (1)

  1. Centre for Vision, Speech and Signal Processing (CVSSP), University of Surrey, Guildford, UK
