Least Squares and Estimation Measures via Error Correcting Output Code
It is known that the Error Correcting Output Code (ECOC) technique can improve generalisation for problems involving more than two classes. ECOC classifies a pattern by computing the distance between the vector of base-classifier outputs and each class's code word. However, in some applications other kinds of information, such as individual class probabilities, can be useful. Least Squares (LS) is an alternative combination strategy to the standard distance-based measure used in ECOC, but the effect of code specifications, such as the size of the code or the distance between labels, has not been investigated in the LS-ECOC framework. In this paper we consider constraints on the choice of code matrix and express the relationship between the final variance and the local variance. Experiments on artificial and real data demonstrate that classification performance with LS can be comparable to that of the original distance-based approach.
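The two decoding strategies contrasted above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the 4-class, 5-classifier code matrix and the noisy output vector are hypothetical, chosen only so that the matrix has full row rank (so the least-squares system is well posed).

```python
import numpy as np

# Hypothetical code matrix: rows are class code words, columns are base classifiers.
C = np.array([
    [1, 1, 1, 1, 1],
    [0, 0, 0, 1, 1],
    [1, 0, 1, 0, 1],
    [0, 1, 1, 1, 0],
], dtype=float)

def decode_distance(y, C):
    """Standard ECOC: pick the class whose code word is nearest (L1) to y."""
    return int(np.argmin(np.abs(C - y).sum(axis=1)))

def decode_least_squares(y, C):
    """LS-ECOC sketch: solve C^T p = y for per-class scores p, pick the largest."""
    p, *_ = np.linalg.lstsq(C.T, y, rcond=None)
    return int(np.argmax(p))

# Noisy base-classifier outputs lying near class 2's code word [1, 0, 1, 0, 1].
y = np.array([0.9, 0.1, 0.8, 0.2, 0.9])

print(decode_distance(y, C))       # class index chosen by distance decoding
print(decode_least_squares(y, C))  # class index chosen by LS decoding
```

Unlike the distance decoder, which returns only a winning label, the least-squares decoder yields a score vector `p` that can serve as the per-class estimate the abstract alludes to.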
Keywords: Hidden Node · Test Pattern · Code Word · Target Class · Random Code