Feature Subset Selection Using an Optimized Hill Climbing Algorithm for Handwritten Character Recognition

  • Carlos M. Nunes
  • Alceu de S. Britto Jr.
  • Celso A. A. Kaestner
  • Robert Sabourin
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3138)

Abstract

This paper presents an optimized Hill Climbing algorithm for selecting a subset of features for handwritten character recognition. The search is guided by a random mutation strategy and by the initial relevance of each feature in the recognition process. The experiments show a reduction in the number of features used by an MLP-based character recognizer from 132 to 77 (a reduction of 42%) without a significant loss in recognition rate: 99% on 60,089 handwritten digit samples and 93% on 11,941 handwritten uppercase characters, both from the NIST SD19 database. The proposed method proves to be an interesting strategy for implementing a wrapper approach without the need for complex and expensive hardware architectures.
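
The paper's implementation details are not reproduced on this page, so the sketch below only illustrates the general idea the abstract describes: a wrapper-style hill climb over a binary feature mask, where candidate subsets are produced by random bit-flip mutation and kept only if the wrapped classifier's recognition rate does not degrade. The function and parameter names (`evaluate`, `n_iterations`, `mutation_rate`) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def hill_climb_feature_selection(evaluate, n_features, n_iterations=200,
                                 mutation_rate=0.05, seed=0):
    """Wrapper-style hill climbing over a binary feature mask.

    `evaluate(mask)` must return the recognition rate obtained when the
    classifier is trained/validated using only the features where mask is True.
    """
    rng = np.random.default_rng(seed)
    current = np.ones(n_features, dtype=bool)   # start from the full feature set
    best_score = evaluate(current)

    for _ in range(n_iterations):
        # Random mutation: flip each bit with small probability so the
        # candidate differs from the current subset in only a few features.
        flips = rng.random(n_features) < mutation_rate
        candidate = current ^ flips
        if not candidate.any():                 # never evaluate an empty subset
            continue
        score = evaluate(candidate)
        # Greedy acceptance: keep the candidate if the recognition rate
        # is at least as good, which allows the subset to shrink over time.
        if score >= best_score:
            current, best_score = candidate, score

    return current, best_score

# Illustrative use with a scikit-learn MLP as the wrapped recognizer
# (hypothetical dataset X, y; not the paper's NIST SD19 setup):
# from sklearn.neural_network import MLPClassifier
# from sklearn.model_selection import cross_val_score
# def evaluate(mask):
#     clf = MLPClassifier(hidden_layer_sizes=(80,), max_iter=300)
#     return cross_val_score(clf, X[:, mask], y, cv=3).mean()
# mask, score = hill_climb_feature_selection(evaluate, n_features=132)
```

A tie-accepting acceptance rule (keep candidates whose score is equal to the best so far) is one simple way to let the search trade features away without hurting accuracy; the paper additionally biases the search with each feature's initial relevance, which this sketch omits.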

Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Carlos M. Nunes 1
  • Alceu de S. Britto Jr. 1, 2
  • Celso A. A. Kaestner 1
  • Robert Sabourin 3
  1. Pontifícia Universidade Católica do Paraná (PUC-PR), Curitiba, Brazil
  2. Universidade Estadual de Ponta Grossa (UEPG), Ponta Grossa, Brazil
  3. École de Technologie Supérieure (ETS), Montreal, Canada
