Automatic Window Design for Gray-Scale Image Processing Based on Entropy Minimization

  • David C. Martins Jr.
  • Roberto M. Cesar Jr.
  • Junior Barrera
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3773)

Abstract

This paper generalizes the technique described in [1] to gray-scale image processing applications. The method chooses a subset of variables W (i.e., pixels seen through a window) that maximizes the information observed in a set of training data by minimizing the mean conditional entropy. The task is formalized as a combinatorial optimization problem in which the search space is the powerset of the candidate variables and the measure to be minimized is the mean entropy of the estimated conditional probabilities. Since a full exploration of the search space requires an enormous computational effort, heuristics from the feature selection literature are applied. The proposed approach is mathematically sound, and experimental results on a texture recognition application show that it is also adequate for problems with gray-scale images.
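A minimal sketch of the criterion, assuming a simple data layout (not the authors' implementation): each training sample is a (pixel_vector, label) pair holding the gray levels at all candidate window positions, a candidate window W is a tuple of indices into that vector, and W is scored by the mean entropy of the estimated conditional label distributions. The plain sequential forward selection loop stands in for the heuristics from the feature selection literature (e.g., floating search [4]); all function and variable names here are hypothetical.

```python
from collections import Counter, defaultdict
from math import log2

def mean_conditional_entropy(samples, window):
    """Score a candidate window W by the empirical mean conditional
    entropy H(Y | X_W) estimated from training samples.

    samples: list of (pixel_vector, label) pairs, where pixel_vector
             holds the gray levels of all candidate window positions.
    window:  tuple of indices into pixel_vector (the subset W).
    """
    # Group output labels by the pattern observed through the window.
    patterns = defaultdict(Counter)
    for pixels, label in samples:
        key = tuple(pixels[i] for i in window)
        patterns[key][label] += 1

    total = len(samples)
    mce = 0.0
    for counts in patterns.values():
        n = sum(counts.values())
        # Entropy of the estimated conditional distribution P(Y | X_W = x).
        h = -sum((c / n) * log2(c / n) for c in counts.values())
        # Weight each pattern's entropy by its relative frequency.
        mce += (n / total) * h
    return mce

def greedy_forward_selection(samples, n_candidates, max_size):
    """Sequential forward selection: grow W one pixel at a time,
    always adding the pixel that most reduces the criterion.
    (Shown as a stand-in for the heuristics cited in the paper,
    e.g. floating search [4].)"""
    selected = []
    remaining = set(range(n_candidates))
    while remaining and len(selected) < max_size:
        best = min(remaining,
                   key=lambda i: mean_conditional_entropy(
                       samples, tuple(selected + [i])))
        selected.append(best)
        remaining.remove(best)
    return tuple(selected)
```

Because conditioning on more pixels never increases the empirical conditional entropy, larger windows always score at least as well on the training data; hence the sketch bounds the window size (max_size) rather than letting the criterion alone decide when to stop.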

Keywords

Feature Vector · Feature Selection · Mutual Information · Conditional Entropy · Feature Selection Algorithm

References

  1. Martins Jr., D.C., Cesar Jr., R.M., Barrera, J.: W-operator window design by maximization of training data information. In: Proceedings of the XVII Brazilian Symposium on Computer Graphics and Image Processing (SIBGRAPI), pp. 162–169. IEEE Computer Society Press, Los Alamitos (2004)
  2. Barrera, J., Terada, R., Hirata Jr., R., Hirata, N.S.T.: Automatic programming of morphological machines by PAC learning. Fundamenta Informaticae, 229–258 (2000)
  3. Dougherty, E.R., Barrera, J., Mozelle, G., Kim, S., Brun, M.: Multiresolution analysis for optimal binary filters. J. Math. Imaging Vis. 14(1), 53–72 (2001)
  4. Pudil, P., Novovičová, J., Kittler, J.: Floating search methods in feature selection. Pattern Recognition Letters 15, 1119–1125 (1994)
  5. Shannon, C.E.: A mathematical theory of communication. Bell System Technical Journal 27, 379–423, 623–656 (1948)
  6. Cover, T.M., Thomas, J.A.: Elements of Information Theory. Wiley Series in Telecommunications. John Wiley & Sons, New York (1991)
  7. Kullback, S.: Information Theory and Statistics. Dover (1968)
  8. Soofi, E.S.: Principal information theoretic approaches. Journal of the American Statistical Association 95, 1349–1353 (2000)
  9. Duda, R.O., Hart, P.E., Stork, D.G.: Pattern Classification. John Wiley & Sons, New York (2000)
  10. Hall, M.A., Smith, L.A.: Feature selection for machine learning: Comparing a correlation-based filter approach to the wrapper. In: Proc. FLAIRS Conference, pp. 235–239. AAAI Press, Menlo Park (1999)
  11. Lewis, D.D.: Feature selection and feature extraction for text categorization. In: Proceedings of the Speech and Natural Language Workshop, San Mateo, California, pp. 212–217. Morgan Kaufmann, San Francisco (1992)
  12. Bonnlander, B.V., Weigend, A.S.: Selecting input variables using mutual information and nonparametric density estimation. In: Proc. of the 1994 Int. Symp. on Artificial Neural Networks, Tainan, Taiwan, pp. 42–50 (1994)
  13. Viola, P., Wells III, W.M.: Alignment by maximization of mutual information. Int. J. Comput. Vision 24(2), 137–154 (1997)
  14. Zaffalon, M., Hutter, M.: Robust feature selection by mutual information distributions. In: 18th International Conference on Uncertainty in Artificial Intelligence (UAI), pp. 577–584 (2002)
  15. Campos, T.E., Bloch, I., Cesar Jr., R.M.: Feature selection based on fuzzy distances between clusters: First results on simulated data. In: Singh, S., Murshed, N., Kropatsch, W.G. (eds.) ICAPR 2001. LNCS, vol. 2013, pp. 186–195. Springer, Heidelberg (2001)
  16. Jain, A., Zongker, D.: Feature selection: Evaluation, application, and small sample performance. IEEE Transactions on Pattern Analysis and Machine Intelligence 19(2), 153–158 (1997)

Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • David C. Martins Jr. (1)
  • Roberto M. Cesar Jr. (1)
  • Junior Barrera (1)
  1. IME – Instituto de Matemática e Estatística, Computer Science Department, USP – Universidade de São Paulo, São Paulo, Brazil
