
Feature Selection and Heterogeneous Descriptors

  • Chapter
Image Registration

Part of the book series: Advances in Computer Vision and Pattern Recognition ((ACVPR))


Abstract

While Chap. 5 focused on descriptors composed of homogeneous features, this chapter covers descriptors composed of features of different types. Heterogeneous descriptors combine various feature types described in Chap. 4, the features being selected so that the combined set delivers the most information about an image or subimage. The chapter also discusses feature-selection methods that choose an optimal or suboptimal feature subset from a large pool of candidate features. Both filter and wrapper feature-selection algorithms are covered, including max-min, sequential forward selection, sequential backward selection, plus-l-take-away-r, and branch-and-bound algorithms.
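To illustrate one of the methods the abstract names, the following is a minimal sketch of sequential forward selection: starting from an empty set, the feature whose addition most improves a user-supplied criterion is added at each step. The criterion here (a made-up relevance table with a redundancy penalty) and the feature names are hypothetical placeholders for illustration, not taken from the chapter.

```python
def sequential_forward_selection(features, criterion, k):
    """Greedily pick k features, at each step adding the single
    candidate that maximizes `criterion` over the grown subset.
    `criterion` maps a list of features to a real-valued score."""
    selected = []
    remaining = list(features)
    while len(selected) < k and remaining:
        # Evaluate the criterion with each remaining candidate added.
        best = max(remaining, key=lambda f: criterion(selected + [f]))
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy criterion (hypothetical values): sum of per-feature relevance,
# minus a penalty when a redundant pair ends up in the same subset.
relevance = {"a": 0.9, "b": 0.8, "c": 0.7, "d": 0.1}
redundant_pairs = {frozenset({"a", "b"})}  # "a" and "b" overlap in information

def score(subset):
    total = sum(relevance[f] for f in subset)
    penalty = sum(0.5 for pair in redundant_pairs if pair <= set(subset))
    return total - penalty

chosen = sequential_forward_selection(["a", "b", "c", "d"], score, k=2)
print(chosen)  # -> ['a', 'c']: "b" is skipped because of its redundancy with "a"
```

Note the greedy, nesting character of the method: once a feature is chosen it is never removed, which is exactly the limitation that plus-l-take-away-r and the floating-search variants were designed to relax.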



Author information


Correspondence to A. Ardeshir Goshtasby.


Copyright information

© 2012 Springer-Verlag London Limited

About this chapter

Cite this chapter

Goshtasby, A.A. (2012). Feature Selection and Heterogeneous Descriptors. In: Image Registration. Advances in Computer Vision and Pattern Recognition. Springer, London. https://doi.org/10.1007/978-1-4471-2458-0_6

Download citation

  • DOI: https://doi.org/10.1007/978-1-4471-2458-0_6

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-4471-2457-3

  • Online ISBN: 978-1-4471-2458-0

  • eBook Packages: Computer Science (R0)
