Abstract
While Chap. 5 focused on descriptors made up of homogeneous features, this chapter focuses on heterogeneous descriptors, which combine features of different types as described in Chap. 4. The features are selected so that the combined feature set delivers the most information about an image or sub-image. The chapter also discusses feature-selection methods that choose an optimal or suboptimal feature subset from a large pool of features. Both filter and wrapper feature-selection algorithms are covered, including max-min, sequential forward selection, sequential backward selection, plus-l take-away-r, and branch-and-bound.
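To make the greedy search strategies concrete, the following is a minimal sketch of sequential forward selection, one of the wrapper/filter search procedures named above. The function names and the toy scoring criterion are illustrative assumptions, not part of the chapter: `score` stands in for whatever subset-evaluation criterion is used (a filter measure such as mutual information, or a wrapper's classifier accuracy). Sequential backward selection is the mirror image: start from the full set and greedily remove the least useful feature.

```python
def sequential_forward_selection(features, score, k):
    """Greedy sequential forward selection (sketch).

    Starting from the empty set, repeatedly add the single feature
    whose inclusion maximizes `score` on the growing subset, until
    `k` features have been chosen.

    features : iterable of feature identifiers
    score    : callable mapping a list of features to a number
               (higher is better); a stand-in for any filter
               criterion or wrapper evaluation
    k        : desired subset size
    """
    selected = []
    remaining = list(features)
    while len(selected) < k and remaining:
        # Evaluate each candidate in the context of what is
        # already selected, and keep the best one.
        best = max(remaining, key=lambda f: score(selected + [f]))
        selected.append(best)
        remaining.remove(best)
    return selected


# Hypothetical usage with a toy additive criterion:
weights = {"a": 3.0, "b": 2.0, "c": 1.0}
subset = sequential_forward_selection(
    list(weights), lambda s: sum(weights[f] for f in s), k=2
)
```

Because each step conditions on the already-selected features, the procedure is suboptimal in general (Cover's result that the best two features need not be the two individually best), which motivates the floating and branch-and-bound variants also discussed in this chapter.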
© 2012 Springer-Verlag London Limited
Goshtasby, A.A. (2012). Feature Selection and Heterogeneous Descriptors. In: Image Registration. Advances in Computer Vision and Pattern Recognition. Springer, London. https://doi.org/10.1007/978-1-4471-2458-0_6
Print ISBN: 978-1-4471-2457-3
Online ISBN: 978-1-4471-2458-0