Abstract
Approaches that extract relevant information from overwhelmingly large sets of measures have recently been adopted as an alternative to hand-crafted, specialized features. In this work, we address the problem of finding a relevant subset of features together with a suitable rotation (combined feature selection and feature extraction), formulated as a weighted rotation. We focus on two types of rotations: Weighted Principal Component Analysis and Weighted Regularized Discriminant Analysis. The objective function is the maximization of the J4 separability ratio. Tests were carried out on artificially generated classes containing several non-relevant features. Real-data tests were also performed on the segmentation of nailfold capillaroscopic images and on the NIST-38 database (prototype selection).
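The weighted-rotation idea can be illustrated with a minimal sketch of Weighted PCA: each feature is scaled by a relevance weight before the eigendecomposition, so low-weight features contribute little variance and are implicitly de-selected by the rotation. The function name, weight values, and toy data below are illustrative assumptions, not the paper's actual formulation or experimental setup.

```python
import numpy as np

def weighted_pca(X, w, n_components):
    """Weighted PCA sketch: apply per-feature weights w to the centered
    data, then project onto the leading eigenvectors of the resulting
    covariance matrix."""
    Xw = (X - X.mean(axis=0)) * w           # center, then weight each feature
    cov = np.cov(Xw, rowvar=False)          # covariance of the weighted data
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigh returns ascending eigenvalues
    order = np.argsort(eigvals)[::-1]       # reorder descending by variance
    components = eigvecs[:, order[:n_components]]
    return Xw @ components                  # rotated, reduced representation

# Toy data: two informative features plus one non-relevant (noise) feature.
rng = np.random.default_rng(0)
X = np.column_stack([rng.normal(0, 3, 100),
                     rng.normal(0, 2, 100),
                     rng.normal(0, 1, 100)])
w = np.array([1.0, 1.0, 0.1])  # illustrative weights: suppress the noise feature
Y = weighted_pca(X, w, n_components=2)
print(Y.shape)  # (100, 2)
```

A near-zero weight drives a feature's contribution to the covariance toward zero, which is how a weighted rotation performs feature selection implicitly rather than as a separate combinatorial search.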
This research was supported by the project "Técnicas de Computación de Alto Rendimiento en la Interpretación Automatizada de Imágenes Médicas y Bioseñales" (High-Performance Computing Techniques for the Automated Interpretation of Medical Images and Biosignals), DIMA, Universidad Nacional de Colombia Sede Manizales.
© 2007 Springer-Verlag Berlin Heidelberg
Sánchez, L., Martínez, F., Castellanos, G., Salazar, A. (2007). Feature Extraction of Weighted Data for Implicit Variable Selection. In: Kropatsch, W.G., Kampel, M., Hanbury, A. (eds) Computer Analysis of Images and Patterns. CAIP 2007. Lecture Notes in Computer Science, vol 4673. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74272-2_104
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-74271-5
Online ISBN: 978-3-540-74272-2