Unsupervised Feature Selection for Noisy Data
Feature selection techniques are widely applied in a variety of data analysis tasks in order to reduce dimensionality. Depending on the type of learning, feature selection algorithms are categorized as supervised or unsupervised. In unsupervised learning scenarios, selecting features is a much harder problem, owing to the lack of class labels that would guide the search for relevant features. The difficulty is amplified further when the data are corrupted by different kinds of noise. Almost all traditional unsupervised feature selection methods are not robust against noise in the samples: they have no explicit mechanism for detaching and isolating the noise and therefore cannot produce an optimal feature subset. In this article, we propose an unsupervised approach for feature selection on noisy data, called Robust Independent Feature Selection (RIFS). Specifically, we choose the feature subset that retains most of the underlying information, using the same criteria as independent component analysis (ICA); simultaneously, the noise is separated out as an independent component. The isolation of representative noise samples is achieved using oblique factor rotation, whereas noise identification is performed using the factor pattern loadings. Extensive experimental results over diverse real-life data sets show the efficiency and advantages of the proposed algorithm.
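The abstract sketches a pipeline: extract independent components, rotate them obliquely, read the noise component off its pattern loadings, and rank features on the remaining components. The Python sketch below illustrates one plausible reading of that pipeline, assuming scikit-learn's FastICA and a textbook promax rotation; the "flatness" heuristic for spotting the noise component and all function names are illustrative assumptions, not the authors' exact method.

```python
import numpy as np
from sklearn.decomposition import FastICA


def varimax(L, gamma=1.0, max_iter=100, tol=1e-6):
    """Orthogonal varimax rotation, used as the starting point for promax."""
    n, k = L.shape
    R = np.eye(k)
    var = 0.0
    for _ in range(max_iter):
        LR = L @ R
        tmp = LR ** 3 - (gamma / n) * LR @ np.diag((LR ** 2).sum(axis=0))
        u, s, vt = np.linalg.svd(L.T @ tmp)
        R = u @ vt
        if s.sum() - var < tol:
            break
        var = s.sum()
    return L @ R


def promax(L, power=4):
    """Oblique promax rotation (Hendrickson & White, 1964); returns pattern loadings."""
    L = varimax(L)
    # Target matrix: raise loading magnitudes to `power`, keeping their signs.
    P = L * np.abs(L) ** (power - 1)
    # Least-squares transform from the varimax solution to the target.
    Q = np.linalg.pinv(L.T @ L) @ L.T @ P
    # Rescale so the rotated factors have unit variance.
    Q = Q @ np.diag(np.sqrt(np.diag(np.linalg.inv(Q.T @ Q))))
    return L @ Q


def rifs_sketch(X, n_select, n_components=None, random_state=0):
    """Select `n_select` features from promax-rotated ICA pattern loadings.

    The noise component is taken to be the one with the flattest loading
    profile (no feature stands out) -- an illustrative heuristic, not the
    paper's exact identification rule.
    """
    X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)
    ica = FastICA(n_components=n_components, random_state=random_state)
    ica.fit(X)
    pattern = promax(ica.mixing_)            # (n_features, n_components)
    flatness = np.abs(pattern).std(axis=0) / (np.abs(pattern).mean(axis=0) + 1e-12)
    noise_idx = int(np.argmin(flatness))     # most uniform column ~ noise
    signal = np.delete(pattern, noise_idx, axis=1)
    scores = np.abs(signal).max(axis=1)      # strongest loading on any signal component
    return np.argsort(scores)[::-1][:n_select]
```

For example, `rifs_sketch(X, 20)` on a data matrix `X` would return the indices of the 20 features that load most strongly on the non-noise components.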
Keywords: Feature selection · Independent component analysis · Oblique rotation · Noise separation
We gratefully acknowledge the support of the Comisión Interministerial de Ciencia y Tecnología (CICYT) under contract no. TIN2015-65316-P, which has partially funded this work.