Feature Weighting Algorithm Based on Margin and Linear Programming
Feature selection is an important task in machine learning. In this work, we design a robust algorithm for optimal feature subset selection, presenting a global optimization technique for feature weighting. Margin-induced loss functions are introduced to evaluate features, and linear programming is employed to search for the optimal solution. The derived weights are then combined with the nearest neighbor rule. The proposed technique is tested on UCI data sets; compared with Simba and LMFW, it is both effective and efficient.
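The approach described above can be sketched as follows. This is a minimal illustration, not the authors' exact formulation: it assumes the hypothesis margin of each sample is the (weighted L1) distance to its nearest miss minus the distance to its nearest hit, and that the total margin is maximized over weights on the probability simplex, which yields a linear program. The function name `margin_lp_weights` is hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def margin_lp_weights(X, y):
    """Sketch of margin-induced feature weighting via linear programming.

    For each sample, find its nearest hit (same class) and nearest miss
    (other class) under the unweighted L1 metric, accumulate per-feature
    margin contributions, then solve an LP maximizing the total margin
    subject to the weights lying on the probability simplex.
    """
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    n, d = X.shape
    c = np.zeros(d)  # per-feature margin coefficients
    for i in range(n):
        dist = np.abs(X - X[i]).sum(axis=1)
        dist[i] = np.inf                            # exclude the sample itself
        same = y == y[i]
        hit = np.argmin(np.where(same, dist, np.inf))   # nearest same-class sample
        miss = np.argmin(np.where(~same, dist, np.inf))  # nearest other-class sample
        c += np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])
    # Maximize c . w  s.t.  sum(w) = 1, 0 <= w <= 1  (linprog minimizes, so negate c)
    res = linprog(-c, A_eq=np.ones((1, d)), b_eq=[1.0], bounds=[(0, 1)] * d)
    return res.x
```

The resulting weights can be plugged into the nearest neighbor rule by replacing the plain L1 distance with the weighted distance `sum(w * |x - x'|)` when ranking neighbors.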
Keywords: Feature Selection, Loss Function, Class Label, Feature Selection Method, Feature Selection Algorithm
- 1. Gilad-Bachrach, R., Navot, A., Tishby, N.: Margin based feature selection – theory and algorithms. In: Proceedings of the 21st International Conference on Machine Learning, p. 40 (2004)
- 2. Chen, M., Ebert, D., Hagen, H., Laramee, R.S.: Data, Information and Knowledge in Visualization. IEEE Computer Graphics and Applications, 12–19 (2009)
- 3. Liu, C., Jaeger, S., Nakagawa, M.: Offline Recognition of Chinese Characters: the State of the Art. IEEE Transactions on Pattern Analysis and Machine Intelligence 2, 198–213 (2004)
- 6. Kohavi, R., John, G.: Wrappers for feature subset selection. Artificial Intelligence, 234–273 (1997)
- 13. Sun, Y.: Iterative RELIEF for Feature Weighting: Algorithms, Theories, and Applications. IEEE Transactions on Pattern Analysis and Machine Intelligence 6, 1–17 (2007)
- 14. Weinberger, K.Q., Blitzer, J., Saul, L.K.: Distance Metric Learning for Large Margin Nearest Neighbor Classification. Journal of Machine Learning Research, 207–244 (2009)
- 16. Merz, C.J., Murphy, P.: UCI repository of machine learning databases (1996), http://www.ics.uci.edu/~mlearn/MLRRepository.html