RETRACTED CHAPTER: Local Feature Weighting for Data Classification

  • Gengyun Jia
  • Haiying Zhao (corresponding author)
  • Zhigeng Pan
  • Liangliang Wang
Chapter
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10092)

Abstract

Feature weighting is an important task in data analysis, clustering, and classification. Traditional algorithms learn a single weight vector shared across the whole dataset, which makes them sensitive to the distribution of the data. In contrast, this paper proposes a novel feature weighting algorithm, local feature weighting (LFW), that assigns each sample its own weight vector. We construct the optimization task from the clustering assumption. Instead of considering the total intra-class and between-class scatter, we focus on the clustering performance at each training sample: the optimization goals are to minimize the total distance from a training sample to the other samples in its class and to maximize its total distance to samples in other classes. Data weights are added to the objective function to emphasize nearby samples, and the resulting problem is solved by an iterative process. Experiments show that the new algorithm performs well on data classification. In addition, we provide a simplified version of LFW that runs faster with little loss of accuracy.
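The abstract describes the optimization only in words, so the sketch below illustrates one plausible reading of it in Python: for each training sample, learn a nonnegative feature weight vector that shrinks weighted distances to same-class samples and stretches them to other-class samples, with Gaussian data weights emphasizing nearby samples and an iterative process alternating between the two updates. Everything concrete here (the function name, the trade-off parameter lam, the Gaussian proximity weights, and the soft-min update) is our assumption for illustration, not the chapter's actual solver.

```python
# A minimal sketch of the local feature weighting (LFW) idea from the
# abstract. Names, the Gaussian data weights, the trade-off lam, and the
# soft-min update are illustrative assumptions, not the chapter's method.
import numpy as np

def local_feature_weights(X, y, i, lam=1.0, n_iter=10, sigma=1.0):
    """Learn a per-sample weight vector for training sample i (assumed form)."""
    n, d = X.shape
    same = (y == y[i])
    same[i] = False                          # exclude the sample itself
    diff = (y != y[i])
    w = np.full(d, 1.0 / d)                  # start from uniform weights
    for _ in range(n_iter):
        sq = (X - X[i]) ** 2                 # (n, d) per-feature squared diffs
        dist2 = sq @ w                       # weighted squared distances to x_i
        dw = np.exp(-dist2 / (2.0 * sigma ** 2))   # emphasize nearby samples
        # per-feature cost: intra-class pull minus between-class push
        cost = (dw[same, None] * sq[same]).sum(axis=0) \
             - lam * (dw[diff, None] * sq[diff]).sum(axis=0)
        # soft-min keeps weights nonnegative and summing to one: features with
        # small intra-class and large between-class spread get more weight
        w = np.exp(-cost / (np.abs(cost).max() + 1e-12))
        w /= w.sum()
    return w

# Toy check: only feature 0 separates the two classes, so its weight dominates.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))
y = np.repeat(np.array([0, 1]), 30)
X[y == 1, 0] += 2.0
print(local_feature_weights(X, y, i=0))
```

At test time, one natural (but again assumed) decision rule would be to classify a query by nearest neighbors measured under the weight vector of its closest training sample; the chapter may specify a different rule.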

Keywords

Local feature weighting · Classification · Clustering assumption

Copyright information

© Springer-Verlag GmbH Germany 2017

Authors and Affiliations

  • Gengyun Jia (2)
  • Haiying Zhao (1, 2) (corresponding author)
  • Zhigeng Pan (3)
  • Liangliang Wang (4)

  1. Mobile Media and Cultural Computing Key Laboratory of Beijing, Century College, BUPT, White Bear Lake, China
  2. School of Information and Communication Engineering, Beijing University of Posts and Telecommunications, Beijing, China
  3. Digital Media and HCI Research Center, Hangzhou Normal University, Hangzhou, China
  4. Xinjiang Teacher’s College, Xinjiang, China
