PSO and Statistical Clustering for Feature Selection: A New Representation

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8886)


Classification tasks often involve a large number of features, where irrelevant or redundant features may reduce classification performance. Such tasks typically require a feature selection process to choose a small subset of relevant features. This paper proposes a new representation in particle swarm optimisation (PSO) that utilises statistical clustering information to solve feature selection problems. The proposed algorithm is examined and compared with two conventional feature selection algorithms and two existing PSO-based algorithms on eight benchmark datasets of varying difficulty. The experimental results show that the proposed algorithm can considerably reduce the number of features while achieving similar or significantly higher classification accuracy than using all features. It achieves significantly better classification accuracy than one conventional method, although it selects more features. Compared with the other conventional method and the two PSO methods, the proposed algorithm performs better in terms of both classification accuracy and the number of features selected.
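The paper's clustering-based representation is not reproduced on this page. As background, a minimal sketch of the standard binary PSO wrapper for feature selection (the scheme the compared PSO-based methods build on, following Kennedy and Eberhart's binary PSO) might look like the following. The parameter values, the `toy_fitness` function, and the toy problem are illustrative assumptions, not the paper's setup.

```python
import math
import random

def binary_pso_feature_selection(fitness, n_features, n_particles=20,
                                 iters=30, seed=0):
    # Each particle is a bit vector: bit j = 1 means feature j is selected.
    # Velocities follow the standard PSO update and are squashed through a
    # sigmoid to give the probability that each bit is set to 1.
    rng = random.Random(seed)
    w, c1, c2, vmax = 0.73, 1.5, 1.5, 4.0   # common (illustrative) settings
    pos = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(n_particles)]
    vel = [[0.0] * n_features for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # personal best positions
    pbest_fit = [fitness(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_fit[i])
    gbest, gbest_fit = pbest[g][:], pbest_fit[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for j in range(n_features):
                r1, r2 = rng.random(), rng.random()
                v = (w * vel[i][j]
                     + c1 * r1 * (pbest[i][j] - pos[i][j])
                     + c2 * r2 * (gbest[j] - pos[i][j]))
                vel[i][j] = max(-vmax, min(vmax, v))   # clamp velocity
                prob_one = 1.0 / (1.0 + math.exp(-vel[i][j]))
                pos[i][j] = 1 if rng.random() < prob_one else 0
            f = fitness(pos[i])
            if f > pbest_fit[i]:
                pbest[i], pbest_fit[i] = pos[i][:], f
                if f > gbest_fit:
                    gbest, gbest_fit = pos[i][:], f
    return gbest, gbest_fit

# Toy fitness (hypothetical): reward the first three features and penalise
# subset size, standing in for a real classifier's accuracy estimate.
def toy_fitness(bits):
    return sum(bits[:3]) - 0.1 * sum(bits)

best, fit = binary_pso_feature_selection(toy_fitness, n_features=10)
```

In a real wrapper setting, `toy_fitness` would be replaced by the cross-validated classification accuracy of the selected feature subset, often combined with a small penalty on subset size, as in the trade-off the abstract describes.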


Keywords: Particle swarm optimisation · Feature selection · Classification · Representation




Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  1. School of Engineering and Computer Science, Victoria University of Wellington, Wellington, New Zealand
  2. School of Mathematics, Statistics and Operations Research, Victoria University of Wellington, Wellington, New Zealand
