
Comparison of Methods for Automated Feature Selection Using a Self-organising Map

  • Aliyu Usman Ahmad
  • Andrew Starkey
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 629)

Abstract

The effective modelling of high-dimensional data with hundreds to thousands of features remains a challenging task in the field of machine learning. One of the key challenges is the implementation of effective methods for selecting the relevant features, which are buried in high-dimensional data among irrelevant noisy features, by choosing a subset of the complete set of input features that predicts the output with accuracy comparable to that of the complete input set. Kohonen's Self-Organising Map (SOM) neural network has been utilised in various ways for this task. In this work, a review of the appropriate application of several methods for this task is carried out. A feature selection approach based on analysis of the Self-Organising Map after training is presented, together with a comparison of the performance of two methods.
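The approach described above operates on the trained map rather than on the raw data directly. As an illustration only (the paper's exact analysis is not reproduced here), the sketch below trains a small SOM with plain NumPy and ranks each input feature by the spread of its codebook weights across the trained map; the function names, grid size, decay schedules, and the relevance measure itself are all assumptions made for this sketch.

import numpy as np

def train_som(data, grid=(10, 10), n_iter=2000, lr0=0.5, sigma0=3.0, seed=0):
    """Train a simple rectangular SOM and return its codebook of shape (gx, gy, n_features)."""
    rng = np.random.default_rng(seed)
    gx, gy = grid
    n_features = data.shape[1]
    codebook = rng.random((gx, gy, n_features))
    # Grid coordinates, used to evaluate the neighbourhood function around the winning node.
    coords = np.stack(np.meshgrid(np.arange(gx), np.arange(gy), indexing="ij"), axis=-1)
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        # Best-matching unit: the node whose weight vector is closest to the sample.
        dists = np.linalg.norm(codebook - x, axis=-1)
        bmu = np.unravel_index(np.argmin(dists), dists.shape)
        # Exponentially decaying learning rate and neighbourhood width (assumed schedules).
        lr = lr0 * np.exp(-t / n_iter)
        sigma = sigma0 * np.exp(-t / n_iter)
        grid_dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
        h = np.exp(-grid_dist2 / (2 * sigma ** 2))[..., None]
        codebook += lr * h * (x - codebook)
    return codebook

def feature_relevance(codebook):
    """Score each feature by how much its codebook weights vary across the trained map.
    This is one possible post-training analysis, not the method evaluated in the paper."""
    flat = codebook.reshape(-1, codebook.shape[-1])
    return flat.std(axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic data: two informative (clustered) features plus three pure-noise features.
    informative = np.repeat(rng.random((3, 2)), 100, axis=0) + 0.05 * rng.standard_normal((300, 2))
    noise = rng.random((300, 3))
    data = np.hstack([informative, noise])
    data = (data - data.mean(axis=0)) / data.std(axis=0)  # standardise before training
    scores = feature_relevance(train_som(data))
    print("Relevance per feature:", np.round(scores, 3))

On this synthetic example the two clustered features tend to receive higher scores than the three noise features, which mirrors the intuition behind analysing the trained map; the specific heuristic shown here is only one of several possibilities compared in the paper.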

Keywords

Clustering · Self-organising neural network map · Feature selection · Engineering optimisation

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. School of Engineering, University of Aberdeen, Aberdeen, UK