International Journal of Parallel Programming, Volume 46, Issue 2, pp 252–283

Parallel Asynchronous Strategies for the Execution of Feature Selection Algorithms

  • Jorge Silva
  • Ana Aguiar
  • Fernando Silva

Abstract

Reducing the dimensionality of datasets is a fundamental step in building a classification model. Feature selection is the process of selecting a smaller subset of features from the original set in order to enhance the performance of the classification model. The problem is known to be NP-hard, and although several algorithms exist, none outperforms the others in all scenarios. Due to the complexity of the problem, feature selection algorithms usually have to compromise the quality of their solutions in order to execute in a practical amount of time. Parallel computing techniques emerge as a potential solution to tackle this problem. Several approaches already execute feature selection in parallel using synchronous models, which are preferred for their simplicity and their applicability to any feature selection algorithm. However, synchronous models introduce pausing points in the execution flow, which degrade parallel performance. In this paper, we discuss the challenges of executing feature selection algorithms in parallel using asynchronous models, and present a feature selection algorithm that favours such models. Furthermore, we present two strategies for the asynchronous parallel execution not only of our algorithm but of any other feature selection approach: the first solves the problem using the distributed memory paradigm, while the second exploits shared memory. We evaluate the parallel performance of our strategies using up to 32 cores. The results show near-linear speedups for both strategies, with the shared memory strategy outperforming the distributed one. Additionally, we provide an example of adapting our strategies to execute Sequential Forward Search asynchronously, and compare this version against a synchronous one. The results show that the asynchronous strategy saves an average of 7.5% of the execution time.
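
To make the asynchronous idea concrete, the sketch below shows a shared-memory variant of Sequential Forward Search in Python. This is a minimal illustration, not the paper's implementation: the names `evaluate_subset` and `async_sfs`, and the use of scikit-learn's `cross_val_score` with a k-nearest-neighbours classifier as the subset evaluator, are our assumptions. Candidate subsets for each forward step are consumed in completion order via `imap_unordered`, so no process waits at a per-candidate barrier.

```python
import multiprocessing as mp

from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier


def evaluate_subset(args):
    """Score one candidate feature subset with 3-fold cross-validation."""
    X, y, subset = args
    clf = KNeighborsClassifier()
    return subset, cross_val_score(clf, X[:, subset], y, cv=3).mean()


def async_sfs(X, y, n_select, n_workers=4):
    """Sequential Forward Search with asynchronous candidate evaluation.

    At each forward step, all candidate subsets are dispatched to the
    pool and results are consumed in completion order, so the master
    never idles at a per-candidate synchronisation point.
    """
    selected, remaining = [], list(range(X.shape[1]))
    with mp.Pool(n_workers) as pool:
        while len(selected) < n_select:
            candidates = [(X, y, selected + [f]) for f in remaining]
            best_subset, best_score = None, float("-inf")
            # imap_unordered yields each result as soon as its worker
            # finishes, unlike a synchronous map over the whole batch.
            for subset, score in pool.imap_unordered(evaluate_subset, candidates):
                if score > best_score:
                    best_subset, best_score = subset, score
            selected = best_subset
            remaining = [f for f in remaining if f not in selected]
    return selected


if __name__ == "__main__":
    X, y = make_classification(n_samples=200, n_features=20, random_state=0)
    print(async_sfs(X, y, n_select=5))
```

A synchronous counterpart would replace `imap_unordered` with `pool.map`, which only returns after the entire batch of candidates completes; consuming results as they arrive is what removes the intermediate pausing points discussed above. Note that the per-step selection of the best candidate remains an inherent synchronisation point of SFS itself.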

Keywords

Feature selection · Parallel computing · Machine learning · Asynchronous model

Notes

Acknowledgements

We thank the reviewers for their constructive and helpful suggestions, which helped improve the quality of this manuscript. This work is partially funded through projects VOCE (PTDC/EEAELC/121018/2010) and SMILES (NORTE-01-0145-FEDER-000020), by the ERDF through the COMPETE 2020 Programme within project POCI-01-0145-FEDER-006961, and by National Funds through the FCT as part of projects UID/EEA/50014/2013 and UID/EEA/50008/2013.

Copyright information

© Springer Science+Business Media New York 2017

Authors and Affiliations

  1. CRACS/INESC TEC, Faculdade de Ciências, University of Porto, Porto, Portugal
  2. IT-Porto, Faculdade de Engenharia, University of Porto, Porto, Portugal
