
Harnessing the Power of GPUs to Speed Up Feature Selection for Outlier Detection

  • Regular Paper
Journal of Computer Science and Technology

Abstract

Acquiring a set of features that emphasize the differences between normal data points and outliers can drastically facilitate the task of identifying outliers. In our work, we present a novel non-parametric evaluation criterion for filter-based feature selection that is geared towards the final goal of outlier detection. The proposed method seeks the subset of features that represents the inherent characteristics of the normal dataset while forcing outliers to stand out, so that they are more easily distinguished by outlier detection algorithms. Experimental results on real datasets show the advantage of our feature selection algorithm compared with popular and state-of-the-art methods. We also show that the proposed algorithm is able to overcome the small sample space problem and perform well on highly imbalanced datasets. Furthermore, due to the highly parallelizable nature of the feature selection, we implement the algorithm on a graphics processing unit (GPU) to gain significant speedup over the serial version. The benefits of the GPU implementation are two-fold: its performance scales well with both the number of features and the number of data points.
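To make the abstract's idea concrete, the sketch below scores a candidate feature subset by a Parzen-window density ratio: a subset is good if the normal points remain dense among themselves while the outliers fall in low-density regions, i.e., "stand out". This is a minimal illustrative sketch, not the authors' exact formulation (their criterion is a local kernel density ratio); the function names, the Gaussian bandwidth `h`, the smoothing constant, and the greedy forward search are all assumptions made for illustration.

```python
import numpy as np

def parzen_density(X, x, h):
    """Gaussian-kernel Parzen estimate of the density at point x from sample X."""
    d = X.shape[1]
    sq_dists = np.sum((X - x) ** 2, axis=1)
    kernel = np.exp(-sq_dists / (2.0 * h ** 2))
    return kernel.mean() / ((2.0 * np.pi * h ** 2) ** (d / 2.0))

def density_ratio_score(X_normal, X_outlier, features, h=0.5):
    """Score a feature subset: normal-point density over outlier-point density,
    both estimated from the normal sample restricted to the chosen features.
    A high score means outliers sit in sparse regions of the normal model."""
    Xn = X_normal[:, features]
    Xo = X_outlier[:, features]
    dens_at_normals = np.mean([parzen_density(Xn, x, h) for x in Xn])
    dens_at_outliers = np.mean([parzen_density(Xn, x, h) for x in Xo])
    return dens_at_normals / (dens_at_outliers + 1e-12)

def greedy_select(X_normal, X_outlier, k, h=0.5):
    """Greedy forward selection of k features under the score above."""
    selected, remaining = [], list(range(X_normal.shape[1]))
    for _ in range(k):
        best = max(remaining,
                   key=lambda f: density_ratio_score(X_normal, X_outlier,
                                                     selected + [f], h))
        selected.append(best)
        remaining.remove(best)
    return selected
```

Note that the score of each candidate subset is independent of every other subset, and each per-point density estimate is itself an independent sum over the sample: this double layer of independent work is the kind of parallelism a GPU implementation can exploit, which is consistent with the scaling in both features and data points that the abstract reports.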



Author information

Corresponding author: Fatemeh Azmandian.

Additional information

A preliminary version of the paper was published in the Proceedings of ICDM 2012.

Electronic supplementary material

ESM 1 (PDF 75 kb)


About this article


Cite this article

Azmandian, F., Yilmazer, A., Dy, J.G. et al. Harnessing the Power of GPUs to Speed Up Feature Selection for Outlier Detection. J. Comput. Sci. Technol. 29, 408–422 (2014). https://doi.org/10.1007/s11390-014-1439-4
