
Background

  • Muhammad Summair Raza
  • Usman Qamar

Abstract

One way to overcome the curse of dimensionality is to reduce the number of dimensions without affecting the relevant information present in the dataset. Various dimensionality reduction techniques have been proposed in the literature. In this chapter, we present an overview of these techniques.
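As a minimal illustration of what such techniques do (not a method taken from this chapter), the sketch below projects a dataset onto a lower-dimensional subspace with principal component analysis; the synthetic data, the choice of 10 components, and the use of scikit-learn are assumptions made purely for demonstration.

    # Minimal sketch: dimensionality reduction via PCA.
    # Assumes NumPy and scikit-learn are installed; data and sizes are illustrative.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 50))      # 200 samples, 50 original features

    pca = PCA(n_components=10)          # retain 10 components
    X_reduced = pca.fit_transform(X)    # project onto the top principal components

    print(X_reduced.shape)                         # (200, 10)
    print(pca.explained_variance_ratio_.sum())     # fraction of variance retained

The goal, as the abstract notes, is to shrink the feature space while preserving as much of the relevant information (here measured by retained variance) as possible.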

Copyright information

© Springer Nature Singapore Pte Ltd. 2019

Authors and Affiliations

  1. Department of Computer and Software Engineering, College of Electrical and Mechanical Engineering, National University of Sciences and Technology (NUST), Islamabad, Pakistan