High Dimensionality Characteristics and New Fuzzy Versatile Particle Swarm Optimization

Conference paper
Part of the Lecture Notes in Networks and Systems book series (LNNS, volume 10)

Abstract

Technological developments have reshaped scientific thinking, since observations from experiments and the real world are now massive. Each experiment can produce information about a huge number of variables (high-dimensional data). The unique characteristics of high dimensionality impose various challenges on traditional learning methods. This paper presents the problems produced by high dimensionality and proposes a new fuzzy versatile binary PSO (FVBPSO) method. Experimental results illustrate the curse of dimensionality and the merits of the proposed method on benchmark datasets.
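For context on the baseline that the proposed method extends, the sketch below shows a minimal standard binary PSO applied to feature selection. It is a generic illustration under assumed parameter settings, not the authors' FVBPSO; the `fitness` callback (e.g. classifier accuracy on the selected feature subset) is a hypothetical placeholder.

```python
import numpy as np

def binary_pso(fitness, n_features, n_particles=30, n_iter=100,
               w=0.7, c1=1.5, c2=1.5, seed=0):
    """Generic binary PSO sketch for feature selection (not FVBPSO).

    `fitness` maps a 0/1 feature mask to a score to be maximized,
    e.g. cross-validated classifier accuracy on the selected columns.
    """
    rng = np.random.default_rng(seed)
    X = rng.integers(0, 2, size=(n_particles, n_features))   # bit-string positions
    V = rng.uniform(-4, 4, size=(n_particles, n_features))   # real-valued velocities

    pbest = X.copy()
    pbest_fit = np.array([fitness(x) for x in X])
    gbest = pbest[pbest_fit.argmax()].copy()

    for _ in range(n_iter):
        r1, r2 = rng.random(X.shape), rng.random(X.shape)
        # Standard velocity update with inertia and cognitive/social terms.
        V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
        V = np.clip(V, -4, 4)                                 # keep the sigmoid responsive
        # Discrete position update: bit i is set with probability sigmoid(V_i).
        X = (rng.random(X.shape) < 1.0 / (1.0 + np.exp(-V))).astype(int)

        fit = np.array([fitness(x) for x in X])
        better = fit > pbest_fit
        pbest[better], pbest_fit[better] = X[better], fit[better]
        gbest = pbest[pbest_fit.argmax()].copy()

    return gbest, pbest_fit.max()
```

In practice the fitness function would train and score a classifier on the columns selected by the bit mask (handling the empty-subset case) and could also penalise larger subsets to encourage dimensionality reduction.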

Keywords

High dimensionality · Particle swarm optimization · Fuzzy logic · Feature selection · Classification

Acknowledgements

This work is supported by an SRF grant (09/1144(0001)2015EMR-I) from the Council of Scientific and Industrial Research (CSIR), India, and by the Department of Computer Science, Central University of South Bihar.

Copyright information

© Springer Nature Singapore Pte Ltd. 2018

Authors and Affiliations

1. Department of Computer Science, Central University of South Bihar, Patna, India
