Combining ELM with Random Projections for Low and High Dimensional Data Classification and Clustering

  • Conference paper
In: Proceedings of the Fifth International Conference on Fuzzy and Neuro Computing (FANCCO - 2015)

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 415)

Abstract

Extreme learning machine (ELM), a learning method for training feedforward neural networks, has shown good generalization performance in regression and classification applications. Random projection (RP), a simple and powerful technique for dimensionality reduction, projects high-dimensional data into low-dimensional subspaces while approximately preserving the distances between data points. This paper presents a systematic study of RP in conjunction with ELM for both low- and high-dimensional data classification and clustering.
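The two ingredients named in the abstract can be sketched in a few lines of NumPy: a Gaussian random projection compresses the input while roughly preserving distances, and an ELM then maps the projected data through a random hidden layer and solves for the output weights in closed form via the pseudoinverse. This is a minimal illustrative sketch, not the authors' exact pipeline; the toy dataset, dimensions, and sigmoid activation are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy high-dimensional two-class data: the label depends on the
# first 10 of 500 input features.
n, d = 200, 500
X = rng.normal(size=(n, d))
y = (X[:, :10].sum(axis=1) > 0).astype(int)

# Random projection: a scaled Gaussian matrix maps d -> k dimensions;
# pairwise distances are approximately preserved (Johnson-Lindenstrauss).
k = 50
R = rng.normal(size=(d, k)) / np.sqrt(k)
Xp = X @ R

# ELM on the projected data: random (untrained) hidden weights W, b,
# then a least-squares solve for the output weights beta.
L = 100
W = rng.normal(size=(k, L))
b = rng.normal(size=L)
H = 1.0 / (1.0 + np.exp(-(Xp @ W + b)))  # sigmoid hidden-layer activations
T = np.eye(2)[y]                          # one-hot class targets
beta = np.linalg.pinv(H) @ T              # Moore-Penrose pseudoinverse solution

pred = (H @ beta).argmax(axis=1)
acc = (pred == y).mean()
```

Only the output weights `beta` are learned; the projection `R` and the hidden-layer parameters `W`, `b` stay random, which is what makes ELM training a single linear solve.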



Author information

Correspondence to Alok Singh.

Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Alshamiri, A.K., Singh, A., Surampudi, B.R. (2015). Combining ELM with Random Projections for Low and High Dimensional Data Classification and Clustering. In: Ravi, V., Panigrahi, B., Das, S., Suganthan, P. (eds) Proceedings of the Fifth International Conference on Fuzzy and Neuro Computing (FANCCO - 2015). Advances in Intelligent Systems and Computing, vol 415. Springer, Cham. https://doi.org/10.1007/978-3-319-27212-2_8

  • DOI: https://doi.org/10.1007/978-3-319-27212-2_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-27211-5

  • Online ISBN: 978-3-319-27212-2

  • eBook Packages: Computer Science (R0)
