Abstract
Extreme learning machine (ELM), a learning method for training feedforward neural networks, has shown good generalization performance in regression and classification applications. Random projection (RP), a simple and powerful technique for dimensionality reduction, projects high-dimensional data into a low-dimensional subspace while approximately preserving the distances between data points. This paper presents a systematic study of RP in conjunction with ELM for classification and clustering of both low- and high-dimensional data.
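The two ideas combined in the paper can be sketched briefly: ELM draws the input-to-hidden weights at random and solves only the output weights by least squares, while RP multiplies the data by a random Gaussian matrix to reduce dimensionality with approximately preserved distances. The following is a minimal NumPy sketch under illustrative assumptions, not the authors' implementation; the dimensions, the sigmoid activation, and all function and variable names (`random_projection`, `elm_fit`, `elm_predict`) are chosen here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_projection(X, k):
    # Gaussian random projection: entries ~ N(0, 1/k), so pairwise
    # distances are approximately preserved (Johnson-Lindenstrauss).
    d = X.shape[1]
    R = rng.normal(0.0, 1.0 / np.sqrt(k), size=(d, k))
    return X @ R

def elm_fit(X, T, n_hidden):
    # ELM: input weights W and biases b are random and never trained;
    # only the output weights beta are solved, by least squares.
    d = X.shape[1]
    W = rng.uniform(-1.0, 1.0, size=(d, n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # sigmoid hidden-layer output
    beta = np.linalg.pinv(H) @ T             # Moore-Penrose pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy usage: project 50-D data to 10-D with RP, then train an ELM classifier.
X = rng.normal(size=(100, 50))
y = (X[:, 0] > 0).astype(int)
T = np.eye(2)[y]                              # one-hot class targets
Xp = random_projection(X, 10)
W, b, beta = elm_fit(Xp, T, n_hidden=40)
pred = elm_predict(Xp, W, b, beta).argmax(axis=1)
```

The key design point is that neither stage involves iterative optimization: RP is a single matrix multiplication, and ELM's only "training" is one pseudoinverse solve, which is what makes the combination fast on high-dimensional data.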
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this paper
Alshamiri, A.K., Singh, A., Surampudi, B.R. (2015). Combining ELM with Random Projections for Low and High Dimensional Data Classification and Clustering. In: Ravi, V., Panigrahi, B., Das, S., Suganthan, P. (eds) Proceedings of the Fifth International Conference on Fuzzy and Neuro Computing (FANCCO - 2015). Advances in Intelligent Systems and Computing, vol 415. Springer, Cham. https://doi.org/10.1007/978-3-319-27212-2_8
Print ISBN: 978-3-319-27211-5
Online ISBN: 978-3-319-27212-2