Abstract
With the advent of big data, massive amounts of high-dimensional data have accumulated in many fields, and assimilating and processing such data can be particularly challenging. Manifold learning offers an effective means of addressing this challenge; however, its application to supervised classification has so far yielded unsatisfactory results, largely because the out-of-sample extension problem must first be properly solved. Genetic algorithms (GAs) have excellent global search capabilities. This paper proposes a generalized regression neural network (GRNN) optimized by a GA to solve the out-of-sample extension problem. Because the prediction performance of a GRNN depends mainly on the appropriateness of its smoothing factor, the essence of the GA optimization is to determine the optimal smoothing factor; the optimized GRNN is then used to predict the low-dimensional embeddings of the test samples. By searching a larger space, the GA obtains a better smoothing factor and thus enhanced prediction performance. Experiments were performed to analyze in detail the important parameters that affect the performance of the proposed algorithm, and the results confirm its effectiveness.
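The core idea of the abstract can be illustrated with a minimal sketch: a GRNN is a kernel-weighted regression whose only tunable parameter is the smoothing factor, and a simple real-coded GA searches for the factor that minimizes validation error. This is an illustrative reconstruction, not the authors' implementation; the function names, GA operators (tournament selection, arithmetic crossover, Gaussian mutation, elitism), and all parameter values below are assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def grnn_predict(X_train, Y_train, X_query, sigma):
    """GRNN prediction: Gaussian-kernel-weighted average of training targets."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    w_sum = w.sum(axis=1)
    w_sum = np.where(w_sum == 0.0, 1e-12, w_sum)  # guard against all-zero weights
    return (w @ Y_train) / w_sum

def ga_optimize_sigma(X_tr, Y_tr, X_val, Y_val, pop_size=20, gens=30,
                      lo=1e-3, hi=2.0, mut_rate=0.3):
    """Search for the GRNN smoothing factor that minimizes validation MSE."""
    pop = rng.uniform(lo, hi, size=pop_size)

    def fitness(s):
        err = grnn_predict(X_tr, Y_tr, X_val, s) - Y_val
        return -np.mean(err ** 2)  # higher fitness = lower validation MSE

    for _ in range(gens):
        fit = np.array([fitness(s) for s in pop])
        # binary tournament selection
        idx = rng.integers(0, pop_size, size=(pop_size, 2))
        parents = pop[np.where(fit[idx[:, 0]] > fit[idx[:, 1]], idx[:, 0], idx[:, 1])]
        # arithmetic crossover between shuffled parent pairs
        mates = rng.permutation(parents)
        alpha = rng.random(pop_size)
        children = alpha * parents + (1.0 - alpha) * mates
        # Gaussian mutation, clipped to the search range
        mask = rng.random(pop_size) < mut_rate
        children[mask] += rng.normal(0.0, 0.1, size=mask.sum())
        children = np.clip(children, lo, hi)
        children[0] = pop[np.argmax(fit)]  # elitism: preserve the best individual
        pop = children

    fit = np.array([fitness(s) for s in pop])
    return pop[np.argmax(fit)]
```

In the paper's setting the targets would be the low-dimensional embeddings produced by a manifold-learning method on the training samples; here a scalar target stands in for one embedding coordinate, and each coordinate would be fitted the same way.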
Acknowledgements
The authors would like to thank anonymous referees as well as the Associate Editor for their constructive comments and suggestions. They would also like to thank Editage (www.editage.com) for English language editing. This work was partially supported by the Natural Science Foundation of Fujian Province, China, under Grant 2016J01279, and the Natural Science Foundation of Education Department of Fujian Province, China, under Grant JB14003.
Cite this article
Huang, HB., Xie, ZH. Generalized Regression Neural Network Optimized by Genetic Algorithm for Solving Out-of-Sample Extension Problem in Supervised Manifold Learning. Neural Process Lett 50, 2567–2593 (2019). https://doi.org/10.1007/s11063-019-10022-y