Local search and pseudoinversion: a hybrid approach to neural network training
We consider recently proposed, successful techniques for neural network training that randomly set the weights from the input to the hidden layer, while the weights from the hidden to the output layer are determined analytically via the Moore–Penrose generalized inverse. This study analyses the impact on performance when the completely random sampling of the input-weight space is replaced by a local search procedure over a discretized set of weights. The performance of the proposed training methods is assessed through computational experiments on several UCI datasets.
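The pseudoinversion-based training scheme the abstract refers to can be sketched as follows: sample the input-to-hidden weights at random, then solve for the hidden-to-output weights in closed form with the Moore–Penrose pseudoinverse. This is a minimal illustrative sketch, not the paper's implementation; all function and variable names (and the choice of tanh activation) are assumptions.

```python
import numpy as np

def train_pseudoinverse(X, T, n_hidden, seed=None):
    """Illustrative sketch of pseudoinversion-based training:
    random input-to-hidden weights, hidden-to-output weights
    computed analytically via the Moore-Penrose pseudoinverse."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                           # hidden-layer activations
    beta = np.linalg.pinv(H) @ T                     # closed-form output weights
    return W, b, beta

def predict(X, W, b, beta):
    """Forward pass of the trained single-hidden-layer network."""
    return np.tanh(X @ W + b) @ beta
```

The local search variant studied in the paper would replace the single random draw of `W` with a search over a discretized set of candidate input weights, keeping the pseudoinverse step for the output layer unchanged.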
Keywords: Neural networks · Random projections · Local search · Pseudoinverse matrix
This activity was partially carried out in the context of the Visiting Professor Program of the Gruppo Nazionale per il Calcolo Scientifico (GNCS) of the Italian Istituto Nazionale di Alta Matematica (INdAM).