Pseudoinverse Learners: New Trend and Applications to Big Data

  • Ping Guo
  • Dongbin Zhao
  • Min Han
  • Shoubo Feng
Conference paper
Part of the Proceedings of the International Neural Networks Society book series (INNS, volume 1)

Abstract

The pseudoinverse learner (PIL), a kind of multilayer perceptron (MLP) trained with the pseudoinverse learning algorithm, is a novel learning framework. It has drawn increasing attention in areas such as large-scale computing, high-speed signal processing, and artificial intelligence. In this paper, we briefly review the pseudoinverse learning algorithm and discuss its characteristics and variants. We present some new viewpoints on the PIL algorithm, describe current developments of PIL-based autoencoders under the deep learning framework, and discuss new trends in PIL-based learning. Moreover, we present several interesting PIL applications that demonstrate practical advances in big data analysis.
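To make the core idea concrete, the following is a minimal sketch of pseudoinverse-style training for a single hidden layer: hidden weights are fixed randomly, and the output weights are obtained in closed form via the Moore-Penrose pseudoinverse rather than by gradient descent. The function names, activation choice, and hidden-layer size are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def pil_train(X, Y, hidden_dim=8, seed=0):
    """Sketch of a pseudoinverse learner for one hidden layer.

    X: (n_samples, n_features) inputs; Y: (n_samples, n_outputs) targets.
    Returns random input weights and closed-form output weights.
    """
    rng = np.random.default_rng(seed)
    W_in = rng.standard_normal((X.shape[1], hidden_dim))
    H = np.tanh(X @ W_in)            # hidden-layer activations
    W_out = np.linalg.pinv(H) @ Y    # least-squares solution via pseudoinverse
    return W_in, W_out

def pil_predict(X, W_in, W_out):
    """Forward pass with the trained weights."""
    return np.tanh(X @ W_in) @ W_out
```

Because the output weights solve a linear least-squares problem exactly, no iterative error back-propagation is needed for that layer, which is the source of the speed advantage discussed in the paper.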

Keywords

Back propagation · Pseudoinverse learning · Multi-layer perceptron · Deep neural network · Big astronomical data

Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. School of Systems Science, Beijing Normal University, Beijing, China
  2. Institute of Automation, Chinese Academy of Sciences, Beijing, China
  3. Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian, China