
A fast learning algorithm of neural network with tunable activation function

Published in Science in China Series F: Information Sciences

Abstract

This paper presents a modified structure for a neural network with a tunable activation function and a new learning algorithm for training it. Simulation results on the XOR problem, the Feigenbaum function, and the Henon map show that the new algorithm outperforms the BP (back-propagation) algorithm, converging in less time and to higher accuracy. A further modification of the network structure, combined with the fast learning algorithm, yields a simpler network with still faster convergence and better accuracy.
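The abstract leaves the exact network and training procedure to the full text. As a rough illustration of the general idea of a tunable activation function, the sketch below (Python/NumPy) trains a tiny network on the XOR problem in which each hidden unit's activation is a parametric mixture h(s) = c0*tanh(s) + c1*s whose coefficients c0, c1 are learned together with the weights. The particular activation form, the plain gradient-descent (BP-style) updates, and names such as forward are assumptions made for this sketch; this is not the paper's structure or its fast learning algorithm.

# Illustration only: hidden units use a tunable activation
# h(s) = c0*tanh(s) + c1*s, and c0, c1 are learned along with the weights.
# Plain gradient descent on XOR is used here, not the paper's fast algorithm.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([0., 1., 1., 0.])                  # XOR targets

H = 3                                           # hidden units
W1 = rng.normal(size=(H, 2)); b1 = np.zeros(H)
W2 = rng.normal(size=H);      b2 = 0.0
C = np.tile([1.0, 0.1], (H, 1))                 # per-unit activation parameters [c0, c1]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    s = W1 @ x + b1                             # hidden pre-activations
    th = np.tanh(s)
    h = C[:, 0] * th + C[:, 1] * s              # tunable activation
    return s, th, h, sigmoid(W2 @ h + b2)

lr = 0.5
for epoch in range(5000):
    for x, t in zip(X, T):
        s, th, h, y = forward(x)
        d2 = (y - t) * y * (1.0 - y)            # output delta (squared-error loss)
        dh = d2 * W2
        dc0, dc1 = dh * th, dh * s              # gradients of activation parameters
        ds = dh * (C[:, 0] * (1.0 - th**2) + C[:, 1])
        W2 -= lr * d2 * h;  b2 -= lr * d2
        C[:, 0] -= lr * dc0; C[:, 1] -= lr * dc1
        W1 -= lr * np.outer(ds, x); b1 -= lr * ds

print("network outputs on XOR inputs:", [round(float(forward(x)[-1]), 3) for x in X])

With these settings the outputs usually approach the 0/1 targets, though that depends on the random initialization; the paper's contribution is an algorithm that reaches such solutions faster and more accurately than standard BP.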



Author information

Correspondence to Shen Yanjun.


About this article

Cite this article

Shen, Y., Wang, B. A fast learning algorithm of neural network with tunable activation function. Sci China Ser F 47, 126–136 (2004). https://doi.org/10.1360/02yf0263

