An Adaptive Sigmoidal Activation Function Cascading Neural Networks

  • Sudhir Kumar Sharma
  • Pravin Chandra
Part of the Advances in Intelligent and Soft Computing book series (AINSC, volume 87)

Abstract

In this paper, we propose a cascading neural network algorithm with an adaptive sigmoidal activation function. The proposed algorithm emphasizes both architectural adaptation and functional adaptation during training. It is a constructive approach that builds the cascading architecture dynamically. To achieve functional adaptation, an adaptive sigmoidal activation function is proposed for the hidden-layer nodes. The algorithm determines not only the optimal number of hidden-layer nodes but also the optimal sigmoidal function for each of them. Four variants of the proposed algorithm, distinguished by the activation function used, are developed and discussed. All variants are empirically evaluated on five regression functions in terms of learning accuracy and generalization capability. Simulation results reveal that the adaptive sigmoidal activation function offers several advantages over the traditional fixed sigmoid function: increased flexibility, smoother learning, better learning accuracy, and better generalization performance.
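A common way to make a sigmoid adaptive is to attach a trainable slope (gain) parameter to each hidden node and update it by gradient descent alongside the weights. The sketch below illustrates the idea on a single node fitted to a steep sigmoidal target; the parameterization sigma(lambda*z) = 1/(1 + exp(-lambda*z)), the learning rate, and the toy target are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def adaptive_sigmoid(z, lam):
    """Sigmoid with a trainable slope parameter lam (illustrative form;
    the paper's exact parameterization may differ)."""
    return 1.0 / (1.0 + np.exp(-lam * z))

# Toy regression target: a steep sigmoid that a fixed unit-slope sigmoid
# cannot match without large weight growth.
x = np.linspace(-2.0, 2.0, 200)
y = 1.0 / (1.0 + np.exp(-4.0 * x))

# Parameters of one hidden node: weight w, bias b, and adaptive slope lam.
w, b, lam = 1.0, 0.0, 1.0
eta = 0.2  # learning rate (assumed, for the toy problem)

for _ in range(5000):
    z = w * x + b
    out = adaptive_sigmoid(z, lam)
    err = out - y               # gradient of 0.5 * mean(err**2)
    g = out * (1.0 - out)       # sigma'(lam*z) factor
    # Chain rule: d(out)/dw = g*lam*x, d(out)/db = g*lam, d(out)/dlam = g*z
    w -= eta * np.mean(err * g * lam * x)
    b -= eta * np.mean(err * g * lam)
    lam -= eta * np.mean(err * g * z)

mse = np.mean((adaptive_sigmoid(w * x + b, lam) - y) ** 2)
```

Because the slope lam is trained jointly with w and b, the node can steepen its own response to match the target instead of relying solely on weight magnitude, which is the flexibility advantage the abstract attributes to the adaptive sigmoid.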

Keywords

Adaptive sigmoidal activation function · Cascade-correlation algorithm · Constructive algorithms · Dynamic node creation algorithm · Weight freezing

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Sudhir Kumar Sharma (1)
  • Pravin Chandra (2)
  1. Ansal Institute of Technology, GGS Indraprastha University, Gurgaon, India
  2. Institute of Informatics & Communication, University of Delhi, New Delhi, India