Optimizing a Higher Order Neural Network Through Teaching Learning Based Optimization Algorithm

  • Janmenjoy Nayak
  • Bighnaraj Naik
  • H. S. Behera
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 410)


Higher order neural networks have attracted increasing attention because they offer greater computational capability and better learning and storage capacity than traditional neural networks. In this work, a novel attempt is made to optimize the performance of a higher order neural network (in particular, the Pi-Sigma neural network) for classification. The recently developed population-based teaching learning based optimization (TLBO) algorithm is used for efficient training of the network. The model is benchmarked against several well-recognized optimized models on five well-known real-world benchmark datasets. The simulation results show favorable classification accuracy for the proposed model compared with the others. Statistical tests further indicate that the proposed model trains quickly and yields stable, reliable results.
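The Pi-Sigma network mentioned above computes the product of several linear summing units and passes it through a nonlinear activation. A minimal sketch of a single-output forward pass is given below; the function name, sigmoid activation, and parameter shapes are illustrative assumptions, not the authors' exact formulation:

```python
import numpy as np

def psnn_forward(x, W, b):
    """Pi-Sigma forward pass (illustrative sketch).

    x: (d,) input vector
    W: (K, d) weights of the K linear summing (sigma) units
    b: (K,) biases of the summing units
    Returns a scalar output in (0, 1).
    """
    h = W @ x + b                              # sigma layer: K linear sums
    return 1.0 / (1.0 + np.exp(-np.prod(h)))   # pi layer (product) + sigmoid
```

Because only the summing-layer weights are trainable while the product layer is fixed, the network keeps the representational benefit of higher order terms without the combinatorial weight explosion of general higher order networks.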


Keywords: Higher order neural network · Pi-Sigma neural network · Teaching learning based optimization (TLBO)
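The TLBO algorithm named in the keywords evolves a population of candidate weight vectors through a teacher phase (move toward the best learner) and a learner phase (peer-to-peer interaction). A minimal sketch of one TLBO generation, following the standard description by Rao et al., is shown below; all names and the minimization convention are assumptions for illustration:

```python
import numpy as np

def tlbo_step(pop, fitness, lo, hi, rng):
    """One TLBO generation: teacher phase, then learner phase.

    pop: (n, d) candidate solutions, modified in place and returned
    fitness: objective to minimize; lo, hi: box bounds; rng: np.random.Generator
    """
    n, d = pop.shape
    scores = np.array([fitness(x) for x in pop])
    teacher = pop[scores.argmin()]          # best learner acts as teacher
    mean = pop.mean(axis=0)

    # Teacher phase: shift learners toward the teacher, away from the mean.
    tf = rng.integers(1, 3)                 # teaching factor, 1 or 2
    cand = np.clip(pop + rng.random((n, d)) * (teacher - tf * mean), lo, hi)
    improved = np.array([fitness(x) for x in cand]) < scores
    pop[improved] = cand[improved]          # greedy acceptance

    # Learner phase: each learner moves toward a better random peer
    # (or away from a worse one), again accepted greedily.
    scores = np.array([fitness(x) for x in pop])
    for i in range(n):
        j = rng.choice([k for k in range(n) if k != i])
        step = pop[j] - pop[i] if scores[j] < scores[i] else pop[i] - pop[j]
        trial = np.clip(pop[i] + rng.random(d) * step, lo, hi)
        t_score = fitness(trial)
        if t_score < scores[i]:
            pop[i], scores[i] = trial, t_score
    return pop
```

A notable design property, and a likely reason for its use here, is that TLBO needs no algorithm-specific tuning parameters beyond population size and generation count.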



This work is supported by the Department of Science and Technology (DST), Ministry of Science and Technology, New Delhi, Govt. of India, under grant No. DST/INSPIRE Fellowship/2013/585.



Copyright information

© Springer India 2016

Authors and Affiliations

  • Janmenjoy Nayak (1)
  • Bighnaraj Naik (1)
  • H. S. Behera (1)

  1. Department of Computer Science Engineering and Information Technology, Veer Surendra Sai University of Technology, Burla, India
