
Improving the Model Convergence Properties of Classifier Feed-Forward MLP Neural Networks

  • Annamária R. Várkonyi-Kóczy
  • Balázs Tusor
  • József Bukor
Chapter
Part of the Studies in Fuzziness and Soft Computing book series (STUDFUZZ, volume 317)

Abstract

Recently, the application of Artificial Neural Networks (ANNs) has become very popular. Their success is due to the fact that they can learn complex input-output mappings and find relationships in unstructured data sets. Further, neural nets are relatively easy to implement in any application. In recent years, classification has become one of the most significant research and application areas of ANNs because these networks have proved to be very efficient in this field. Unfortunately, a major difficulty in using feed-forward multilayer perceptron (MLP) neural nets with supervised learning is that, for problems of higher complexity, the NN model may not converge during training or, in better cases, needs a long training time that scales with the structural parameters of the network and the quantity of input data. Although the training can be done off-line, this disadvantage may limit the usage of NN models because training has a non-negligible cost and, further, can cause a possibly intolerable delay in operation. In this chapter, to overcome these problems, a new training algorithm is proposed which in many cases is able to improve the convergence properties of NN models in complex real-world classification problems. On the one hand, the accuracy of the models can be increased; on the other hand, the training time can be decreased. The new training method is based on the well-known back-propagation algorithm, with one significant difference: instead of the original input data, a reduced data set is used during the training phase. The reduction is the result of a complexity-optimized classification procedure. In the resulting reduced input data set, each input sample is replaced by the center of the cluster to which it belongs, and these cluster centers are used during training (each element once). As a result, new, complex, ambiguous classification problems can be solved with acceptable cost and accuracy using feed-forward MLP NNs.
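The data-reduction idea described above can be illustrated with a minimal sketch. This is not the chapter's algorithm: k-means stands in for the complexity-optimized clustering/classification procedure, clusters are assumed to be formed separately per class so that every center carries an unambiguous label, and scikit-learn's MLPClassifier plays the role of the back-propagation-trained MLP. The function name reduce_training_set and all parameter values are hypothetical.

```python
# Minimal sketch of training an MLP on cluster centers instead of the full data set.
# Assumptions (not taken from the chapter): k-means as the clustering step,
# per-class clustering so each center inherits its class label, scikit-learn MLP.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPClassifier

def reduce_training_set(X, y, clusters_per_class=10, random_state=0):
    """Replace the samples of each class by their cluster centers."""
    centers, labels = [], []
    for cls in np.unique(y):
        X_cls = X[y == cls]
        k = min(clusters_per_class, len(X_cls))
        km = KMeans(n_clusters=k, n_init=10, random_state=random_state).fit(X_cls)
        centers.append(km.cluster_centers_)
        labels.append(np.full(k, cls))
    return np.vstack(centers), np.concatenate(labels)

# Example usage on synthetic data
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 8))
y = (X[:, 0] + X[:, 1] ** 2 > 0.5).astype(int)

# Each original sample is represented by its cluster center; the MLP is then
# trained on the much smaller set of centers (each used once per epoch).
X_red, y_red = reduce_training_set(X, y, clusters_per_class=20)
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
mlp.fit(X_red, y_red)
print("accuracy on the full original data:", mlp.score(X, y))
```

In this sketch the training set shrinks from 2000 samples to 40 labeled centers, which is the source of the reduced training time claimed in the abstract; the accuracy trade-off depends on how well the centers represent the class boundaries.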

Keywords

Feed-forward artificial neural networks · Neural network training · Supervised learning · Back-propagation algorithm · Classification · Clustering · Model convergence

Acknowledgment

This work was sponsored by the Hungarian National Scientific Fund (OTKA 78576).


Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Annamária R. Várkonyi-Kóczy (1)
  • Balázs Tusor (2)
  • József Bukor (3)

  1. Institute of Mechatronics and Vehicle Engineering, Óbuda University, Budapest, Hungary
  2. Integrated Intelligent Systems Japanese-Hungarian Laboratory, Óbuda University, Budapest, Hungary
  3. Department of Mathematics and Informatics, J. Selye University, Komarno, Slovakia
