Training Minimal Uncertainty Neural Networks by Bayesian Theorem and Particle Swarm Optimization

  • Yan Wang
  • Chun-Guang Zhou
  • Yan-Xin Huang
  • Xiao-Yue Feng
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3316)

Abstract

A new model of minimal uncertainty neural networks (MUNN) is proposed in this article. The network structure is constructed on the basis of the Minimal Uncertainty Adjudgment, and the model is trained by combining Bayes' theorem with Particle Swarm Optimization (PSO). This approach determines the parameters of the neural network rapidly and efficiently. The effectiveness of the algorithm is demonstrated on the classification of taste signals from 10 kinds of tea, and the simulation results confirm its feasibility and validity.
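The paper's exact MUNN structure and training procedure are not reproduced on this page. The sketch below only illustrates, under stated assumptions, how PSO can search the weight space of a small feed-forward classifier of the kind the abstract describes: the layer sizes, fitness function, PSO constants, and synthetic stand-in data are illustrative choices, not the authors' settings.

# Illustrative sketch (not the authors' MUNN): PSO searching the weights of a
# small feed-forward classifier. All sizes and constants below are assumptions.
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_HID, N_OUT = 4, 6, 3                              # assumed feature/class counts
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT        # flattened weight count

def unpack(theta):
    """Split a flat particle position into the network's weight matrices."""
    i = 0
    W1 = theta[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = theta[i:i + N_HID]; i += N_HID
    W2 = theta[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = theta[i:]
    return W1, b1, W2, b2

def forward(theta, X):
    """One hidden tanh layer followed by a softmax output layer."""
    W1, b1, W2, b2 = unpack(theta)
    h = np.tanh(X @ W1 + b1)
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def fitness(theta, X, y):
    """Negative log-likelihood (cross-entropy); lower is better."""
    p = forward(theta, X)
    return -np.mean(np.log(p[np.arange(len(y)), y] + 1e-12))

def pso_train(X, y, n_particles=30, iters=200, w=0.72, c1=1.49, c2=1.49):
    """Standard global-best PSO over the flattened network weights."""
    pos = rng.normal(scale=0.5, size=(n_particles, DIM))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([fitness(p, X, y) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()
    gbest_f = pbest_f.min()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, DIM))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        f = np.array([fitness(p, X, y) for p in pos])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        if pbest_f.min() < gbest_f:
            gbest, gbest_f = pbest[pbest_f.argmin()].copy(), pbest_f.min()
    return gbest, gbest_f

# Synthetic stand-in for the tea taste-signal data used in the paper.
X = rng.normal(size=(90, N_IN))
y = rng.integers(0, N_OUT, size=90)
weights, loss = pso_train(X, y)
print("best training loss:", loss)

In the paper's setting, the cross-entropy fitness above would be replaced by whatever objective the Minimal Uncertainty Adjudgment and Bayesian formulation prescribe; only the PSO update rule itself is standard (Kennedy and Eberhart, 1995).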



Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Yan Wang¹
  • Chun-Guang Zhou¹
  • Yan-Xin Huang¹
  • Xiao-Yue Feng¹

  1. College of Computer Science and Technology, Jilin University, Changchun, China
