
Analyzing Weight Distribution of Feedforward Neural Networks and Efficient Weight Initialization

  • Jinwook Go
  • Byungjoon Baek
  • Chulhee Lee
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3138)

Abstract

In this paper, we investigate and analyze the weight distribution of two-layer feedforward neural networks in order to understand and improve their time-consuming training process. Training a neural network generally takes a long time, and when a new problem is presented, the network must be trained from scratch without any benefit from previous training. To address this problem, we view training as the search for a solution weight point in a weight space and analyze how such solution weight points are distributed in that space. We then propose a weight initialization method that exploits the observed distribution of solution weight points. Experimental results show that the proposed initialization converges faster than the conventional approach of initializing weights with a random generator.
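The idea in the abstract can be illustrated with a minimal sketch: fit a simple per-weight Gaussian to the solution weight points collected from previously trained networks, and draw initial weights from that fitted distribution instead of from a uniform random generator. This is an interpretation of the approach, not the authors' implementation; the function names and the per-weight independence assumption are illustrative.

```python
import random
import statistics

def distribution_init(solution_weights, rng=None):
    """Initialize each weight by sampling from a Gaussian fitted to the
    corresponding component of previously found solution weight points.

    solution_weights: list of weight vectors, one per previously solved
    problem (an assumption: such vectors are available and aligned).
    """
    rng = rng or random.Random(0)
    n = len(solution_weights[0])
    init = []
    for i in range(n):
        samples = [w[i] for w in solution_weights]
        mu = statistics.fmean(samples)
        # Fall back to a small spread when only one sample is available.
        sigma = statistics.stdev(samples) if len(samples) > 1 else 0.1
        init.append(rng.gauss(mu, sigma))
    return init

def random_init(n, scale=0.5, rng=None):
    """Conventional baseline: uniform random weights in [-scale, scale]."""
    rng = rng or random.Random(0)
    return [rng.uniform(-scale, scale) for _ in range(n)]
```

Starting near the region where solutions have previously been found gives gradient descent a shorter path to a new solution, which is the intuition behind the reported speedup in convergence.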

Keywords

Neural Network · Classification Accuracy · Weight Distribution · Hidden Neuron · Feedforward Neural Network


Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Jinwook Go (1)
  • Byungjoon Baek (1)
  • Chulhee Lee (1)
  1. Department of Electrical and Electronic Engineering, BERC, Yonsei University, Seoul, Korea
