Gaussian Neural Networks Applied to the Cluster Analysis Problem

  • Christian Firmin
  • Denis Hamad
Conference paper
Part of the Studies in Classification, Data Analysis, and Knowledge Organization book series (STUDIES CLASS)


This paper describes a Gaussian neural network (GNN) applied to the cluster analysis problem. The GNN architecture consists of one layer of Gaussian units and one output unit that provides an estimate of the probability density function of the mixture. During training, a weighted competitive learning approach is used to estimate both the mean vector and the covariance matrix of each hidden unit's Gaussian function. The key problem with GNNs is determining the number of units in the hidden layer; this problem is solved by means of three information criteria. The interest of this approach lies in adjusting the number of units in an unsupervised context. Results are reported and the performance of the approach is evaluated.
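The paper's exact update rules are not reproduced on this page. As a minimal sketch of the general idea, the following assumes a plain winner-take-all competitive update for the unit means, covariances re-estimated from the samples each unit wins, and Akaike's AIC (one of several possible information criteria) to select the number of hidden units; all function names and the learning-rate choice are illustrative, not the authors':

```python
import numpy as np

def gnn_density(X, means, covs, weights):
    """Mixture density produced by the output unit: a weighted
    sum of the hidden-layer Gaussian units evaluated at X."""
    d = X.shape[1]
    p = np.zeros(len(X))
    for w, m, S in zip(weights, means, covs):
        diff = X - m
        inv = np.linalg.inv(S)
        norm = 1.0 / np.sqrt((2 * np.pi) ** d * np.linalg.det(S))
        # Mahalanobis form: sum_jk diff[i,j] * inv[j,k] * diff[i,k]
        p += w * norm * np.exp(-0.5 * np.einsum("ij,jk,ik->i", diff, inv, diff))
    return p

def train_gnn(X, k, epochs=20, lr=0.1, seed=0):
    """Competitive learning sketch: each sample moves only the winning
    (closest) unit's mean; covariances and mixing weights are then
    re-estimated from the samples each unit wins."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    means = X[rng.choice(n, k, replace=False)].copy()
    for _ in range(epochs):
        for x in X[rng.permutation(n)]:
            j = np.argmin(np.linalg.norm(means - x, axis=1))
            means[j] += lr * (x - means[j])  # move the winner toward the sample
    labels = np.argmin(((X[:, None, :] - means[None]) ** 2).sum(-1), axis=1)
    covs, weights = [], []
    for j in range(k):
        Xj = X[labels == j]
        if len(Xj) < 2:  # dead unit: keep the mixture well defined
            covs.append(np.eye(d))
            weights.append(1e-12)
        else:
            covs.append(np.cov(Xj.T) + 1e-6 * np.eye(d))
            weights.append(len(Xj) / n)
    return means, covs, np.array(weights)

def aic(X, k):
    """AIC = -2 log-likelihood + 2 * (number of free parameters)."""
    means, covs, weights = train_gnn(X, k)
    loglik = np.log(gnn_density(X, means, covs, weights)).sum()
    d = X.shape[1]
    n_params = k * (d + d * (d + 1) / 2) + (k - 1)
    return -2 * loglik + 2 * n_params
```

Evaluating `aic(X, k)` over a range of `k` and keeping the minimiser mimics the unsupervised selection of the hidden-layer size described in the abstract; the paper itself compares three such criteria rather than AIC alone.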


Keywords: Hidden Layer · Radial Basis Function Network · Hidden Unit · Output Unit · Probabilistic Neural Network




Copyright information

© Springer-Verlag Berlin · Heidelberg 1996

Authors and Affiliations

  • Christian Firmin (1)
  • Denis Hamad (1)
  1. Centre d’Automatique de Lille, Bâtiment P2, Université des Sciences et Technologies de Lille, Villeneuve d’Ascq Cedex, France
