Classification and feature selection by a self-organizing neural network

  • Arnaud Ribert
  • Emmanuel Stocker
  • Abdel Ennaji
  • Yves Lecourtier
Plasticity Phenomena (Maturing, Learning & Memory)
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1606)

Abstract

This article describes recent improvements to an original neural network building method that was previously restricted to neurons with two inputs. After a brief review of the main principles by which the network is built, the authors introduce the possibility for a neuron to receive more than two inputs. Two problems then arise: how to choose the number of inputs of a neuron, and what becomes of its decision rule? Addressing these problems leads to an original feature selection method based on genetic algorithms, and to the adaptation of a linear discrimination algorithm to non-separable problems. Experimental results on a handwritten digit recognition problem confirm the efficiency of the method.
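The feature selection described in the abstract is based on genetic algorithms (see [GOL89], [RIB97]). As a rough illustration of the general idea only (not the authors' actual method), the sketch below evolves a binary feature mask whose fitness rewards classification accuracy and lightly penalises the number of selected features; the toy dataset, the nearest-centroid classifier used as the fitness, and all parameter values are hypothetical stand-ins.

```python
import random

random.seed(0)

# Hypothetical toy data: 4 informative features + 4 pure-noise features,
# standing in for the handwritten-digit features used in the paper.
def make_sample(label):
    informative = [label + random.gauss(0, 0.3) for _ in range(4)]
    noise = [random.gauss(0, 1.0) for _ in range(4)]
    return informative + noise, label

data = [make_sample(lbl) for lbl in (0, 1) for _ in range(30)]

def accuracy(mask):
    """Nearest-centroid accuracy using only the features where mask[i] == 1."""
    idx = [i for i, m in enumerate(mask) if m]
    if not idx:
        return 0.0
    cents = {}
    for lbl in (0, 1):
        pts = [[x[i] for i in idx] for x, l in data if l == lbl]
        cents[lbl] = [sum(col) / len(col) for col in zip(*pts)]
    correct = 0
    for x, l in data:
        v = [x[i] for i in idx]
        pred = min(cents, key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(v, cents[c])))
        correct += pred == l
    return correct / len(data)

def fitness(mask):
    # Reward accuracy, lightly penalise the number of selected features.
    return accuracy(mask) - 0.01 * sum(mask)

def ga_select(n_features=8, pop_size=20, generations=30, p_mut=0.1):
    """Plain generational GA over binary masks: elitism, one-point
    crossover, bit-flip mutation. Parameters are illustrative."""
    pop = [[random.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # keep the best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_features)   # one-point crossover
            child = a[:cut] + b[cut:]
            children.append([1 - g if random.random() < p_mut else g
                             for g in child])      # bit-flip mutation
        pop = parents + children
    return max(pop, key=fitness)

best = ga_select()
```

On this toy problem the selection pressure comes entirely from the fitness function; the penalty term `0.01 * sum(mask)` is what pushes the algorithm to drop the noise features rather than keep all eight.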


Bibliography

  1. [BEL61] Bellman R., “Adaptive control processes”, Princeton University Press, Princeton, New Jersey, 1961.
  2. [BEL91] Belew R.K., McInerney J., Schraudolph N.N., “Evolving networks: using the genetic algorithm with connectionist learning”. In Langton C.G., Taylor C., Farmer J.D., Rasmussen (Eds.), Artificial Life II, SFI Studies in the Sciences of Complexity, Vol. X, Addison-Wesley, Reading, pp. 511–547, 1991.
  3. [CHE95] Chentouf R., Jutten C., “Incremental learning with a stopping criterion: experimental results”. In J. Mira, F. Sandoval (Eds.), Proceedings of IWANN’95, New York: Springer-Verlag, pp. 519–526, 1995.
  4. [DEV82] Devijver P.A., Kittler J., “Pattern recognition: a statistical approach”, Prentice-Hall, London, 1982.
  5. [FRE90] Frean M., “The upstart algorithm: a method for constructing and training feedforward neural networks”, Neural Computation, Vol. 2, pp. 198–209, 1990.
  6. [GAL86] Gallant S.I., “Optimal linear discriminants”. In IEEE Proceedings of the 8th Conference on Pattern Recognition, Vol. 2 (pp. 849), New York: IEEE, 1986.
  7. [GOL89] Goldberg D.E., “Genetic algorithms in search, optimization, and machine learning”. Reading, Massachusetts: Addison-Wesley, 1989.
  8. [HAS93] Hassibi B., Stork D.G., Wolff G.J., “Optimal brain surgeon”. In Proceedings of the 1993 IEEE International Conference on Neural Networks, Vol. 1, New York: IEEE, pp. 293–299, 1993.
  9. [HEU96] Heutte L., Moreau J.V., Paquet T., Lecourtier Y., Olivier C., “Combining structural and statistical features for the recognition of handwritten characters”, 13th International Conference on Pattern Recognition, Vol. 2, Vienna, pp. 210–214, 1996.
  10. [HIR91] Hirose Y., Yamashita K., Hijiya S., “Back-propagation algorithm which varies the number of hidden units”, Neural Networks, Vol. 4, pp. 61–66, 1991.
  11. [HO65] Ho Y-C., Kashyap R.L., “An algorithm for linear inequalities and its applications”, IEEE Transactions on Electronic Computers, Vol. 14, pp. 683–688, 1965.
  12. [KNE90] Knerr S., Personnaz L., Dreyfus G., “Single-layer learning revisited: a stepwise procedure for building and training a neural network”. In F. Fogelman Soulie, J. Herault (Eds.), Neurocomputing, NATO ASI Series, Series F, Vol. 68, New York: Springer-Verlag, pp. 41–50, 1990.
  13. [LEC90] LeCun Y., Denker J.S., Solla S.A., “Optimal brain damage”. In Proceedings of Neural Information Processing Systems 2, D.S. Touretzky (Ed.), Morgan Kaufmann, pp. 598–605, 1990.
  14. [LIS95] Lis J., “The synthesis of the ranked neural networks applying genetic algorithm with the dynamic probability of mutation”. In J. Mira, F. Sandoval (Eds.), Proceedings of IWANN’95, New York: Springer-Verlag, pp. 498–504, 1995.
  15. [MIN69] Minsky M., Papert S., “Perceptrons”, MIT Press, Cambridge (MA), 1969.
  16. [RIB97] Ribert A., Stocker E., Lecourtier Y., Ennaji A., “Optimizing a neural network architecture with an adaptive parameter genetic algorithm”. In Proceedings of IWANN’97, J. Mira, R. Moreno, J. Cabestany (Eds.), Springer-Verlag, Berlin, Vol. 1240, pp. 527–535, 1997.
  17. [RIB98] Ribert A., “Structuration évolutive de données: application à la construction de classifieurs distribués” [Evolutive structuring of data: application to the construction of distributed classifiers], Ph.D. Thesis, University of Rouen, France, 1998.
  18. [ROS60] Rosenblatt F., “Perceptron simulation experiments”, Proceedings of the IRE, 3, 48, 1960.
  19. [STO95] Stocker E., Lecourtier Y., Ennaji A., “A distributed classifier based on Yprel networks cooperation”. In Proceedings of IWANN’95, pp. 330–337, 1995.
  20. [STO96] Stocker E., Ribert A., Lecourtier Y., Ennaji A., “An incremental distributed classifier building”. In 13th International Conference on Pattern Recognition (ICPR’96), Vol. IV (pp. 128–132), Washington: IEEE Computer Society Press, 1996.

Copyright information

© Springer-Verlag Berlin Heidelberg 1999

Authors and Affiliations

  • Arnaud Ribert (1)
  • Emmanuel Stocker (1)
  • Abdel Ennaji (1)
  • Yves Lecourtier (1)
  1. UFR des Sciences et Techniques, Université de Rouen, PSI/La3i, Mont Saint Aignan Cedex, France