Single Layer Morphological Perceptron Solution to the N-Bit Parity Problem

  • Gonzalo Urcid
  • Gerhard X. Ritter
  • Laurentiu Iancu
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3287)

Abstract

Morphological perceptrons use a lattice algebra approach to learn and classify a set of patterns. Dendritic structures combined with lattice algebra operations have properties that are completely different from those of traditional perceptron models. In the present paper, we focus our attention on single layer morphological perceptrons that correctly classify the parity of all bit strings of length n, treated as a one-class pattern recognition problem. The n-bit parity problem is the n-dimensional extension of the classic XOR problem in the Euclidean plane and is commonly used as a difficult benchmark to test the performance of training algorithms in artificial neural networks. We present results for values of n up to 10, obtained with a training algorithm based on elimination.
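To make the setting concrete, the following sketch illustrates the one-class formulation with lattice (min/max) operations. It is not the paper's elimination-based training algorithm: instead, it allocates one dendrite per odd-parity bit string, each recognizing a small hyperbox around that pattern, and the output neuron takes the maximum over all dendritic responses. The half-width parameter `eps` is an assumed illustration choice.

```python
from itertools import product

def parity(bits):
    """Parity of a bit string: 1 if the number of 1s is odd, else 0."""
    return sum(bits) % 2

def make_parity_net(n, eps=0.25):
    """Naive morphological one-class classifier for n-bit parity.

    One dendrite per odd-parity corner c of the unit hypercube; each
    dendrite responds with the lattice value
        min_i  min(x_i - (c_i - eps), (c_i + eps) - x_i),
    which is nonnegative exactly when x lies inside the hyperbox of
    half-width eps around c. The neuron outputs the max over dendrites.
    """
    boxes = [c for c in product((0, 1), repeat=n) if parity(c) == 1]

    def net(x):
        response = max(
            min(min(x[i] - (c[i] - eps), (c[i] + eps) - x[i])
                for i in range(n))
            for c in boxes)
        return 1 if response >= 0 else 0

    return net

n = 4
net = make_parity_net(n)
# Every n-bit string is classified by its parity.
assert all(net(p) == parity(p) for p in product((0, 1), repeat=n))
```

Note that this construction uses 2^(n-1) dendrites, one per odd-parity pattern; the point of an elimination-style training algorithm, as studied in the paper, is to solve the same problem with far fewer dendrites.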


Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Gonzalo Urcid¹
  • Gerhard X. Ritter²
  • Laurentiu Iancu²
  1. Optics Department, INAOE, Tonantzintla, Mexico
  2. CISE Department, University of Florida, Gainesville, USA