Soft Computing, Volume 12, Issue 3, pp 215–222

Solving the XOR and parity N problems using a single universal binary neuron

Abstract

A universal binary neuron (UBN) operates with complex-valued weights and a complex-valued activation function, which is a function of the argument of the weighted sum. The activation function of the UBN divides the complex plane into equal sectors, where the activation function equals either 1 or −1 depending on the sector parity (even or odd, respectively). Thus, the UBN output is determined by the argument of the weighted sum. This makes it possible to implement nonlinearly separable (non-threshold) Boolean functions on a single neuron. Hence, the functionality of the UBN is incomparably higher than that of the traditional perceptron. In this paper, we consider a new modified learning algorithm for the UBN. We show that the classical nonlinearly separable problems XOR and Parity n can easily be solved using a single UBN, without any network. Finally, we consider how some other important nonlinearly separable problems may be solved using a single UBN.
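The activation described in the abstract, a sign that alternates with the index of the sector containing the argument of the weighted sum, can be illustrated with a minimal sketch. The particular weights (0, 1, i), the four-sector split, and the ±1 encoding of Boolean values below are illustrative assumptions, not values quoted from the paper; they merely show that a single neuron of this kind realizes XOR.

```python
import cmath

def ubn_output(weights, inputs, m):
    """Universal binary neuron: the complex plane is split into m equal
    sectors of angle 2*pi/m; the output is +1 if the weighted sum falls
    into an even-indexed sector and -1 if it falls into an odd one."""
    # Complex-valued weighted sum z = w0 + w1*x1 + ... + wn*xn
    z = weights[0] + sum(w * x for w, x in zip(weights[1:], inputs))
    # Index of the sector containing arg(z), counted from the positive real axis
    sector = int(cmath.phase(z) % (2 * cmath.pi) // (2 * cmath.pi / m))
    return 1 if sector % 2 == 0 else -1

# XOR on a single UBN (illustrative choice): Boolean 0 -> +1, 1 -> -1,
# weights (w0, w1, w2) = (0, 1, 1j), and m = 4 sectors.
weights = (0, 1, 1j)
for a in (0, 1):
    for b in (0, 1):
        x = ((-1) ** a, (-1) ** b)        # encode Boolean inputs as +/-1
        y = ubn_output(weights, x, m=4)   # UBN output in +/-1 encoding
        print(a, b, "XOR ->", 0 if y == 1 else 1)
```

Running the sketch prints the full XOR truth table from one neuron and no hidden layer; according to the abstract, increasing the number of sectors is what likewise allows Parity n to be realized on a single UBN.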



Copyright information

© Springer-Verlag 2007

Authors and Affiliations

Department of Computer and Information Sciences, Texas A&M University-Texarkana, Texarkana, USA
