Abstract
The spatter code is a high-dimensional (e.g., N = 10,000), random code that encodes “high-level concepts” in terms of their “low-level attributes,” so that concepts at different levels can be mixed freely. The binary spatter code is the simplest. It has two N-bit codewords for each concept or item: a “high-level,” or dense, word with many randomly placed 1s, and a “low-level,” or sparse, word with a few (that are contained in the many). The dense codewords can be used as inputs to an associative memory. The sparse codewords are used in encoding new concepts. When several items (attributes, concepts, chunks) are combined to form a new item, the two codewords for the new item are made from the sparse codewords of its constituents as follows: the new dense word is the logical OR of the constituents (i.e., their sum thresholded at 0.5), and the new sparse word has 1s where the constituent words overlap (i.e., their sum thresholded at 1.5). When the parameters of the code are chosen properly, the number of 1s in the codewords is maintained as new items are encoded from combinations of old ones.
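The combining rule stated in the abstract can be sketched in a few lines. The following Python fragment is a minimal illustration, not the paper's implementation: the codeword density of 0.5 for the low-level words and the use of three constituents are assumptions chosen so that the “at least two of three” overlap rule keeps the expected number of 1s stable, in the spirit of the abstract's remark about maintaining the 1s count.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000  # code dimension; the paper's example value


def random_low_level_word(density: float) -> np.ndarray:
    """Random N-bit 'low-level' (sparse) codeword with the given density of 1s."""
    return (rng.random(N) < density).astype(np.uint8)


def combine(low_level_words: list[np.ndarray]) -> tuple[np.ndarray, np.ndarray]:
    """Encode a new item from its constituents' low-level codewords, as in the abstract:
    - new dense word  = logical OR of the constituents (their sum thresholded at 0.5)
    - new sparse word = 1s where the constituents overlap (their sum thresholded at 1.5,
      i.e. at least two constituents have a 1 in that position).
    Returns (sparse, dense)."""
    s = np.sum(low_level_words, axis=0)
    dense = (s > 0.5).astype(np.uint8)
    sparse = (s > 1.5).astype(np.uint8)
    return sparse, dense


# Three attributes combined into one concept. A density of 0.5 is an
# illustrative choice (not taken from the paper): for three constituents,
# "at least two of three" again has probability 0.5, so the expected number
# of 1s in the new sparse word matches that of its constituents.
attributes = [random_low_level_word(0.5) for _ in range(3)]
new_sparse, new_dense = combine(attributes)
print("1s per attribute (avg):", np.mean([a.sum() for a in attributes]))
print("1s in new sparse word: ", new_sparse.sum())
print("1s in new dense word:  ", new_dense.sum())
```

Running the sketch shows the new sparse word holding roughly N/2 ones, like its constituents, while the dense (OR) word is denser, matching the abstract's description of the two codeword levels.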
Copyright information
© 1994 Springer-Verlag London Limited
About this paper
Cite this paper
Kanerva, P. (1994). The Spatter Code for Encoding Concepts at Many Levels. In: Marinaro, M., Morasso, P.G. (eds) ICANN ’94. ICANN 1994. Springer, London. https://doi.org/10.1007/978-1-4471-2097-1_52
DOI: https://doi.org/10.1007/978-1-4471-2097-1_52
Publisher Name: Springer, London
Print ISBN: 978-3-540-19887-1
Online ISBN: 978-1-4471-2097-1