Abstract
The 1990s saw the emergence of cognitive models that depend on very high dimensionality and randomness. They include Holographic Reduced Representations, Spatter Code, Semantic Vectors, Latent Semantic Analysis, Context-Dependent Thinning, and Vector-Symbolic Architecture. They represent things by high-dimensional vectors that are manipulated by operations producing new high-dimensional vectors, in the style of traditional computing, in what is called here hyperdimensional computing on account of the very high dimensionality. The paper presents the main ideas behind these models, written as a tutorial essay in hopes of making the ideas accessible and even provocative. A sketch of how we have arrived at these models, with references and pointers to further reading, is given at the end. The thesis of the paper is that hyperdimensional representation has much to offer to students of cognitive science, theoretical neuroscience, computer science and engineering, and mathematics.
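The operations the abstract alludes to can be made concrete with the binary Spatter Code it names: binding by elementwise XOR, bundling by bitwise majority vote, and similarity by Hamming distance. The following is a minimal illustrative sketch, not code from the paper; the dimensionality, the record layout, and all variable names are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # "hyperdimensional": very high dimensionality

def rand_hv():
    """A random dense binary hypervector; two such vectors are nearly orthogonal."""
    return rng.integers(0, 2, D, dtype=np.uint8)

def bind(a, b):
    """Binding by elementwise XOR; self-inverse, so bind(bind(a, b), b) == a."""
    return a ^ b

def bundle(*vs):
    """Bundling an odd number of vectors by bitwise majority vote."""
    return (np.sum(vs, axis=0) > len(vs) // 2).astype(np.uint8)

def hamming(a, b):
    """Normalized Hamming distance: about 0.5 for unrelated random vectors."""
    return float(np.mean(a != b))

# Encode a two-field record as a single vector of the same dimensionality,
# then query one field back (hypothetical role/filler names).
NAME, AGE, alice, age37 = rand_hv(), rand_hv(), rand_hv(), rand_hv()
record = bundle(bind(NAME, alice), bind(AGE, age37), rand_hv())
noisy = bind(record, NAME)  # unbinding yields an approximate copy of `alice`
```

The unbound result is noisy (about a quarter of its bits flipped) but far closer to `alice` than to any unrelated vector, so a clean-up memory holding the known vectors can recover it exactly; this noise tolerance is a hallmark of the high dimensionality.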
Acknowledgements
Real World Computing Project funding by Japan’s Ministry of International Trade and Industry to the Swedish Institute of Computer Science in 1994–2001 made it possible for us to develop the ideas for high-dimensional binary representation. The support of Dr. Nobuyuki Otsu throughout the project was most valuable. Dr. Dmitri Rachkovskij provided information on the early use of permutations to encode sequences by researchers in Ukraine. Dikran Karagueuzian of CSLI Publications accepted for publication Plate’s book on Holographic Reduced Representation after a publishing agreement elsewhere fell through. Discussions with Tony Plate and Ross Gayler have helped shape the ideas and their presentation here. Sincere thanks to you all, as well as to my coauthors on papers on representation and to three anonymous reviewers of the manuscript.
Kanerva, P. Hyperdimensional Computing: An Introduction to Computing in Distributed Representation with High-Dimensional Random Vectors. Cogn Comput 1, 139–159 (2009). https://doi.org/10.1007/s12559-009-9009-8