
KI - Künstliche Intelligenz

Volume 33, Issue 4, pp 319–330

An Introduction to Hyperdimensional Computing for Robotics

  • Peer Neubert
  • Stefan Schubert
  • Peter Protzel
Technical Contribution

Abstract

Hyperdimensional computing combines very high-dimensional vector spaces (e.g. 10,000 dimensional) with a set of carefully designed operators to perform symbolic computations with large numerical vectors. The goal is to exploit their representational power and noise robustness for a broad range of computational tasks. Although there are surprising and impressive results in the literature, the application to practical problems in the area of robotics is so far very limited. In this work, we aim to provide an accessible introduction to the underlying mathematical concepts and describe the existing computational implementations in the form of vector symbolic architectures (VSAs). This is accompanied by references to existing applications of VSAs in the literature. To bridge the gap to practical applications, we describe and experimentally demonstrate the application of VSAs to three different robotic tasks: viewpoint invariant object recognition, place recognition, and learning of simple reactive behaviors. The paper closes with a discussion of current limitations and open questions.
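The core idea sketched in the abstract (symbolic computation with carefully designed operators on high-dimensional vectors) can be illustrated with a minimal Python example. This is a generic sketch of a bipolar VSA, not the paper's specific implementation: it assumes random bipolar hypervectors, elementwise multiplication as the binding operator, elementwise majority as bundling, and cosine similarity for comparison; the role/filler names (`color`, `red`, `shape`, `ball`) are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)
d = 10_000  # typical hyperdimensional vector size

def random_hv():
    """Random bipolar hypervector; random pairs are nearly orthogonal."""
    return rng.choice([-1, 1], size=d)

def bind(a, b):
    """Binding (elementwise multiply): the result is dissimilar to both inputs."""
    return a * b

def bundle(*vs):
    """Bundling (elementwise majority vote): the result stays similar to each input."""
    return np.sign(np.sum(vs, axis=0))

def sim(a, b):
    """Cosine similarity between two hypervectors."""
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Encode the record {color: red, shape: ball} as a single hypervector.
color, red, shape, ball = (random_hv() for _ in range(4))
record = bundle(bind(color, red), bind(shape, ball))

# Query: unbind the 'color' role. For bipolar vectors, binding is its
# own inverse, so the result is close to 'red' despite the bundling noise.
query = bind(record, color)
print(sim(query, red))   # clearly above chance
print(sim(query, ball))  # near 0
```

The noise robustness mentioned in the abstract shows up here directly: even though `record` superimposes two bound pairs (and `np.sign` zeroes out dimensions where they cancel), the unbound query remains easily distinguishable from unrelated vectors by similarity alone.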

Keywords

Hyperdimensional computing · Vector symbolic architectures · Robotics


Copyright information

© Gesellschaft für Informatik e.V. and Springer-Verlag GmbH Germany, part of Springer Nature 2019

Authors and Affiliations

  1. Chemnitz University of Technology, Chemnitz, Germany
