A Neural Dynamic Architecture Resolves Phrases about Spatial Relations in Visual Scenes

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8681)


Spatial language is important to both cognitive science and robotics, yet how neural processes map spatial phrases onto real-world scenes is not understood. We present an autonomous neural dynamic architecture that achieves this mapping flexibly. Neural activation fields represent and spatially transform perceptual information. An architecture of dynamic nodes interacts with these perceptual fields to instantiate categorical concepts. Discrete processing steps emerge from instabilities of the time-continuous neural dynamics and are organized sequentially by these nodes. These steps include attentionally selecting individual objects in a scene, mapping their locations into an object-centered reference frame, and evaluating the match to relational spatial terms. Queries are posed to the architecture by setting the state of discrete nodes; it then autonomously generates a response based on visual input about the scene.


Keywords: spatial language, sequence generation, autonomy, neural dynamics, Dynamic Field Theory





Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

1. Institut für Neuroinformatik, Ruhr-Universität Bochum, Bochum, Germany
