Abstract
We propose a computational model for building a tactile body schema for a virtual human. The learned body structure enables the agent to acquire a perception of the space surrounding its body, namely its peripersonal space. The model uses tactile and proprioceptive information and relies on an algorithm originally applied to visual and proprioceptive sensor data. To feed the model, we present work on obtaining the necessary sensory data from touch sensors and the motor system alone. On this basis, we explain the learning process for a tactile body schema. Since the model is motivated not only technically but also by applications of peripersonal action space, an interaction example with a conversational agent is described.
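As a rough illustration of this kind of learning (a minimal sketch, not the authors' implementation: the planar two-joint arm, the sample counts, and the learning rate are all assumptions), the following estimates an agent's limb geometry purely from self-touch events. Proprioception supplies the joint angles of each posture, the touch sensor supplies the body-relative contact point, and the "body schema" here is the pair of link lengths that makes forward kinematics agree with the touch data:

```python
import math
import random

# Hypothetical setup: a planar 2-link arm whose true link lengths are
# unknown to the learner and must be recovered from tactile feedback.
TRUE_L = (0.30, 0.25)  # ground-truth link lengths used only to simulate data

def fingertip(l1, l2, q1, q2):
    """Forward kinematics of a planar 2-link arm."""
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y

# Simulated sensory data: random postures (proprioception) paired with the
# resulting self-touch contact points (tactile sensing).
random.seed(0)
samples = []
for _ in range(200):
    q1 = random.uniform(-math.pi, math.pi)
    q2 = random.uniform(-math.pi, math.pi)
    samples.append((q1, q2, fingertip(*TRUE_L, q1, q2)))

# Online gradient descent on the squared position error, one sample at a
# time, starting from a deliberately wrong body schema.
l1, l2 = 0.1, 0.1
eta = 0.05
for _ in range(50):  # epochs
    for q1, q2, (tx, ty) in samples:
        px, py = fingertip(l1, l2, q1, q2)
        ex, ey = px - tx, py - ty
        # Partial derivatives of the fingertip position w.r.t. l1 and l2.
        g1 = ex * math.cos(q1) + ey * math.sin(q1)
        g2 = ex * math.cos(q1 + q2) + ey * math.sin(q1 + q2)
        l1 -= eta * g1
        l2 -= eta * g2

print(round(l1, 3), round(l2, 3))  # converges toward the true lengths
```

Because the fingertip position is linear in the link lengths, this reduces to online linear regression and converges without noise; a full tactile body schema as described in the abstract would additionally have to handle contacts anywhere on the body surface and noisy sensor readings.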
© 2009 Springer-Verlag Berlin Heidelberg
Cite this paper
Nguyen, N., Wachsmuth, I. (2009). Modeling Peripersonal Action Space for Virtual Humans Using Touch and Proprioception. In: Ruttkay, Z., Kipp, M., Nijholt, A., Vilhjálmsson, H.H. (eds) Intelligent Virtual Agents. IVA 2009. Lecture Notes in Computer Science(), vol 5773. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04380-2_11
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-04379-6
Online ISBN: 978-3-642-04380-2