Interaction between Speech and Gesture: Strategies for Pointing to Distant Objects

  • Conference paper
Gesture and Sign Language in Human-Computer Interaction and Embodied Communication (GW 2011)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 7206)

Abstract

Referring to objects using multimodal deictic expressions is an important form of communication. This work addresses the question of how pragmatic factors affect the distribution of content between the modalities of speech and gesture. It does so by analyzing a study of deictic pointing gestures to objects under two conditions: with and without speech. The relevant pragmatic factor was the distance to the referent object. As one main result, two strategies were identified that participants used to adapt their gestures to the condition. This knowledge can be used, for example, to improve the naturalness of pointing gestures produced by embodied conversational agents.

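To illustrate how such a finding could be applied in practice, the following is a minimal sketch of distance-dependent deictic planning for an embodied conversational agent. The distance threshold, the strategy names, and the plan_deictic_act function are hypothetical and are not taken from the paper; the sketch only conveys the general idea of shifting referential content between gesture and speech as the referent gets farther away.

```python
from dataclasses import dataclass

# Hypothetical strategy labels; the abstract does not name the strategies found in the study.
PRECISE_POINTING = "precise_pointing"        # rely mostly on the gesture itself
GESTURE_PLUS_SPEECH = "gesture_plus_speech"  # coarser gesture, richer verbal description

# Assumed threshold (metres) beyond which a pointing gesture alone is unlikely
# to single out the referent; a real agent would calibrate this empirically.
DISTANCE_THRESHOLD_M = 1.5


@dataclass
class DeicticPlan:
    strategy: str
    gesture: str
    utterance: str


def plan_deictic_act(distance_m: float, speech_available: bool, label: str) -> DeicticPlan:
    """Choose how to distribute referential content between gesture and speech."""
    if distance_m <= DISTANCE_THRESHOLD_M or not speech_available:
        # Near referent (or speech unavailable): let the gesture carry most of the load.
        return DeicticPlan(
            PRECISE_POINTING,
            gesture="point_precisely",
            utterance="This one." if speech_available else "",
        )
    # Distant referent with speech available: coarser gesture plus a fuller description.
    return DeicticPlan(
        GESTURE_PLUS_SPEECH,
        gesture="point_towards_region",
        utterance=f"The {label} over there.",
    )


if __name__ == "__main__":
    print(plan_deictic_act(0.8, True, "red cube"))
    print(plan_deictic_act(4.2, True, "red cube"))
```
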

Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Pfeiffer, T. (2012). Interaction between Speech and Gesture: Strategies for Pointing to Distant Objects. In: Efthimiou, E., Kouroupetroglou, G., Fotinea, S.-E. (eds) Gesture and Sign Language in Human-Computer Interaction and Embodied Communication. GW 2011. Lecture Notes in Computer Science, vol 7206. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-34182-3_22

  • DOI: https://doi.org/10.1007/978-3-642-34182-3_22

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-34181-6

  • Online ISBN: 978-3-642-34182-3
