
Computers and the Humanities, Volume 25, Issue 2–3, pp 129–140

Integration of communicative hand movements into human-computer-interaction

  • Dagmar Schmauks
  • Michael Wille

Abstract

During face-to-face communication, the dialog partners can see and hear each other. Each speaker produces a variety of phenomena parallel to speech. Some of them, e.g. intonation, are coded vocally; others are coded by motor responses (facial expression, gestures, etc.). If human-computer interaction (HCI) tries to mimic this situation, at least some non-verbal phenomena have to be integrated into natural language input and output. A multitude of new devices (mouse, joystick, touch screen, etc.) has enabled this transition to multimodal HCI. Gestures which illustrate the content of the verbal message are especially suitable for integration into HCI. A relevant subset of these are pointing gestures, which specify elements of the visual context. They are performed frequently because they shorten and simplify the verbal output. As an illustration of these considerations, the natural language dialog system XTRA (University of Saarbrücken) is presented. It allows users to refer to elements of a tax form by combining textual input with simulated pointing gestures. In order to explore the regularities of this “form deixis,” an experiment was carried out within the framework of the XTRA project. Its results were also used to evaluate the currently employed simulation technique.
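To make the idea of combining a pointing gesture with a verbal description concrete, the following minimal Python sketch shows one plausible way a simulated pointing gesture (a screen coordinate) and a noun-phrase type constraint could jointly select a field of a form. It is purely illustrative: the region names, the `Region` class, and the `resolve_referent` function are invented here and do not describe the actual XTRA implementation.

```python
# Illustrative sketch of multimodal referent identification ("form deixis"):
# an imprecise pointing gesture plus a verbal category jointly pick one form field.
# All names and data are hypothetical, not XTRA's API.
from dataclasses import dataclass


@dataclass
class Region:
    name: str   # label of a form field, e.g. "gross income"
    kind: str   # verbal category the user may refer to ("amount", "name", ...)
    x0: float
    y0: float
    x1: float
    y1: float

    def distance_to(self, x: float, y: float) -> float:
        """0.0 if the point lies inside the region, else distance to its border."""
        dx = max(self.x0 - x, 0.0, x - self.x1)
        dy = max(self.y0 - y, 0.0, y - self.y1)
        return (dx * dx + dy * dy) ** 0.5


def resolve_referent(regions, point, verbal_kind=None):
    """Return the region closest to the pointing gesture that also matches the
    verbal description, so the noun phrase disambiguates an imprecise gesture."""
    candidates = [r for r in regions if verbal_kind is None or r.kind == verbal_kind]
    if not candidates:
        return None
    return min(candidates, key=lambda r: r.distance_to(*point))


# Usage: the utterance "this amount" with a gesture landing between two fields.
form = [
    Region("employer name", "name", 10, 10, 200, 30),
    Region("gross income", "amount", 10, 40, 200, 60),
    Region("tax withheld", "amount", 10, 70, 200, 90),
]
print(resolve_referent(form, (105, 62), verbal_kind="amount").name)  # -> gross income
```

In this toy setup the gesture alone is ambiguous (it falls near two fields), but the verbal constraint "amount" excludes the name field and proximity then selects "gross income", illustrating why pointing shortens the verbal output without requiring perfect precision.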

Michael Wille is a researcher in the AI laboratory at the University of Saarbrücken. He has studied computer science, economics, and cognitive psychology, and has worked on expert systems for hardware diagnosis at SIEMENS. His master's thesis (1989) was titled “Evaluation and Extension of a Module for the Simulation and Analysis of Pointing Gestures.” His main research interest is multimedia interaction.

Key Words

natural language interfaces, multimodal HCI, referent specification, deixis, gestures, referent identification



Copyright information

© Kluwer Academic Publishers 1991

Authors and Affiliations

  • Dagmar Schmauks (1)
  • Michael Wille (1)

  1. Sonderforschungsbereich 314, FB 14 - Informatik, Universität des Saarlandes, Saarbrücken 11, Germany
