AI & SOCIETY, Volume 23, Issue 2, pp 187–200

Real-time system for measuring gaze direction and facial features: towards automatic discrimination of lies using diverse nonverbal information

Original Article

Abstract

Interactive and autonomous agents may become common in everyday life, and we expect such agents to communicate with people naturally. Natural communication requires that an agent infer the intentions of the people it interacts with. To enable agents to infer intentions such as deception, we focused on the unconscious expressions people show when they tell a lie. However, no existing system meets the conditions necessary for measuring nonverbal information during natural communication. We therefore developed a real-time system for measuring gaze direction and facial features, and used it in experiments on lie discrimination in a situation resembling actual communication. The results show that lies can be discriminated from diverse nonverbal information, much as people discriminate them.
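
The abstract and the keyword list below point to a discriminant function computed over nonverbal features, but this page gives no implementation details. As an illustration only, here is a minimal sketch of training a linear discriminant on hypothetical per-utterance features loosely named after the keywords (gaze deviation, dark-pixel ratio in the eye region, pitch variability); the feature definitions and the synthetic data are assumptions, not the authors' actual method.

    # Illustrative sketch only: a linear discriminant over hypothetical
    # nonverbal features; not the authors' actual feature set or data.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)

    # Synthetic per-utterance feature vectors, loosely named after the keywords:
    # [gaze deviation (deg), dark-pixel ratio (eye region), pitch variance]
    truthful = rng.normal([2.0, 0.30, 15.0], [1.0, 0.05, 4.0], size=(50, 3))
    deceptive = rng.normal([4.5, 0.38, 22.0], [1.2, 0.06, 5.0], size=(50, 3))

    X = np.vstack([truthful, deceptive])
    y = np.array([0] * 50 + [1] * 50)  # 0 = truthful, 1 = deceptive

    # Fit the discriminant function and report (optimistic) training accuracy.
    lda = LinearDiscriminantAnalysis()
    lda.fit(X, y)
    print("discriminant coefficients:", lda.coef_)
    print("training accuracy:", lda.score(X, y))

A study like the one described would evaluate the discrimination ratio on held-out trials rather than on the training set; the numbers above exist only to make the example runnable.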

Keywords

Discriminant function, Facial feature, Discrimination ratio, Pitch variable, Dark pixel

Notes

Acknowledgments

This research was partially supported by the Ministry of Education, Culture, Sports, Science and Technology through a Grant-in-Aid for Creative Scientific Research (No. 13GS0003) and Scientific Research on Priority Areas (No. 04560012).


Copyright information

© Springer-Verlag London Limited 2007

Authors and Affiliations

  • Yoshimasa Ohmoto (1)
  • Kazuhiro Ueda (1)
  • Takehiko Ohno (2)

  1. Department of General System Studies, Graduate School of Arts and Sciences, The University of Tokyo, Tokyo, Japan
  2. NTT Department II, NTT Corporation, Tokyo, Japan
