
Real-time system for measuring gaze direction and facial features: towards automatic discrimination of lies using diverse nonverbal information

Original Article, AI & SOCIETY

Abstract

Interactive, autonomous agents may become common in everyday life, and we expect such agents to communicate with people naturally. Natural communication requires an agent to infer the intentions of the people it interacts with. To enable agents to infer intentions such as deception, we focused on the unconscious expressions people produce when they tell a lie. However, no existing system meets the conditions necessary for measuring such nonverbal information during natural communication. We therefore built a real-time system for measuring gaze direction and facial features. Using this system, we conducted lie-discrimination experiments in a situation resembling actual communication. The results show that lies can be discriminated from diverse nonverbal information, much as people discriminate them.
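To make the discrimination step concrete, the sketch below illustrates one way per-frame nonverbal measurements (gaze direction, blinks, facial movement) could be summarized per answer and fed to a simple classifier. This is a minimal illustration only: the feature names, the scikit-learn SVM, and the synthetic data are assumptions, not the system or analysis reported in the article.

```python
# Hypothetical sketch (not the authors' implementation): summarize per-frame
# nonverbal measurements into one feature vector per answer and train a
# simple classifier to separate truthful from deceptive answers.
# The feature set, the scikit-learn SVM, and the synthetic data are assumptions.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def utterance_features(frames: np.ndarray) -> np.ndarray:
    """Collapse per-frame cues (e.g. gaze x/y, blink, facial movement)
    for one answer into their means and standard deviations."""
    return np.concatenate([frames.mean(axis=0), frames.std(axis=0)])

rng = np.random.default_rng(0)
# 40 answers, each with 90 frames of 4 synthetic nonverbal cues.
X = np.stack([utterance_features(rng.normal(size=(90, 4))) for _ in range(40)])
y = np.array([0, 1] * 20)  # 0 = truthful answer, 1 = lie (toy labels)

clf = SVC(kernel="rbf")  # small-sample-friendly nonlinear classifier
print(cross_val_score(clf, X, y, cv=5))  # chance-level scores on random data
```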



Acknowledgments

This research was partially supported by the Ministry of Education, Culture, Sports, Science and Technology through a Grant-in-Aid for Creative Scientific Research (No. 13GS0003) and Scientific Research on Priority Areas (No. 04560012).

Author information


Corresponding author

Correspondence to Yoshimasa Ohmoto.


About this article

Cite this article

Ohmoto, Y., Ueda, K. & Ohno, T. Real-time system for measuring gaze direction and facial features: towards automatic discrimination of lies using diverse nonverbal information. AI & Soc 23, 187–200 (2009). https://doi.org/10.1007/s00146-007-0138-x

