HCI and the Face: Towards an Art of the Soluble

  • Christoph Bartneck
  • Michael J. Lyons
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4550)


The human face plays a central role in most forms of natural human interaction, so we may expect that computational methods for the analysis of facial information, together with graphical and robotic methods for the synthesis of faces and facial expressions, will play a growing role in human-computer and human-robot interaction. However, certain areas of face-based HCI, such as facial expression recognition and robotic facial display, have lagged behind others, such as eye-gaze tracking, face recognition, and conversational characters. Our goal in this paper is to review the situation in HCI with regard to the human face, and to discuss strategies that could bring the more slowly developing areas up to speed.


Keywords: face, HCI, soluble, recognition, synthesis





Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Christoph Bartneck (1)
  • Michael J. Lyons (2)
  1. Department of Industrial Design, Eindhoven University of Technology, Den Dolech 2, 5600 MB Eindhoven, The Netherlands
  2. ATR Intelligent Robotics and Communication Labs, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan
