AutoSelect: What You Want Is What You Get: Real-Time Processing of Visual Attention and Affect

  • Nikolaus Bee
  • Helmut Prendinger
  • Arturo Nakasone
  • Elisabeth André
  • Mitsuru Ishizuka
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4021)


While attending to objects in our visual field (“what we are looking at”) and responding affectively to those objects are part of our daily experience, little research has investigated the relation between attention and positive affective evaluation. The purpose of our research is to process users’ emotion and attention in real time, with the goal of designing systems that can recognize a user’s affective response to a particular visually presented stimulus in the presence of other stimuli, and respond accordingly. In this paper, we introduce the AutoSelect system, which automatically detects a user’s preference based on eye movement data and physiological signals in a two-alternative forced choice task. In an exploratory study involving the selection of neckties, the system correctly classified subjects’ choices in 81% of cases. In this instance of AutoSelect, the gaze ‘cascade effect’ played a dominant role, whereas pupil size could not be shown to be a reliable predictor of preference.
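The gaze ‘cascade effect’ mentioned above refers to the finding (Shimojo et al., 2003) that in a two-alternative forced choice task, gaze progressively shifts toward the item that is eventually chosen. A minimal sketch of how such a gaze bias could be turned into a preference prediction is shown below; note that this is an illustrative heuristic, not the authors’ actual AutoSelect classifier, which additionally fuses physiological signals (the function name, window size, and 'L'/'R' labels are assumptions for the example).

```python
def classify_preference(fixations, window=10):
    """Hypothetical gaze-cascade heuristic (not the paper's algorithm).

    fixations: chronological list of 'L'/'R' labels, one per gaze sample,
    indicating which of the two stimuli was fixated.
    Returns the side that dominates the final `window` samples, since the
    cascade effect predicts gaze converges on the to-be-chosen item.
    """
    tail = fixations[-window:]          # only the latest gaze samples matter
    left = tail.count('L')
    right = len(tail) - left
    return 'L' if left >= right else 'R'
```

In a real-time setting, such a function would be re-evaluated on each new gaze sample, with a confidence threshold (e.g. a minimum left/right imbalance) before committing to a selection.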


Skin Conductance · Pupil Size · Skin Conductance Response · Blood Volume Pulse · Kansei Engineering
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.




  1. [Andreassi, 2000]
    Andreassi, J.L.: Psychophysiology. Human Behavior & Physiological Response, 4th edn. Lawrence Erlbaum Associates, Mahwah (2000)
  2. [Bechara et al., 1997]
    Bechara, A., Damasio, H., Tranel, D., Damasio, A.R.: Deciding advantageously before knowing the advantageous strategy. Science 275, 1293–1295 (1997)
  3. [Busso et al., 2004]
    Busso, C., Deng, Z., Yildirim, S., Bulut, M., Lee, C.M., Kazemzadeh, A., Lee, S., Neumann, U., Narayanan, S.: Analysis of emotion recognition using facial expressions, speech and multimodal information. In: Proceedings of 6th International Conference on Multimodal Interfaces (ICMI 2004), pp. 205–211. ACM Press, New York (2004)
  4. [Duchowski, 2003]
    Duchowski, A.T.: Eye Tracking Methodology: Theory and Practice. Springer, London (2003)
  5. [Ekman, 1992]
    Ekman, P.: An argument for basic emotions. Cognition and Emotion 6(3–4), 169–200 (1992)
  6. [Ekman et al., 1983]
    Ekman, P., Levenson, R.W., Friesen, W.V.: Autonomic nervous system activity distinguishes among emotions. Science 221, 1208–1210 (1983)
  7. [Feldman-Barrett, 2006]
    Feldman-Barrett, L.: Emotions as natural kinds? Perspectives on Psychological Science 1, 28–58 (2006)
  8. [Healey, 2000]
    Healey, J.A.: Wearable and Automotive Systems for Affect Recognition from Physiology. PhD thesis, Massachusetts Institute of Technology (2000)
  9. [Hess, 1972]
    Hess, E.H.: Pupillometrics: A method of studying mental, emotional and sensory processes. In: Greenfield, N., Sternbach, R. (eds.) Handbook of Psychophysiology, pp. 491–531. Holt, Rinehart & Winston, New York (1972)
  10. [Kim et al., 2005]
    Kim, J., André, E., Rehm, M., Vogt, T., Wagner, J.: Integrating information from speech and physiological signals to achieve emotional sensitivity. In: Proceedings of 9th European Conference on Speech Communication and Technology (2005)
  11. [Kon et al., 2006]
    Kon, M., Koshizen, T., Prendinger, H.: A new user–machine interface using cross-modal computation for deep interest estimation. Towards quantifying user satisfaction. In: Proceedings of IUI 2006 Workshop on Effective Multimodal Dialogue Interfaces, pp. 25–34 (2006)
  12. [Krugman, 1964]
    Krugman, H.: Some applications of pupil measurement. Journal of Marketing Research 1, 15–19 (1964)
  13. [Lang, 1995]
    Lang, P.J.: The emotion probe: Studies of motivation and attention. American Psychologist 50(5), 372–385 (1995)
  14. [Levenson, 1988]
    Levenson, R.W.: Emotion and the autonomic nervous system: A prospectus for research on autonomic specificity. In: Wagner, H.L. (ed.) Social Psychophysiology and Emotion: Theory and Clinical Applications, pp. 17–42. John Wiley & Sons, Hoboken (1988)
  15. [Norsys, 2003]
    Norsys Software Corp.: Netica (2003)
  16. [Picard, 1997]
    Picard, R.W.: Affective Computing. The MIT Press, Cambridge (1997)
  17. [Prendinger and Ishizuka, 2004]
    Prendinger, H., Ishizuka, M. (eds.): Life-Like Characters. Tools, Affective Functions, and Applications. Cognitive Technologies. Springer, Heidelberg (2004)
  18. [Prendinger and Ishizuka, 2005]
    Prendinger, H., Ishizuka, M.: The Empathic Companion: A character-based interface that addresses users’ affective states. International Journal of Applied Artificial Intelligence 19(3), 267–285 (2005)
  19. [Schultheis and Jameson, 2004]
    Schultheis, H., Jameson, A.: Assessing cognitive load in adaptive hypermedia systems: Physiological and behavioral methods. In: De Bra, P.M.E., Nejdl, W. (eds.) AH 2004. LNCS, vol. 3137, pp. 225–234. Springer, Heidelberg (2004)
  20. [Seeing Machines, 2005]
    Seeing Machines (2005)
  21. [Shimojo et al., 2003]
    Shimojo, S., Simion, C., Shimojo, E., Scheier, C.: Gaze bias both reflects and influences preference. Nature Neuroscience 6(12), 1317–1322 (2003)
  22. [Simion, 2005]
    Simion, C.: Orienting and Preference: An Enquiry into the Mechanisms Underlying Emotional Decision Making. PhD thesis, California Institute of Technology (2005)
  23. [Streitz and Nixon, 2005]
    Streitz, N., Nixon, P.: The Disappearing Computer. Guest editors’ introduction to Special Issue. Communications of the ACM 48, 33–35 (2005)
  24. [, 2006]
    Kansei engineering: Incorporating affection and emotion into the design process (2006)
  25. [Thought Technology, 2005]
    Thought Technology Ltd. (2005)
  26. [Wahlster, 2003]
    Wahlster, W.: Towards symmetric multimodality: Fusion and fission of speech, gesture and facial expression. In: Günter, A., Kruse, R., Neumann, B. (eds.) KI 2003. LNCS (LNAI), vol. 2821, pp. 1–18. Springer, Heidelberg (2003)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Nikolaus Bee (1)
  • Helmut Prendinger (2)
  • Arturo Nakasone (3)
  • Elisabeth André (1)
  • Mitsuru Ishizuka (3)

  1. Institute of Computer Science, University of Augsburg, Augsburg, Germany
  2. National Institute of Informatics, Tokyo, Japan
  3. Graduate School of Information Science and Technology, University of Tokyo, Tokyo, Japan
