AutoSelect: What You Want Is What You Get: Real-Time Processing of Visual Attention and Affect

Conference paper in: Perception and Interactive Technologies (PIT 2006)

Abstract

While the objects of our focus of attention (“what we are looking at”) and the accompanying affective responses to those objects are part of our daily experience, little research has investigated the relation between attention and positive affective evaluation. The purpose of our research is to process users’ emotion and attention in real time, with the goal of designing systems that may recognize a user’s affective response to a particular visually presented stimulus in the presence of other stimuli, and respond accordingly. In this paper, we introduce the AutoSelect system, which automatically detects a user’s preference based on eye movement data and physiological signals in a two-alternative forced choice task. In an exploratory study involving the selection of neckties, the system correctly classified subjects’ choices in 81% of cases. In this instance of AutoSelect, the gaze ‘cascade effect’ played a dominant role, whereas pupil size could not be shown to be a reliable predictor of preference.
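
The abstract does not spell out the classifier itself, but a minimal sketch of a gaze-cascade-style preference detector for a two-alternative task could look like the following: it predicts whichever item dominates gaze in the final moments before the decision. The names `GazeSample` and `predict_preference`, the one-second window, and the region labels are illustrative assumptions, not the authors’ implementation (which also incorporates physiological signals).

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    t: float      # timestamp in seconds
    region: str   # "left", "right", or "away": which item, if any, is fixated

def predict_preference(samples, window=1.0):
    """Predict the preferred of two items from the gaze bias in the last
    `window` seconds before the decision, following the gaze cascade idea:
    gaze is increasingly drawn toward the item that will be chosen."""
    if not samples:
        return None
    t_end = samples[-1].t
    recent = [s for s in samples if s.t >= t_end - window]
    left = sum(s.region == "left" for s in recent)
    right = sum(s.region == "right" for s in recent)
    if left == right:
        return None  # no measurable bias; abstain rather than guess
    return "left" if left > right else "right"

# Toy trace: gaze drifts toward the right-hand item before the choice.
trace = [GazeSample(0.1 * i, r) for i, r in enumerate(
    ["left", "left", "right", "left", "right", "right", "right", "right"])]
print(predict_preference(trace))  # -> right
```

The sketch deliberately omits the physiological channel: per the abstract, the gaze cascade effect carried most of the predictive power in the reported study, while pupil size was not shown to be reliable.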




Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Bee, N., Prendinger, H., Nakasone, A., André, E., Ishizuka, M. (2006). AutoSelect: What You Want Is What You Get: Real-Time Processing of Visual Attention and Affect. In: André, E., Dybkjær, L., Minker, W., Neumann, H., Weber, M. (eds) Perception and Interactive Technologies. PIT 2006. Lecture Notes in Computer Science, vol 4021. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11768029_5

  • DOI: https://doi.org/10.1007/11768029_5

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-34743-9

  • Online ISBN: 978-3-540-34744-6
