
Multimodality, Naturalness and Transparency in Affective Computing for HCI

  • Conference paper
  • First Online:
Design, User Experience, and Usability. Interaction Design (HCII 2020)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 12200)

Abstract

Post-industrial society, aligned with the model of the Fourth Industrial Revolution, has been delivering technological advances in advanced robotics, the Internet of Things (IoT), self-driving vehicles, the development of non-biological sentient life, artificial intelligence (AI), machine learning and cognitive computing, among others. These advances point to a near future in which humans and machines will have a joint role to play.

Multimodality is one of the most important challenges in the field of human-computer interaction (HCI), as it extends the sensorimotor capabilities of computer systems so that they can replicate the processes of natural communication between humans.
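
To make the idea of combining modalities concrete, the following minimal sketch (an illustration added here, not part of the paper) shows one common pattern, late fusion by temporal alignment: a deictic spoken reference ("that") is resolved against a pointing gesture that occurred within a short time window, in the spirit of classic put-that-there interaction. All names, such as InputEvent and fuse_speech_and_pointing, are hypothetical.

```python
# Hypothetical sketch of late fusion of two input modalities by temporal
# alignment; names and values are illustrative assumptions, not from the paper.
from dataclasses import dataclass
from typing import Optional


@dataclass
class InputEvent:
    modality: str      # e.g. "speech" or "pointing"
    content: str       # recognized word or target object id
    timestamp: float   # seconds since the start of the interaction


def fuse_speech_and_pointing(speech: InputEvent,
                             pointing: InputEvent,
                             window: float = 1.0) -> Optional[str]:
    """Resolve a spoken "that" against a pointing gesture that occurred
    within `window` seconds of the utterance; return the pointed-at object."""
    if speech.content == "that" and abs(speech.timestamp - pointing.timestamp) <= window:
        return pointing.content
    return None


# Example: "delete that" spoken at t = 2.1 s while pointing at "photo_42" at t = 2.3 s
print(fuse_speech_and_pointing(InputEvent("speech", "that", 2.1),
                               InputEvent("pointing", "photo_42", 2.3)))  # -> photo_42
```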

There is a need to develop increasingly intelligent interfaces, defined as those that promote naturalness while adding the benefits of adaptability, fitness to context and support for task performance. There is also a growing desire for interface transparency as a mediator of HCI, so that interactions become truer and closer to reality. At the same time, current research has been emphasizing the importance of affect and emotion, particularly in the user’s experience with computer systems.
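
As a sketch of what affect-aware, adaptive interface behavior could look like in practice, the hypothetical rule-based policy below modulates a system message according to a detected emotion label and its recognition confidence; the labels, the 0.6 threshold and the function name are illustrative assumptions, not taken from the paper.

```python
# Hypothetical affect-adaptive response policy; a minimal sketch assuming an
# upstream recognizer that supplies an emotion label and a confidence score.
def adapt_response(base_message: str, emotion: str, confidence: float) -> str:
    """Adapt a system message to the user's detected affective state."""
    if confidence < 0.6:
        return base_message  # low confidence: fall back to neutral behavior
    if emotion == "frustrated":
        return ("It looks like something went wrong. " + base_message
                + " Would a step-by-step walkthrough help?")
    if emotion == "happy":
        return base_message + " Nice progress!"
    return base_message


print(adapt_response("Your file has been saved.", "frustrated", 0.8))
```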

This paper seeks to understand the importance of multimodality, naturalness and transparency for affective computing and how they can synergistically contribute to the anthropomorphization of HCI. Building on this understanding, we present design requirements for the development of multimodal interfaces, aimed at engineers, designers, IT professionals and other stakeholders.



Author information

Corresponding author

Correspondence to Sónia Rafael.



Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Rafael, S. (2020). Multimodality, Naturalness and Transparency in Affective Computing for HCI. In: Marcus, A., Rosenzweig, E. (eds.) Design, User Experience, and Usability. Interaction Design. HCII 2020. Lecture Notes in Computer Science, vol 12200. Springer, Cham. https://doi.org/10.1007/978-3-030-49713-2_36

  • DOI: https://doi.org/10.1007/978-3-030-49713-2_36

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-49712-5

  • Online ISBN: 978-3-030-49713-2

  • eBook Packages: Computer Science, Computer Science (R0)
