Abstract
Post-industrial society, shaped by the model of the Fourth Industrial Revolution, has been producing technological advances in advanced robotics, the Internet of Things (IoT), self-driving vehicles, the development of non-biological sentient life, artificial intelligence (AI), and machine learning or cognitive computing, among other fields. These developments suggest that, in the near future, humans and machines will play a joint role.
Multimodality is one of the most important challenges in the field of human-computer interaction (HCI), as it extends the sensorimotor capabilities of computer systems so that they can replicate the processes of natural communication between humans.
There is a need for increasingly intelligent interfaces, defined as those that promote naturalness while adding the benefits of adaptability, context awareness and support for task completion. There is also a growing desire for interface transparency, with the interface acting as a mediator that makes interactions truer and closer to reality. At the same time, current research has emphasized the importance of affect and emotion, particularly in the user's experience of computer systems.
This paper examines the importance of multimodality, naturalness and transparency for affective computing, and how they can contribute synergistically to the anthropomorphization of HCI. From this analysis, we derive design requirements for engineers, designers, IT professionals and other stakeholders involved in the development of multimodal interfaces.
© 2020 Springer Nature Switzerland AG
Cite this paper
Rafael, S. (2020). Multimodality, Naturalness and Transparency in Affective Computing for HCI. In: Marcus, A., Rosenzweig, E. (eds) Design, User Experience, and Usability. Interaction Design. HCII 2020. Lecture Notes in Computer Science(), vol 12200. Springer, Cham. https://doi.org/10.1007/978-3-030-49713-2_36
Print ISBN: 978-3-030-49712-5
Online ISBN: 978-3-030-49713-2
eBook Packages: Computer Science, Computer Science (R0)