Abstract
Most work on multimodal interaction in the human-computer interaction (HCI) space has focused on enabling a user to employ one or more modalities, in combination, to interact with a system. However, there is still a long way to go before human-to-machine communication is as rich and intuitive as human-to-human communication, in which modalities are used individually, simultaneously, interchangeably, or in combination. The choice of modalities depends on a variety of factors, including the context of the conversation, social distance, physical proximity, and duration. We believe such intuitive multimodal communication is the direction in which human-to-machine interaction is headed. In this paper, we present insights from a study of current human-machine interaction methods. We carried out an ethnographic study, observing users in their homes as they interacted with media and media devices, both alone and in small groups. A key learning from this study is an understanding of how the user's context influences the choice of interaction modalities. These contextual factors include, but are not limited to: the user's distance from the device or media, the user's body posture during the interaction, the user's level of involvement with the media, the seating pattern (cluster) of co-located participants, the role each participant plays, the notion of control among participants, and the duration of the activity. We believe the insights from this study can inform the design of next-generation multimodal interfaces that are sensitive to user context, interpret interaction inputs robustly, and support more human-like multimodal interaction.
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Vennelakanti, R., Subramanian, A., Madhvanath, S., Dey, P. (2013). Factors of Influence in Co-located Multimodal Interactions. In: Agrawal, A., Tripathi, R.C., Do, E.YL., Tiwari, M.D. (eds) Intelligent Interactive Technologies and Multimedia. IITM 2013. Communications in Computer and Information Science, vol 276. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-37463-0_21
Print ISBN: 978-3-642-37462-3
Online ISBN: 978-3-642-37463-0