
Journal on Multimodal User Interfaces, Volume 3, Issue 3, pp 157–165

Distinctive aspects of mobile interaction and their implications for the design of multimodal interfaces

Luca Chittaro
Original Paper

Abstract

People want to do more with their mobile phones, but their desire is frustrated by two classes of limitations. The first concerns the device itself, its hardware and software. The second concerns the context of use and comprises perceptual, motor, cognitive, and social aspects. This paper discusses some of the opportunities and challenges that this complex scenario presents for multimodality, which can be a key factor in designing better mobile interfaces that help people do more with their mobile phones while demanding less time and attention.

Keywords

Mobile devices · Multimodal interfaces · Attention · Cognitive workload mitigation · Human factors · Human-computer interaction



Copyright information

© OpenInterface Association 2010

Authors and Affiliations

HCI Lab, Dept. of Math and Computer Science, University of Udine, Udine, Italy
