Usability Study of Multi-modal Interfaces Using Eye-Tracking

  • Regina Bernhaupt
  • Philippe Palanque
  • Marco Winckler
  • David Navarre
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4663)

Abstract

The promise of multimodal interaction to make interaction more natural, less error-prone, and more enjoyable has been controversially discussed in the HCI community. On the one hand, multimodal interaction is being adopted in fields ranging from entertainment to safety-critical applications; on the other hand, new forms of interaction techniques (including two-handed interaction and speech) are still not in widespread use. In this paper we present results from a usability evaluation study, including eye-tracking, of how interaction combining two mice and speech is adopted by users. Our results show evidence that two mice and speech can be adopted naturally by users. In addition, we discuss how eye-tracking data helps to explain the advantages of two-handed interaction and speech.

Keywords

Multimodal interfaces, usability evaluation method, two mice, speech

Copyright information

© IFIP International Federation for Information Processing 2007

Authors and Affiliations

  • Regina Bernhaupt (1)
  • Philippe Palanque (2)
  • Marco Winckler (2)
  • David Navarre (2)

  1. ICT&S Center, Universität Salzburg, Sigmund-Haffner-Gasse 18, 5020 Salzburg, Austria
  2. IRIT, Université Toulouse III, France
