A Toolkit for Multimodal Interface Design: An Empirical Investigation

  • Dimitrios Rigas
  • Mohammad Alsuraihi
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4552)

Abstract

This paper presents a comparative multi-group study investigating the use of multimodal interaction metaphors (visual, oral, and aural) to improve the learnability (usability on first-time use) of interface-design environments. An initial survey gathered views on the effectiveness of, and satisfaction with, employing speech and speech recognition to address common usability problems. The investigation then empirically tested the usability parameters of efficiency, effectiveness, and satisfaction across three design toolkits (TVOID, OFVOID, and MMID) built specifically for the study. TVOID and OFVOID interacted with the user visually only, using typical and time-saving interaction metaphors. The third environment, MMID, added a further modality through vocal and aural interaction. The results showed that using vocal commands and the mouse concurrently to complete tasks on first-time use was more efficient and more effective than using visual-only interaction metaphors.
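The toolkits themselves are not shown in code on this page, but the idea of pairing spoken commands with conventional mouse-driven editing can be illustrated with a short sketch. The following Python example is a minimal, hypothetical illustration, not the paper's TVOID/OFVOID/MMID implementation: it assumes the third-party packages SpeechRecognition and pyttsx3, and the command vocabulary and handler functions are invented for illustration only.

    # Minimal sketch of a voice-command dispatcher for a design editor.
    # NOT the paper's MMID toolkit: the command names and handlers are
    # hypothetical, and the speech back-ends (SpeechRecognition, pyttsx3)
    # are assumptions made for this illustration.
    import speech_recognition as sr
    import pyttsx3

    tts = pyttsx3.init()          # text-to-speech engine for aural feedback

    COMMANDS = {                  # hypothetical spoken-command vocabulary
        "new button": lambda: print("placing a button at the cursor"),
        "new label":  lambda: print("placing a label at the cursor"),
        "undo":       lambda: print("undoing the last action"),
    }

    def speak(text: str) -> None:
        """Read feedback aloud so the user need not look away from the canvas."""
        tts.say(text)
        tts.runAndWait()

    def listen_once(recognizer: sr.Recognizer) -> None:
        """Capture one utterance and dispatch it as an editor command."""
        with sr.Microphone() as source:
            audio = recognizer.listen(source, phrase_time_limit=3)
        try:
            phrase = recognizer.recognize_google(audio).lower()
        except sr.UnknownValueError:
            speak("Command not recognised")
            return
        action = COMMANDS.get(phrase)
        if action is None:
            speak("No command named " + phrase)
        else:
            action()              # mouse position would supply the coordinates
            speak("Done: " + phrase)

    if __name__ == "__main__":
        listen_once(sr.Recognizer())

In a setup of this kind the mouse continues to select and position widgets while speech replaces menu and toolbar traversal, which is the concurrent use of modalities whose efficiency and effectiveness the study measures.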

Keywords

interface design, usability, learnability, effectiveness, efficiency, satisfaction, visual, oral, aural, multimodal, auditory icons, earcons, speech, text-to-speech, speech recognition, voice instruction

Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Dimitrios Rigas (1)
  • Mohammad Alsuraihi (1)
  1. School of Informatics, University of Bradford, Richmond Road, Bradford, UK