
Interactive design of multimodal user interfaces

Reducing technical and visual complexity

Original Paper · Journal on Multimodal User Interfaces

Abstract

In contrast to the pioneers of multimodal interaction, e.g. Richard Bolt in the late seventies, today’s researchers can benefit from various existing hardware devices and software toolkits. Although these development tools are available, using them is still a great challenge, particularly in terms of their usability and their appropriateness to the actual design and research process. We present a three-part approach to supporting interaction designers and researchers in designing, developing, and evaluating novel interaction modalities including multimodal interfaces. First, we present a software architecture that enables the unification of a great variety of very heterogeneous device drivers and special-purpose toolkits in a common interaction library named “Squidy”. Second, we introduce a visual design environment that minimizes the threshold for its usage (ease-of-use) but scales well with increasing complexity (ceiling) by combining the concepts of semantic zooming with visual dataflow programming. Third, we not only support the interactive design and rapid prototyping of multimodal interfaces but also provide advanced development and debugging techniques to improve technical and conceptual solutions. In addition, we offer a test platform for controlled comparative evaluation studies as well as standard logging and analysis techniques for informing the subsequent design iteration. Squidy therefore supports the entire development lifecycle of multimodal interaction design, in both industry and research.
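The abstract describes Squidy's core idea: heterogeneous device drivers and toolkits are wrapped as nodes in a common interaction library and wired together via visual dataflow programming, so interaction data flows from input devices through filter nodes to the application. The following is a minimal sketch of that pipeline pattern, with hypothetical names (`Node`, `connect`, `publish`); it illustrates the dataflow concept only and is not Squidy's actual API.

```python
# Sketch of a dataflow pipeline: a source node (e.g. a device-driver
# wrapper) pushes events through filter nodes to a sink (the application).
class Node:
    def __init__(self, process):
        self.process = process      # per-node transformation of an event
        self.targets = []           # downstream nodes

    def connect(self, other):
        """Wire this node's output to another node; returns the target
        so connections can be chained like a visual pipeline."""
        self.targets.append(other)
        return other

    def publish(self, event):
        out = self.process(event)
        if out is not None:         # a filter may consume an event
            for target in self.targets:
                target.publish(out)

collected = []
source = Node(lambda e: e)                                   # device input
scale  = Node(lambda e: {"x": e["x"] * 2, "y": e["y"] * 2})  # filter node
sink   = Node(lambda e: collected.append(e))                 # application end

source.connect(scale).connect(sink)
source.publish({"x": 10, "y": 20})
# collected now holds [{"x": 20, "y": 40}]
```

Because every node shares the same event interface, a tracking filter or a new input device can be swapped in without touching the rest of the pipeline, which is the property the abstract attributes to the unified interaction library.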



Author information


Correspondence to Werner A. König.


Cite this article

König, W.A., Rädle, R. & Reiterer, H. Interactive design of multimodal user interfaces. J Multimodal User Interfaces 3, 197–213 (2010). https://doi.org/10.1007/s12193-010-0044-2
