Journal on Multimodal User Interfaces, Volume 4, Issue 1, pp 3–10

Multi-touch user interface evaluation for 3D object manipulation on mobile devices

Original Paper

Abstract

Multi-touch user interfaces (MTUIs) can be a valuable tool for enhancing human-machine interaction. The naturalness and variety of hand gestures offer a more direct and intuitive form of interaction that can be exploited in a large spectrum of applications. Moreover, 3D visualization on mobile devices is required by an increasing number of scenarios, ranging from video games to engineering. In this paper, the impact of MTUIs applied to 3D scene navigation on handheld devices is investigated. Objective measures as well as subjective results (based on user feedback) are used to analyze the effectiveness of MTUIs compared with a traditional button-based user interface. The two interfaces are compared in terms of the time and the number of interactions needed to complete three reference tests. In particular, a statistical analysis based on paired t-tests shows that the proposed MTUI can, in general, outperform a traditional button-based GUI, although the differences between the two interfaces are strongly reduced when fine control of objects is required.

Keywords: Multi-touch interfaces · Mobile devices · Interaction style
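
The evaluation summarized in the abstract relies on paired t-tests, since the same participants performed each reference test with both interfaces. The following Python sketch illustrates how such a paired comparison of task-completion times can be carried out; the participant counts and timing values are illustrative placeholders, not data from the study.

    # Minimal sketch of the paired-comparison methodology described in the
    # abstract: each participant performs the same task with both interfaces,
    # and a paired t-test checks whether the mean difference is significant.
    # The completion times below are made-up placeholder values, not data
    # from the paper.
    from scipy import stats

    # Hypothetical task-completion times (seconds) for the same 8 participants.
    times_mtui   = [12.4, 10.8, 15.1, 9.7, 11.3, 13.0, 10.2, 14.5]
    times_button = [16.2, 13.5, 17.9, 12.1, 14.8, 15.6, 13.9, 18.3]

    t_stat, p_value = stats.ttest_rel(times_mtui, times_button)
    print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
    # A p-value below the chosen significance level (e.g. 0.05) would indicate
    # a statistically significant difference between the two interfaces.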

Copyright information

© OpenInterface Association 2009

Authors and Affiliations

  • Donato Fiorella (1)
  • Andrea Sanna (2)
  • Fabrizio Lamberti (2)

  1. CSP s.c. a r.l., Torino, Italy
  2. Dipartimento di Automatica e Informatica, Politecnico di Torino, Torino, Italy
