
Personal and Ubiquitous Computing, Volume 18, Issue 5, pp 1243–1257

Touch and gesture: mediating content display, inscriptions, and gestures across multiple devices

  • Gerard Oleksik
  • Natasa Milic-Frayling
  • Rachel Jones
Original Article

Abstract

Recent advances in computer design and technology have broadened the range of devices that support inscription and touch-based interaction and increased their adoption in collaborative work settings. Since most past research has focused on the optimal use of individual devices, we need to expand our understanding of how these devices are used in concert, particularly in collaborative settings where touch and gesture facilitate communication and may interfere with touch-based input. We conducted in situ observations of team meetings involving a tabletop computer, tablet personal computers (tablet PCs) with handwriting support, and a vertical display. The study shows how inscriptions and gestures naturally emerge around the content displayed on the devices and how important it is to maintain their spatial congruence. Furthermore, it reveals that the combination of tablet PCs and the tabletop computer encourages the use of gestures and touch across devices as part of sense-making. While discussing content, users apply sequential and synchronous gestures to bind content and inscriptions across devices. Our observations of binding gestures extend the gesture taxonomies from previous studies and expand the notion of multi-touch beyond individual devices. We stipulate that designing support for touch and gestures across devices requires a holistic approach. Only through coordinated design of touch, inscription, and gesture input, together with consideration of broader usage scenarios, can we ensure minimal interference with naturally emerging touch and gestures and provide effective mechanisms for disambiguating general user behavior from device input actions.
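
The paper reports an observational study and does not prescribe an implementation. Purely as an illustrative sketch (not the authors' method), the Python fragment below shows one way a system might flag a candidate synchronous cross-device binding gesture; the TouchEvent record, the shared clock across paired devices, and the 300 ms window are all assumptions introduced here for illustration.

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    device_id: str    # hypothetical identifier, e.g. "tablet-1" or "tabletop"
    timestamp: float  # seconds on a clock assumed to be shared across devices
    x: float          # normalised display coordinates
    y: float

def is_synchronous_binding(a: TouchEvent, b: TouchEvent,
                           window_s: float = 0.3) -> bool:
    """Flag two touches as a candidate cross-device 'binding' gesture when
    they land on different devices within a short time window (the threshold
    is an assumed value, not one reported in the paper)."""
    return a.device_id != b.device_id and abs(a.timestamp - b.timestamp) <= window_s

# Example: a touch on a tablet PC and one on the tabletop, 120 ms apart.
print(is_synchronous_binding(
    TouchEvent("tablet-1", 10.00, 0.4, 0.6),
    TouchEvent("tabletop", 10.12, 0.7, 0.3),
))  # True
```

A working system would also have to distinguish such deliberate pairings from the incidental, communicative touches the study observed, which is the disambiguation problem the abstract raises.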

Keywords

Gesture · Inscription · Touch · Tabletop · Tablets · Binding gestures

Copyright information

© Springer-Verlag London 2013

Authors and Affiliations

  • Gerard Oleksik, Dovetailed Ltd, Cambridge, UK
  • Natasa Milic-Frayling, Microsoft Research Ltd, Cambridge, UK
  • Rachel Jones, Instrata Ltd, Cambridge, UK
