Public Systems Supporting Noninstrumented Body-Based Interaction

  • Dimitris Grammenos
  • Giannis Drossis
  • Xenophon Zabulis
Chapter
Part of the Gaming Media and Social Effects book series (GMSE)

Abstract

Body-based interaction constitutes a very intuitive way for humans to communicate with their environment, as well as with each other. Nowadays, various technological solutions allow for fast, robust, noninstrumented body tracking at various levels of granularity and sophistication. This chapter studies three distinct cases showcasing representative approaches to employing body-based interaction for the creation of public systems in two application domains: culture and marketing. The first case is a room-sized exhibit at an archeological museum, where multiple visitors concurrently interact with a large wall projection through their position in space, as well as through the path they follow. The second is an “advergame” used to enhance the outdoor advertising campaign of a food company; here, players interact with the wall-projected game world through a virtual, two-dimensional shadow of their body. The third case presents a public system for exploring timelines in both two and three dimensions that supports detailed body tracking combined with single-hand, two-hand, and leg gestures. Design considerations are provided for each case, including related benefits and shortcomings. Additionally, findings stemming from user-based evaluations and field observations of the actual use of these systems are presented, along with pointers to potential improvements and upcoming challenges.
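
To make the first case's interaction model more concrete: it relies on mapping each visitor's tracked floor position, and the path that position traces over time, onto content regions of the wall projection. The sketch below is illustrative only and not the authors' implementation; it assumes a hypothetical tracker that reports per-visitor (x, y) floor coordinates, and the floor dimensions, zone grid, and helper names (zone_of, VisitorPaths) are invented for the example.

```python
from collections import deque, defaultdict

# Assumed setup: the floor in front of the projection is divided into a
# grid of cells; each cell activates a different region of the wall display.
FLOOR_W, FLOOR_H = 6.0, 4.0      # assumed floor dimensions in metres
COLS, ROWS = 4, 3                # assumed number of projection zones

def zone_of(x, y):
    """Quantize a tracked floor position (in metres) into a zone index."""
    col = min(int(x / FLOOR_W * COLS), COLS - 1)
    row = min(int(y / FLOOR_H * ROWS), ROWS - 1)
    return row * COLS + col

class VisitorPaths:
    """Keep a short history of visited zones per visitor for path-based triggers."""
    def __init__(self, history=20):
        self.paths = defaultdict(lambda: deque(maxlen=history))

    def update(self, visitor_id, x, y):
        z = zone_of(x, y)
        path = self.paths[visitor_id]
        if not path or path[-1] != z:   # record only zone transitions
            path.append(z)
        return z, list(path)

# Example: feeding in positions from an (assumed) noninstrumented tracker.
tracker = VisitorPaths()
for visitor_id, x, y in [(1, 0.5, 0.5), (1, 2.0, 0.6), (2, 5.5, 3.5)]:
    zone, path = tracker.update(visitor_id, x, y)
    print(f"visitor {visitor_id} -> zone {zone}, path {path}")
```

In an actual installation, the zone transitions and buffered paths would drive the selection and animation of the projected content for each concurrent visitor.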

Keywords

Body-based interaction · Body tracking · Gesture-based interaction · Public information systems · Large displays · Cultural information systems · Advergames

Notes

Acknowledgments

This work has been supported by the FORTH-ICS RTD Programme “Ambient Intelligence and Smart Environments”.

Copyright information

© Springer Science+Business Media Singapore 2014

Authors and Affiliations

  • Dimitris Grammenos 1
  • Giannis Drossis 1
  • Xenophon Zabulis 1

  1. Foundation for Research and Technology-Hellas (FORTH), Institute of Computer Science, Heraklion, Greece
