Public Systems Supporting Noninstrumented Body-Based Interaction

Chapter in: Playful User Interfaces

Abstract

Body-based interaction is an intuitive way for humans to communicate with their environment, as well as with each other. Nowadays, various technological solutions allow for fast, robust, noninstrumented body tracking at various levels of granularity and sophistication. This chapter studies three distinct cases showcasing representative approaches to employing body-based interaction in public systems, across two application domains: culture and marketing. The first case is a room-sized exhibit at an archeological museum, where multiple visitors concurrently interact with a large wall projection through their position in space, as well as through the path they follow. The second is an “advergame” used to enhance the outdoor advertising campaign of a food company; here, players interact with the wall-projected game world through a virtual, two-dimensional shadow of their body. The third case presents a public system for exploring timelines in both two and three dimensions that supports detailed body tracking combined with single-hand, two-hand, and leg gestures. Design considerations are provided for each case, including related benefits and shortcomings. Additionally, findings from user-based evaluations and field observations of the systems in actual use are presented, along with pointers to potential improvements and upcoming challenges.
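To make the second case's interaction style concrete, the following is a minimal sketch of shadow-based interaction: a depth frame is thresholded into a binary silhouette (the player's virtual "shadow"), and game logic checks whether the shadow overlaps a virtual object. This is not the chapter's actual implementation; the function names, thresholds, and the synthetic depth frame are illustrative assumptions.

```python
import numpy as np

def shadow_mask(depth, near=0.5, far=3.0):
    """Binary silhouette ('shadow') of anything inside the play volume.

    depth: 2D array of per-pixel distances in meters, as a depth camera
    would produce. Pixels between `near` and `far` are assumed to belong
    to a player's body; everything else is background.
    """
    return (depth > near) & (depth < far)

def shadow_touches(mask, region):
    """True if the shadow overlaps a rectangular virtual game object.

    region: (row0, row1, col0, col1) bounds in image coordinates.
    """
    r0, r1, c0, c1 = region
    return bool(mask[r0:r1, c0:c1].any())

# Synthetic 4x6 depth frame: background wall at 5 m, a player at ~1.5 m.
depth = np.full((4, 6), 5.0)
depth[1:3, 2:4] = 1.5

mask = shadow_mask(depth)
print(shadow_touches(mask, (0, 2, 2, 4)))  # True: object overlaps the shadow
print(shadow_touches(mask, (0, 1, 0, 2)))  # False: object misses the shadow
```

In a real installation the mask would be recomputed per frame and rendered onto the wall projection, so players see their shadow "touch" the game objects as immediate visual feedback.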


Notes

  1. http://www.makedonopixels.org

  2. Videos of indicative play sessions can be found at: http://www.youtube.com/user/icsforthami.

  3. https://www.facebook.com/creativecrete


Acknowledgments

This work has been supported by the FORTH-ICS RTD Programme “Ambient Intelligence and Smart Environments”.

Author information

Correspondence to Dimitris Grammenos.


Copyright information

© 2014 Springer Science+Business Media Singapore

About this chapter

Cite this chapter

Grammenos, D., Drossis, G., Zabulis, X. (2014). Public Systems Supporting Noninstrumented Body-Based Interaction. In: Nijholt, A. (eds) Playful User Interfaces. Gaming Media and Social Effects. Springer, Singapore. https://doi.org/10.1007/978-981-4560-96-2_2

  • DOI: https://doi.org/10.1007/978-981-4560-96-2_2

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-4560-95-5

  • Online ISBN: 978-981-4560-96-2

  • eBook Packages: Engineering (R0)
