Personal and Ubiquitous Computing, Volume 11, Issue 8, pp 621–632

Movement-based interaction in camera spaces: a conceptual framework

  • Eva Eriksson
  • Thomas Riisgaard Hansen
  • Andreas Lykke-Olesen
Original Article

Abstract

In this paper we present three concepts that address movement-based interaction using camera tracking. Drawing on our work on several movement-based projects, we present four selected applications and use them to ground our discussion and to describe our three main concepts: space, relations, and feedback. We see these concepts as central for describing and analysing movement-based systems that use camera tracking, and we show how they can be used to analyse other camera-tracking applications.

Keywords

Movement-based interaction · Camera spaces · Mixed interaction spaces · Camera interaction · Mobile phone interaction


Copyright information

© Springer-Verlag London Limited 2006

Authors and Affiliations

  • Eva Eriksson (1)
  • Thomas Riisgaard Hansen (2)
  • Andreas Lykke-Olesen (3)

  1. IDC, Interaction Design Collegium, Department of Computer Science and Engineering, Chalmers University of Technology, Gothenburg, Sweden
  2. Center for Pervasive Healthcare, Department of Computer Science, Aarhus University, Aarhus, Denmark
  3. Center for Interactive Spaces, Institute for Design, Aarhus School of Architecture, Aarhus, Denmark
