Personal and Ubiquitous Computing, Volume 14, Issue 2, pp 83–94

Interaction with large ubiquitous displays using camera-equipped mobile phones

  • Seokhee Jeon
  • Jane Hwang
  • Gerard J. Kim
  • Mark Billinghurst
Original Article

Abstract

In ubiquitous computing environments, people will interact with everyday objects (and the computers embedded in them) in ways that differ from the familiar desktop user interface. One typical situation is interacting with applications through large displays such as televisions, mirror displays, and public kiosks, where conventional keyboard and mouse input is usually impractical. In this setting, the mobile phone has emerged as an excellent device for novel interaction. This article introduces interaction techniques that use a camera-equipped hand-held device, such as a mobile phone or a PDA, to control large shared displays. In particular, we consider two specific but typical situations: (1) sharing the display from a distance, and (2) interacting with a touch-screen display at close range. Using two basic computer vision techniques, motion flow and marker recognition, we show how a camera-equipped hand-held device can effectively replace a mouse to share, select, and manipulate 2D and 3D objects, and to navigate within the environment presented on the large display.
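The motion-flow pointing described above can be illustrated with a small sketch: displacements of features tracked between consecutive camera frames (e.g. by a Lucas–Kanade tracker, as used in the paper's implementation) are mapped to a cursor step on the large display. The function name, sensitivity constant, and clamping behavior below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def cursor_delta(prev_pts, curr_pts, sensitivity=4.0, max_step=50.0):
    """Map the median optical-flow displacement of tracked features
    to a 2D cursor step on the shared display.

    When the camera pans right, features flow left in the image, so
    the sign is inverted. `sensitivity` and `max_step` are hypothetical
    tuning constants, not values from the paper."""
    flow = np.asarray(curr_pts, dtype=float) - np.asarray(prev_pts, dtype=float)
    # Median displacement is robust to a few mistracked features.
    step = -np.median(flow, axis=0) * sensitivity
    # Clamp so one noisy frame cannot jump the cursor across the display.
    return np.clip(step, -max_step, max_step)

# Example: every feature shifted 2 px left and 1 px down between frames,
# i.e. the camera panned right and slightly up.
prev = [(10, 10), (20, 30), (40, 15)]
curr = [(8, 11), (18, 31), (38, 16)]
dx, dy = cursor_delta(prev, curr)  # cursor moves right and up
```

In practice the feature tracks would come from a pyramidal Lucas–Kanade tracker running on the phone, and the resulting deltas would be sent over a wireless link to the display host.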

Keywords

Interaction · Motion flow · Marker recognition · Interaction techniques · Cell/mobile phones · Large display

Acknowledgments

The Lucas–Kanade feature tracker for Symbian OS was kindly provided by ACID (Australian CRC for Interaction Design) for our implementation results. This research is financially supported by the Ministry of Knowledge Economy (MKE) and Korea Institute for Advancement in Technology (KIAT) through the Workforce Development Program in Strategic Technology.


Copyright information

© Springer-Verlag London Limited 2009

Authors and Affiliations

  • Seokhee Jeon (1)
  • Jane Hwang (2)
  • Gerard J. Kim (3)
  • Mark Billinghurst (4)

  1. Department of Computer Science and Engineering, POSTECH, Pohang, Korea
  2. Image and Media Research Center, Korea Institute of Science and Technology, Seoul, Korea
  3. College of Information and Communication, Korea University, Seoul, Korea
  4. Human Interface Technology Laboratory NZ, University of Canterbury, Christchurch, New Zealand
