Interaction with Adaptive and Ubiquitous User Interfaces

  • Jan Gugenheimer (email author)
  • Christian Winkler
  • Dennis Wolf
  • Enrico Rukzio
Part of the Cognitive Technologies book series (COGTECH)


Current user interfaces such as public displays, smartphones and tablets strive to provide a constant flow of information. Although they can all be regarded as a first step towards Mark Weiser’s vision of ubiquitous computing, they still fall short of the ubiquity and omnipresence Weiser envisioned. To achieve this goal, such devices must blend into their environment and be constantly available. Since this is technically challenging, researchers have simulated this behavior using projector-camera systems. This technology makes it possible to investigate how users interact with always-available and adaptive information interfaces, both of which are important properties of a Companion-technology. Such a Companion system will be able to provide users with information how, where and when it is desired. In this chapter we describe in detail the design and development of three projector-camera systems (UbiBeam, SpiderLight and SmarTVision). Based on insights from prior user studies, we implemented these systems as mobile, nomadic and home-deployed projector-camera systems which can transform every plain surface into an interactive user interface. Finally, we discuss future possibilities for Companion-systems combined with projector-camera systems to enable fully adaptive and ubiquitous user interfaces.
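A common building block behind such projector-camera interfaces is depth-based touch sensing: a depth camera is calibrated against the projection surface, and a pixel counts as a touch when the sensed depth lies within a thin band just above that surface. The sketch below illustrates this thresholding idea only; the function name, the millimetre thresholds and the synthetic frames are illustrative assumptions, not the implementation used in the systems described in this chapter.

```python
import numpy as np

def touch_mask(depth_frame, background, near_mm=5, far_mm=25):
    """Flag touch-candidate pixels on a projected surface.

    A pixel is a candidate when it is closer to the camera than the
    calibrated background surface by at least `near_mm` millimetres
    (finger resting on the surface) but no more than `far_mm`
    (so hovering hands and arms are ignored).
    """
    diff = background - depth_frame  # positive = closer than the surface
    return (diff >= near_mm) & (diff <= far_mm)

# Synthetic example: a flat surface 1000 mm from the camera, with a
# fingertip 15 mm above it and a hovering hand 80 mm above it.
background = np.full((4, 4), 1000.0)
frame = background.copy()
frame[1, 1] = 985.0   # fingertip, 15 mm above the surface -> touch
frame[2, 2] = 920.0   # hovering hand, 80 mm above -> ignored

mask = touch_mask(frame, background)
```

In a real system the background model would be averaged over many frames and the resulting mask cleaned up (e.g. by connected-component filtering) before mapping touch points into projector coordinates.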


Ubiquitous user interfaces · Projector-camera system · Home deployment · Couch table · Quiz application
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.



This work was done within the Transregional Collaborative Research Centre SFB/TRR 62 “Companion-Technology for Cognitive Technical Systems” funded by the German Research Foundation (DFG).



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Jan Gugenheimer (1, email author)
  • Christian Winkler (1)
  • Dennis Wolf (1)
  • Enrico Rukzio (1)
  1. Institute of Media Informatics, Ulm University, Ulm, Germany
