
Connecting and Controlling Appliances Through Wearable Augmented Reality

  • Original Paper
  • Published in: Augmented Human Research

Abstract

The number of interconnected devices around us is constantly growing, and it can become challenging for users to control all of them when the control interfaces are distributed over mechanical elements, apps, and configuration webpages. If the devices are moreover supposed to be connected and work together, this challenge intensifies. Wearable technologies enable an alternative way of configuring and controlling devices in situ. In this paper, we investigate interaction methods for smart appliances in augmented reality from an egocentric perspective. We examine how users can control appliances directly through augmented reality. The physical objects are augmented with interaction widgets, which are generated on demand and represent the connected devices along with their adjustable parameters. For example, a widget can be overlaid on a loudspeaker to control its volume. We explore three ways of manipulating the virtual widgets: (1) in-air finger pinching and sliding, (2) whole-arm gestures such as rotating and waving, and (3) incorporating physical objects from the surroundings and mapping their movements to the interaction primitives. We compare these methods in a user study with 25 participants and find significant differences in user preference, command execution speed, and the granularity of control each method affords. While these methods apply only to controlling a single device at a time, in a second part we develop a method that also takes potential connections between devices into account. Users can view and configure connections between smart devices in augmented reality, and can furthermore modify existing connections or create new ones using simple gestures. This facilitates both understanding existing connections and modifying them.
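To make the interaction model concrete, the sketch below illustrates how an on-demand widget could map a pinch-and-slide gesture to an appliance parameter (e.g., a loudspeaker's volume) and how a user-created connection between two devices could be represented. This is a minimal illustration under our own assumptions; the names Parameter, ApplianceWidget, and DeviceConnection are hypothetical and are not taken from the paper's implementation.

```python
# Illustrative sketch only: class and method names are hypothetical and
# do not reflect the system described in the paper.

from dataclasses import dataclass, field


@dataclass
class Parameter:
    """An adjustable appliance parameter, e.g. a loudspeaker's volume."""
    name: str
    value: float
    minimum: float = 0.0
    maximum: float = 1.0

    def apply_delta(self, delta: float) -> None:
        # Clamp the adjusted value to the parameter's valid range.
        self.value = max(self.minimum, min(self.maximum, self.value + delta))


@dataclass
class ApplianceWidget:
    """A virtual widget overlaid on a physical appliance in AR."""
    device_id: str
    parameters: dict[str, Parameter] = field(default_factory=dict)

    def on_slide(self, parameter: str, normalized_slide: float) -> None:
        # Map a pinch-and-slide gesture (normalized to [-1, 1]) to a
        # proportional change of the selected parameter.
        p = self.parameters[parameter]
        p.apply_delta(normalized_slide * (p.maximum - p.minimum))


@dataclass
class DeviceConnection:
    """A user-created link, e.g. a motion sensor switching on a lamp."""
    source_id: str
    source_event: str
    target_id: str
    target_action: str


# Usage example: overlay a volume widget on a loudspeaker and slide up by 20%.
speaker = ApplianceWidget("loudspeaker-1",
                          {"volume": Parameter("volume", value=0.5)})
speaker.on_slide("volume", 0.2)
print(speaker.parameters["volume"].value)  # 0.7

# A connection gesture between two devices could then simply create a link.
link = DeviceConnection("motion-sensor-1", "motion_detected",
                        "lamp-1", "turn_on")
```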

Acknowledgements

We thank all participants who volunteered for our study; we appreciate their time, effort, and valuable feedback.

Author information

Corresponding author

Correspondence to Vincent Becker.

Ethics declarations

Conflict of interest

On behalf of all authors, the corresponding author states that there is no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Becker, V., Rauchenstein, F. & Sörös, G. Connecting and Controlling Appliances Through Wearable Augmented Reality. Augment Hum Res 5, 2 (2020). https://doi.org/10.1007/s41133-019-0019-0
