Better Than You Think: Head Gestures for Mid Air Input

  • Katrin Plaumann
  • Jan Ehlers
  • Florian Geiselhart
  • Gabriel Yuras
  • Anke Huckauf
  • Enrico Rukzio
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9298)

Abstract

This paper presents a systematic comparison of pointing gestures for controlling home appliances in smart homes. The pointing gestures were performed with the head, hand, and arm, with a computer mouse serving as a baseline. To the best of our knowledge, we are the first to report such a systematic comparison of these modalities. Our results indicate that although head gestures are overall slower and less accurate than hand and arm gestures, they are more suitable for mid-air input than previous research suggested. We show that their disadvantages in speed and accuracy can be compensated for by a larger target size. In addition, head gestures show the largest learning effect. Considering our results and the possibilities head gestures would offer in daily life, we recommend treating head gestures as a feasible input modality alongside hand and arm gestures.
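The trade-off described above, where a larger target compensates for a slower, less accurate modality, follows from Fitts' law, the standard model for pointing performance. A minimal sketch, using the Shannon formulation of the index of difficulty; the intercept and slope values below are purely illustrative placeholders, not coefficients fitted in this study:

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def predicted_movement_time(distance: float, width: float,
                            a: float, b: float) -> float:
    """Fitts' law: MT = a + b * ID, with a (s) and b (s/bit)
    fitted separately for each input modality."""
    return a + b * index_of_difficulty(distance, width)

# Hypothetical coefficients: head pointing assumed to have a
# steeper slope (slower per bit) than hand pointing.
hand_mt = predicted_movement_time(300, 30, a=0.2, b=0.15)
# Doubling the target width lowers the index of difficulty,
# narrowing the gap despite the slower modality.
head_mt = predicted_movement_time(300, 60, a=0.2, b=0.25)
```

Because the index of difficulty depends only on the distance-to-width ratio, enlarging targets reduces the task's difficulty in bits and thus the predicted movement time for any modality, which is the mechanism behind the compensation effect reported above.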

Keywords

Pointing gestures · Smart home · Head gestures · Comparative study

Notes

Acknowledgements

This work was conducted within the project "Gaze- and Gesture-Based Assistive Systems for Users with Special Needs", funded by the BMBF, and was supported by the Transregional Collaborative Research Centre SFB/TRR 62 "Companion Technology for Cognitive Technical Systems", funded by the DFG.

Copyright information

© IFIP International Federation for Information Processing 2015

Authors and Affiliations

  • Katrin Plaumann (1)
  • Jan Ehlers (2)
  • Florian Geiselhart (1)
  • Gabriel Yuras (2)
  • Anke Huckauf (2)
  • Enrico Rukzio (1)
  1. Institute of Media Informatics, Ulm University, Ulm, Germany
  2. Institute of Psychology and Education, Ulm University, Ulm, Germany