Cognition, Technology & Work, Volume 20, Issue 1, pp 11–22

Exploring a user-defined gesture vocabulary for descriptive mid-air interactions

Original Article

Abstract

Gesturing offers an alternative input modality for design that is more natural and intuitive than standard input devices, which do not fully capture natural hand motion. A key challenge lies in understanding how gesturing can contribute to human–computer interaction and in identifying patterns in the gestures people produce. This paper analyzes human gestures to define a gesture vocabulary for descriptive mid-air interactions in a virtual reality environment. We conducted experiments in which twenty participants described two chairs (one simple, one abstract) of differing levels of complexity. The paper presents a detailed analysis of gesture distribution and hand preference for each description task, and compares the proposed approach of defining a vocabulary from combined gestures (GestAlt) with previously suggested methods. The findings show that GestAlt successfully describes the gestures employed in both tasks (60% of all gestures for the simple chair and 69% for the abstract chair), and can inform the development of intuitive mid-air interfaces based on gesture recognition.
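To illustrate the coverage arithmetic behind figures like the 60% and 69% reported above, the minimal sketch below tallies elicited gestures and computes the fraction of gesture instances captured by a candidate vocabulary. The gesture labels, the example data, and the `coverage` function are invented for illustration; they are not taken from the paper's protocol or its GestAlt definition.

```python
from collections import Counter

def coverage(observed_gestures, vocabulary):
    """Fraction of observed gesture instances covered by the vocabulary."""
    counts = Counter(observed_gestures)
    covered = sum(n for g, n in counts.items() if g in vocabulary)
    return covered / sum(counts.values())

# Invented example: gestures elicited while a participant describes a chair.
simple_chair = ["trace_outline", "trace_outline", "flat_palm_surface",
                "point", "grasp_shape", "trace_outline", "wave", "point",
                "flat_palm_surface", "grasp_shape"]

# A hypothetical combined-gesture vocabulary in the spirit of GestAlt.
gestalt_vocab = {"trace_outline", "flat_palm_surface", "grasp_shape"}

print(f"coverage: {coverage(simple_chair, gestalt_vocab):.0%}")
# Prints "coverage: 70%" for this toy data: 7 of the 10 observed
# gesture instances fall inside the candidate vocabulary.
```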

Keywords

Human–computer interaction · User-centered design · Gesture vocabulary · Gesture recognition · Interface design · Virtual reality

Copyright information

© Springer-Verlag London Ltd., part of Springer Nature 2017

Authors and Affiliations

  1. VISOR Research Group, Department of Computing, Faculty of Science and Engineering, Macquarie University, Sydney, Australia
