Identifying the Usability Factors of Mid-Air Hand Gestures for 3D Virtual Model Manipulation

  • Li-Chieh Chen
  • Yun-Maw Cheng
  • Po-Ying Chu
  • Frode Eika Sandnes
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10278)

Abstract

Although manipulating 3D virtual models with mid-air hand gestures offers the benefits of natural interaction and freedom from the sanitation problems of touch surfaces, many factors can influence the usability of such an interaction paradigm. In this research, the authors conducted experiments to study vision-based mid-air hand gestures for scaling, translating, and rotating a 3D virtual car displayed on a large screen. An Intel RealSense 3D Camera was employed for hand gesture recognition. A two-hand gesture, grabbing and then moving the hands apart or together, was applied to enlarging or shrinking the 3D virtual car. A one-hand gesture, grabbing and then moving, was applied to translating a car component. A two-hand gesture, grabbing and moving the hands relative to each other along the circumference of a horizontal circle, was applied to rotating the car. Seventeen graduate students were invited to participate in the experiments and offer their evaluations and comments on gesture usability. The results indicated that the width and depth of the detection range were the key usability factors for two-hand gestures with linear motions. For dynamic gestures with quick transitions from open to closed hand poses, robust gesture recognition was extremely important. Furthermore, even for a gesture with ergonomic postures, an inappropriate control-response ratio could result in fatigue, because repeated exertions of the gesture were needed to achieve precise control in 3D model manipulation tasks.
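The three gesture mappings described above can be sketched in code. This is a minimal illustration, not the authors' implementation: it assumes a hand tracker (such as the RealSense camera mentioned in the abstract) already provides 3D hand positions in metres, and the constants `C_SCALE`, `C_TRANS`, and `C_ROT` stand in for the control-response ratios whose tuning the study identifies as a fatigue factor.

```python
import math

# Hypothetical control-response ratios: larger values mean less hand
# movement per unit of model change (coarser but less tiring control).
C_SCALE, C_TRANS, C_ROT = 2.0, 1.5, 1.0

def scale_factor(lh0, rh0, lh1, rh1):
    """Two-hand grab: moving hands apart enlarges, together shrinks.
    Positions are (x, y, z) tuples; 0/1 denote successive frames."""
    d0 = math.dist(lh0, rh0)
    d1 = math.dist(lh1, rh1)
    return 1.0 + C_SCALE * (d1 - d0)

def translation(h0, h1):
    """One-hand grab-and-move: the component follows the hand's
    displacement, scaled by the control-response ratio."""
    return tuple(C_TRANS * (b - a) for a, b in zip(h0, h1))

def rotation_deg(lh0, rh0, lh1, rh1):
    """Two hands moving along a horizontal circle: rotate the model
    about the vertical axis by the change in the inter-hand angle,
    measured in the horizontal (x-z) plane."""
    a0 = math.atan2(rh0[2] - lh0[2], rh0[0] - lh0[0])
    a1 = math.atan2(rh1[2] - lh1[2], rh1[0] - lh1[0])
    return C_ROT * math.degrees(a1 - a0)
```

With `C_SCALE = 2.0`, moving the hands 20 cm further apart yields a scale factor of 1.4; the ratio directly trades precision against the number of repeated grab-release cycles a user must perform.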

Keywords

Mid-air hand gesture · 3D virtual model manipulation · Scaling · Translation · Rotation

Notes

Acknowledgement

The authors would like to express their gratitude to the Ministry of Science and Technology of the Republic of China for financially supporting this research under Grant No. MOST 105-2221-E-036-009.


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Li-Chieh Chen (1)
  • Yun-Maw Cheng (2)
  • Po-Ying Chu (1)
  • Frode Eika Sandnes (3)
  1. Department of Industrial Design, Tatung University, Taipei, Taiwan
  2. Department of Computer Science and Engineering, Graduate Institute of Design Science, Tatung University, Taipei, Taiwan
  3. Oslo and Akershus University College of Applied Sciences, Oslo, Norway