
Exploring the Ergonomic Issues of User-Defined Mid-Air Gestures for Interactive Product Exhibition

  • Li-Chieh Chen
  • Po-Ying Chu
  • Yun-Maw Cheng
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9749)

Abstract

Recently, applications of 3D and mid-air hand gestures have increased significantly in public and interactive display systems. Because of contextual and user differences, user-defined gestures need to be considered at the design stage of system development. However, without in-depth study and careful selection, user-defined gestures may not conform to ergonomic requirements. Therefore, the objective of this research is to develop a systematic method for extracting and evaluating user-defined gestures from an ergonomic perspective. In this research, a behavior coding scheme was developed to analyze gestures for six tasks of interactive product exhibition. The results indicated that hand dorsiflexion, caused by the posture of an open palm facing forward, was the common ergonomic issue identified in the user-defined gestures. To reduce the discomfort of prolonged gesture control, alternative combinations of gestures for accomplishing these tasks were determined based on ergonomic limitations and the constraints of vision-based hand gesture recognition.
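As a rough illustration of the kind of ergonomic screening described above, the following Python sketch flags a candidate gesture whenever an estimated joint angle exceeds an assumed comfort limit. The joint labels, threshold values, and example posture are hypothetical placeholders for illustration only; they are not the behavior coding scheme, ergonomic limits, or recognition pipeline used in this paper.

    # Hypothetical ergonomic screen for candidate mid-air gestures.
    # Assumes joint-angle estimates (in degrees) are available from a
    # vision-based hand/body tracker; all limits below are placeholders.
    COMFORT_LIMITS = {
        "hand_dorsiflexion": 30.0,   # assumed comfortable wrist extension limit
        "ulnar_deviation": 15.0,
        "shoulder_flexion": 45.0,
    }

    def ergonomic_flags(joint_angles):
        """Return the joints whose observed angle exceeds the assumed comfort limit."""
        return [joint for joint, limit in COMFORT_LIMITS.items()
                if joint_angles.get(joint, 0.0) > limit]

    # Example: an open palm facing forward typically implies marked hand dorsiflexion.
    open_palm_forward = {"hand_dorsiflexion": 45.0, "ulnar_deviation": 5.0}
    print(ergonomic_flags(open_palm_forward))  # ['hand_dorsiflexion']

In such a screen, any gesture that raises a flag would be a candidate for replacement by an alternative posture that accomplishes the same task within the assumed comfort ranges.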

Keywords

Mid-air gesture · User-defined gesture · Ergonomic issues

Acknowledgement

The authors would like to express their gratitude to the Ministry of Science and Technology of the Republic of China for financially supporting this research under Grant No. MOST 104-2221-E-036-020.

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. Department of Industrial Design, Tatung University, Taipei, Taiwan
  2. Department of Computer Science and Engineering, Graduate Institute of Design Science, Tatung University, Taipei, Taiwan