
The Influence of the Threshold of the Size of the Graphic Element on the General Dynamic Gesture Behavior

  • Ming Hao
  • Zhou Xiaozhou (corresponding author)
  • Xue Chengqi
  • Xiao Weiye
  • Jia Lesong
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 1026)

Abstract

The image clarity and realism of augmented reality and virtual reality are constantly improving. However, interaction in 3D space still relies on handheld controllers or other mechanical devices. How to interact with interfaces and objects in three-dimensional space in a natural way is therefore a question that current researchers must address. This paper explores, through a user behavior experiment, the influence of the size of the graphic element in an interactive interface on dynamic gesture behavior. The researchers observed the gestures users made when interacting with objects of different sizes in virtual space, and then analyzed the correlation between object size and dynamic gesture. The result is the correlation between the rendered size of the graphic element and dynamic gesture behavior.
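The abstract describes correlating the rendered size of graphic elements with measured dynamic-gesture features; the paper's own analysis code is not included here. Below is a minimal, hypothetical sketch of such an analysis in Python, assuming per-trial records of element size and a gesture feature such as peak pinch aperture captured by a Leap Motion sensor. The variable names and example values are illustrative assumptions, not data from the study.

```python
# Hypothetical sketch: correlate graphic-element size with a dynamic-gesture
# feature, as in the kind of analysis the abstract describes. Example values
# are illustrative placeholders only; they are not data from the paper.
import numpy as np
from scipy.stats import pearsonr

# Rendered size of the graphic element in each trial (illustrative).
element_size_mm = np.array([20.0, 40.0, 60.0, 80.0, 100.0, 120.0])
# Peak thumb-index distance per trial, e.g. from Leap Motion hand tracking
# (illustrative placeholder values).
pinch_aperture_mm = np.array([31.0, 48.0, 62.0, 75.0, 92.0, 104.0])

# Pearson correlation between element size and the gesture feature.
r, p = pearsonr(element_size_mm, pinch_aperture_mm)
print(f"Pearson r = {r:.3f}, p = {p:.4f}")
```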

Keywords

Virtual reality · Natural gesture interaction · 3D interactive space · Graphic element rendering size · Dynamic gesture recognition · Leap Motion · HTC Vive

Notes

Acknowledgments

The authors wish to thank the Science and Technology on Avionics Integration Laboratory and the Aeronautical Science Fund (No. 20185569008). This work was supported by the Fundamental Research Funds for the Central Universities and by the National Natural Science Foundation of China (Nos. 71871056 and 71471037).


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Ming Hao¹
  • Zhou Xiaozhou¹ (corresponding author)
  • Xue Chengqi¹
  • Xiao Weiye¹
  • Jia Lesong¹

  1. School of Mechanical Engineering, Southeast University, Nanjing, China