A Comparative Study for Natural Reach-and-Grasp With/Without Finger Tracking

  • Huagen Wan
  • Xiaoxia Han
  • Wenfeng Chen
  • Yangzi Ding
  • Liezhong Ge
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 822)


Grasp interaction typically relies on both hand tracking and finger tracking to drive virtual-hand deformation and to evaluate grasping conditions. As psychology becomes more deeply involved in HCI, a growing number of interaction algorithms draw on psychological findings; however, the performance and user experience of these algorithms remain to be explored further. In this paper, a comparative study was performed under identical grasping conditions between our previously proposed method for reach-and-grasp tasks, which requires only tracking the hand's 6-DOF motion (Method A), and a typical forward-kinematics-based virtual grasping method, which requires both 6-DOF hand tracking and a dataglove for finger tracking (Method B). Virtual spheres centered at the origin with different diameters (6 cm, 8 cm, and 10 cm) were used as the grasping targets. Twelve participants, divided into two groups, took part in the comparative study, which measured task completion time, accuracy, and three subjective criteria. The experimental results show that, for simple shapes such as spheres, Method A outperforms Method B on all three aspects. A demo application was developed using both Method A and Method B, and users' preferences were evaluated.


Keywords: Reach-and-Grasp · Grasp trajectory · Finger tracking



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Huagen Wan (1)
  • Xiaoxia Han (1)
  • Wenfeng Chen (1)
  • Yangzi Ding (1)
  • Liezhong Ge (1)

  1. Zhejiang University, Hangzhou, China
