Hand tracking with physiological constraints

Original Article

Abstract

Articulated hand tracking systems are widely used in virtual reality applications, including human–computer interaction and interaction with game consoles; hand pose estimation also has applications in sign language recognition and animation synthesis. Technological advances in motion capture over the last decade allow data acquisition with high accuracy and low cost. However, due to the high complexity of the human hand, it remains challenging to animate a hand model that respects the hand's anatomical and physiological constraints in detail. In this paper, we present a simple and efficient methodology for tracking and reconstructing 3D hand poses. Using an optical motion capture system with markers positioned at strategic points, we acquire the movement of the hand and establish its orientation using a minimal number of markers. An inverse kinematics solver is then employed to control the postures of the hand, subject to physiological constraints that restrict the allowed movements to a feasible and natural set. The proposed methodology produces smooth and biomechanically correct movements while keeping processing time low, enabling an effective real-time hand motion tracking and reconstruction system.
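To make the constrained inverse kinematics step concrete, the sketch below shows a minimal iterative, FABRIK-style solver for a single finger chain. It is an illustration only, not the authors' implementation: the names (`fabrik`, `_clamp_direction`, `max_bend_rad`) and the constraint model, which simply clamps the bend angle between successive bones as a stand-in for the paper's physiological constraints, are assumptions made for this example.

```python
# Minimal FABRIK-style IK sketch for one finger chain, with a crude
# per-joint flexion limit standing in for physiological constraints.
# Illustrative only; not the implementation described in the paper.
import numpy as np

def _clamp_direction(prev_dir, desired_dir, max_angle_rad):
    """Limit the angle between the previous bone direction and the
    desired bone direction to max_angle_rad (a simple hinge/cone cap)."""
    prev_dir = prev_dir / np.linalg.norm(prev_dir)
    desired_dir = desired_dir / np.linalg.norm(desired_dir)
    cos_a = np.clip(np.dot(prev_dir, desired_dir), -1.0, 1.0)
    if np.arccos(cos_a) <= max_angle_rad:
        return desired_dir
    axis = np.cross(prev_dir, desired_dir)
    norm = np.linalg.norm(axis)
    if norm < 1e-9:                      # parallel/antiparallel: keep previous
        return prev_dir
    axis /= norm
    # Rodrigues' rotation of prev_dir about axis by max_angle_rad.
    return (prev_dir * np.cos(max_angle_rad)
            + np.cross(axis, prev_dir) * np.sin(max_angle_rad))

def fabrik(joints, target, max_bend_rad=np.deg2rad(100), tol=1e-4, max_iter=20):
    """joints: (n, 3) joint positions, joints[0] is the fixed root.
    Returns joint positions whose end effector approaches `target`."""
    joints = np.asarray(joints, dtype=float).copy()
    lengths = np.linalg.norm(np.diff(joints, axis=0), axis=1)
    root = joints[0].copy()
    if np.linalg.norm(target - root) > lengths.sum():  # unreachable: stretch
        direction = (target - root) / np.linalg.norm(target - root)
        for i in range(len(lengths)):
            joints[i + 1] = joints[i] + direction * lengths[i]
        return joints
    for _ in range(max_iter):
        # Backward pass: pin the end effector to the target.
        joints[-1] = target
        for i in range(len(joints) - 2, -1, -1):
            d = joints[i] - joints[i + 1]
            joints[i] = joints[i + 1] + d / np.linalg.norm(d) * lengths[i]
        # Forward pass: re-pin the root and apply the flexion limit.
        joints[0] = root
        for i in range(len(joints) - 1):
            desired = joints[i + 1] - joints[i]
            if i > 0:
                prev = joints[i] - joints[i - 1]
                desired = _clamp_direction(prev, desired, max_bend_rad)
            joints[i + 1] = joints[i] + desired / np.linalg.norm(desired) * lengths[i]
        if np.linalg.norm(joints[-1] - target) < tol:
            break
    return joints

# Example: a straight 3-segment "finger" reaching toward a nearby point.
finger = np.array([[0, 0, 0], [0, 0, 4], [0, 0, 7], [0, 0, 9]], dtype=float)
print(fabrik(finger, np.array([3.0, 0.0, 6.0])))
```

In the actual system, the joint positions would be driven by the captured marker data, and each joint's limits would come from anatomical flexion/abduction ranges rather than a single bend-angle cap.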

Keywords

Geometric Algebra · Hand Tracking · Inverse Kinematics · Motion Capture · Physiological Constraints

Supplementary material

Supplementary material 1 (video): 371_2016_1327_MOESM1_ESM.avi (11.6 MB)


Copyright information

© Springer-Verlag Berlin Heidelberg 2016

Authors and Affiliations

1. Department of Computer Science, University of Cyprus, Nicosia, Cyprus