Rapid Finger Motion Tracking on Low-Power Mobile Environments for Large Screen Interaction

  • Yeongnam Chae
  • Daniel Crane
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10507)


Abstract

Motion and gesture interfaces are garnering significant interest as screens grow larger. To provide lightweight finger motion tracking in low-power mobile environments, we propose an approach that departs from the conventional camera viewpoint. By directing the camera toward the ceiling, the proposed approach reduces the problem complexity caused by cluttered backgrounds. Although this orientation yields poor lighting conditions for image processing, rapid finger motion can still be tracked efficiently and with low computational load by clustering and tracking the fragmented motion blobs extracted from the motion image of the saturation channel. We implemented and tested the proposed approach on a low-power mobile device with a 1.5 GHz mobile processor and a low-specification camera capturing at under 15 fps.
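The pipeline described above can be sketched in a few lines: difference the saturation channel of consecutive frames to obtain a motion image, then cluster the fragmented motion pixels and take the densest cluster's centroid as the finger position. This is a minimal illustration, not the paper's implementation; the difference threshold and the grid-based clustering (with its cell size) are assumptions introduced here.

```python
import numpy as np

def motion_mask(prev_sat, curr_sat, thresh=30):
    """Binary motion image from the saturation channel of two frames.

    The threshold value is an illustrative assumption, not taken
    from the paper.
    """
    diff = np.abs(curr_sat.astype(np.int16) - prev_sat.astype(np.int16))
    return (diff > thresh).astype(np.uint8)

def cluster_blobs(mask, grid=8):
    """Cluster fragmented motion pixels by pooling them onto a coarse grid.

    Returns the (row, col) centroid of the pixels in the densest grid
    cell, or None if no motion was detected. The grid cell size is an
    assumed parameter.
    """
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return None
    # Assign each motion pixel to a coarse cell, then pick the densest cell;
    # this merges fragmented blobs that fall into the same neighborhood.
    cols = mask.shape[1] // grid + 1
    cells = (ys // grid) * cols + (xs // grid)
    densest = np.bincount(cells).argmax()
    sel = cells == densest
    return float(ys[sel].mean()), float(xs[sel].mean())
```

In use, the tracker would feed successive saturation-channel frames through `motion_mask` and smooth the sequence of centroids over time; both frame differencing and the coarse-grid pooling are cheap enough for the low-power setting the paper targets.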


Keywords

Motion tracking · Mobile environment · Remote interface



Copyright information

© IFIP International Federation for Information Processing 2017

Authors and Affiliations

  1. Rakuten Institute of Technology, Rakuten, Inc., Setagaya-ku, Japan
