A Thumb Tip Wearable Device Consisting of Multiple Cameras to Measure Thumb Posture

  • Naoto Ienaga
  • Wataru Kawai
  • Koji Fujita
  • Natsuki Miyata
  • Yuta Sugiura
  • Hideo Saito
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11367)

Abstract

Cameras have become smaller and cheaper and can now be used in a wide range of settings. We took advantage of this to develop a thumb-tip wearable device that estimates the joint angles of the thumb, since measuring human finger posture is important for human-computer interfaces and for analyzing human behavior. The device consists of three small cameras attached at different angles so that together they capture the four fingers. We assumed that the appearance of the four fingers changes depending on the joint angles of the thumb, and we trained a convolutional neural network to learn a regression from the camera images to the thumb joint angles. To construct a dataset, we captured the keypoint positions of the thumb with a USB sensor device and calculated the joint angles from them. The root mean squared errors on the test data were 6.23° and 4.75°.
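The paper does not include code; the following is a minimal sketch, under the assumption of a PyTorch-style setup, of how a CNN could regress thumb joint angles from the three thumb-tip camera images and how the reported root mean squared error could be computed. All names (ThumbAngleCNN, rmse_per_angle) and the input layout (three RGB frames stacked into nine channels, two output angles) are illustrative assumptions, not the authors' implementation.

```python
# Sketch only (not the authors' code): CNN regression from thumb-tip camera
# images to thumb joint angles, with RMSE evaluation.
import torch
import torch.nn as nn


class ThumbAngleCNN(nn.Module):  # hypothetical model name
    def __init__(self, in_channels=9, num_angles=2):
        super().__init__()
        # Small convolutional feature extractor over the stacked camera frames.
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        # Fully connected regressor that outputs the joint angles in degrees.
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 4 * 4, 128),
            nn.ReLU(),
            nn.Linear(128, num_angles),
        )

    def forward(self, x):
        return self.regressor(self.features(x))


def rmse_per_angle(pred, target):
    """Root mean squared error for each predicted joint angle."""
    return torch.sqrt(((pred - target) ** 2).mean(dim=0))


# Example training step with dummy data (batch of 8 samples).
model = ThumbAngleCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
images = torch.randn(8, 9, 64, 64)   # 3 cameras x RGB channels, 64x64 pixels (assumed size)
angles = torch.randn(8, 2) * 30.0    # ground-truth joint angles in degrees (dummy values)

loss = nn.functional.mse_loss(model(images), angles)
optimizer.zero_grad()
loss.backward()
optimizer.step()

print(rmse_per_angle(model(images).detach(), angles))
```

In such a setup the network is trained with a mean-squared-error loss, and the per-angle RMSE on held-out data corresponds to the figures quoted in the abstract.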

Keywords

Wearable device · Human-computer interaction · Pose estimation

Notes

Acknowledgements

This work was supported by JST AIP-PRISM Grant Number JPMJCR18Y2, Grant-in-Aid for JSPS Research Fellow Grant Number JP17J05489.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Naoto Ienaga (1)
  • Wataru Kawai (2)
  • Koji Fujita (3)
  • Natsuki Miyata (4)
  • Yuta Sugiura (1)
  • Hideo Saito (1)
  1. Keio University, Yokohama, Japan
  2. The University of Tokyo, Tokyo, Japan
  3. Tokyo Medical and Dental University, Tokyo, Japan
  4. National Institute of Advanced Industrial Science and Technology, Tokyo, Japan