
Toward Detection of Driver Drowsiness with Commercial Smartwatch and Smartphone

  • Liangliang Lin (Email author)
  • Hongyu Yang
  • Yang Liu
  • Haoyuan Zheng
  • Jizhong Zhao
Conference paper
Part of the Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering book series (LNICST, volume 303)

Abstract

In daily life, there are many objects that cannot actively communicate with us, such as keychains, glasses and mobile phones. These are generally referred to as non-cooperative targets. Non-cooperative targets are often overlooked by users and are hard to find, so it would be convenient to be able to localize them. We propose a non-cooperative target localization system based on MEMS sensors. We detect changes in the user's arm posture using the MEMS sensors embedded in a smartwatch: the system first distinguishes arm motions, identifies the final motion, and then performs localization. The system comprises two essential models. The first is an arm gesture estimation model based on the MEMS sensors in the smartwatch. We first collect MEMS sensor data from the watch, then build an arm kinematic model and formulate the mathematical relationship between the arm's degrees of freedom and the pose of the watch. We compare the results for the four actions that matter in the later model against Kinect observations; the spatial errors are less than 0.14 m. The second is a non-cooperative target localization model built on the first step. We use the five-degree-of-freedom arm data to train a classification model and identify the key actions in the scene, and then estimate the location of non-cooperative targets from the type of interactive action. To demonstrate the effectiveness of our system, we implement it to track keys and mobile phones in practice. The experiments show that the localization accuracy exceeds 83%.
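The arm gesture estimation step maps smartwatch MEMS readings onto a kinematic model of the arm. The paper's own equations are not reproduced on this page, so the following is a minimal illustrative sketch only: it assumes a simplified five-degree-of-freedom arm (three shoulder angles, two elbow angles) with hypothetical segment lengths and angle conventions, and shows how joint angles could be mapped to a wrist position for comparison against a Kinect measurement.

```python
# Illustrative sketch (not the authors' implementation): forward kinematics of a
# simplified 5-DOF arm model, mapping joint angles inferred from smartwatch MEMS
# data to a wrist position in the shoulder frame. Segment lengths, rotation
# order, and the sample values below are assumptions for demonstration only.
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def wrist_position(angles, upper_arm=0.30, forearm=0.25):
    """angles: (shoulder_yaw, shoulder_pitch, shoulder_roll,
               elbow_flexion, elbow_pronation) in radians.
    Returns the wrist position in the shoulder frame (metres)."""
    sy, sp, sr, ef, ep = angles
    r_shoulder = rot_z(sy) @ rot_y(sp) @ rot_x(sr)          # shoulder orientation
    elbow = r_shoulder @ np.array([0.0, 0.0, -upper_arm])   # elbow position
    r_elbow = r_shoulder @ rot_y(ef) @ rot_z(ep)             # forearm orientation
    return elbow + r_elbow @ np.array([0.0, 0.0, -forearm])  # wrist position

# Example: compare the model output with a Kinect observation of the same pose.
est = wrist_position((0.2, 0.8, 0.1, 1.0, 0.0))
kinect = np.array([0.05, 0.21, -0.38])  # hypothetical ground-truth sample
print("estimation error (m):", np.linalg.norm(est - kinect))
```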

Keywords

Arm gesture · Non-cooperative target · Localization · Smartwatch


Copyright information

© ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering 2019

Authors and Affiliations

  • Liangliang Lin (Email author) (1, 2)
  • Hongyu Yang (1)
  • Yang Liu (1)
  • Haoyuan Zheng (1)
  • Jizhong Zhao (1)

  1. School of Computer Science and Technology, Department of Telecommunications, Xi’an Jiaotong University, Xi’an, People’s Republic of China
  2. Information Department, Xi’an Conservatory of Music, Xi’an, People’s Republic of China