NASR: NonAuditory Speech Recognition with Motion Sensors in Head-Mounted Displays

  • Jiaxi Gu
  • Kele Shen
  • Jiliang Wang
  • Zhiwen Yu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10874)

Abstract

With the growing popularity of Virtual Reality (VR), people spend more and more time wearing Head-Mounted Displays (HMDs) for an immersive experience. An HMD is physically attached to the wearer's head so that head motion can be tracked. We find that it can also detect subtle movements of facial muscles, which are strongly related to speech according to the mechanism of phonation. Inspired by this observation, we propose NonAuditory Speech Recognition (NASR), which uses motion sensors to recognize spoken words. Unlike most prior speech recognition work, which uses a microphone to capture the audio signal for analysis, NASR is resistant to ambient acoustic noise because of its nonauditory mechanism. Without a microphone, it consumes less power and requires no special permissions in most operating systems. Moreover, NASR can be seamlessly integrated into existing speech recognition systems. In extensive experiments, NASR achieves up to 90.97% precision with an 82.98% recall rate for speech recognition.
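To make the idea concrete, below is a minimal sketch of how HMD motion-sensor windows could be mapped to spoken words with a standard machine-learning pipeline. The windowing, hand-crafted time/frequency features, and SVM classifier are illustrative assumptions on our part; the abstract does not specify the paper's actual feature set or model.

```python
# Hypothetical NASR-style pipeline: classify spoken words from HMD motion data.
# Assumed input: fixed-length windows of 6-axis IMU samples (accelerometer + gyroscope)
# recorded while known words were spoken. The feature set and SVM are illustrative only.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def extract_features(window):
    """window: (n_samples, 6) array of accelerometer and gyroscope readings."""
    feats = []
    for axis in range(window.shape[1]):
        sig = window[:, axis]
        spectrum = np.abs(np.fft.rfft(sig))
        # Simple time-domain statistics plus coarse low-frequency spectral energy.
        feats.extend([sig.mean(), sig.std(), sig.min(), sig.max(),
                      spectrum[:10].sum()])
    return np.array(feats)

def train_word_classifier(windows, labels):
    """windows: list of IMU windows; labels: the word spoken in each window."""
    X = np.vstack([extract_features(w) for w in windows])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    clf.fit(X, labels)
    return clf
```

A trained classifier of this kind would then be applied to incoming motion windows at runtime, with no microphone input involved, which is what makes the approach robust to acoustic noise and free of audio-recording permissions.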

Keywords

Head-Mounted Display · Motion sensor · Speech recognition · Machine learning


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. School of Computer Science and Engineering, Northwestern Polytechnical University, Xi'an, People's Republic of China
  2. School of Software, Tsinghua University, Beijing, People's Republic of China
