Hover Detection Using Active Acoustic Sensing

  • Masaya Tsuruta
  • Shuhei Aoyama
  • Arika Yoshida
  • Buntarou Shizuki
  • Jiro Tanaka
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9732)

Abstract

In this paper, we present a technique for hover and touch detection using Active Acoustic Sensing. This sensing technique analyzes the resonant properties of the target object and of the air around it. To verify that the technique works, we conducted an experiment to discriminate between hovering a hand over piezoelectric elements placed on a target object and touching the same object. In this experiment, hovering was detected with 96.7 % accuracy and touching with 100 % accuracy.
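
To make the sensing pipeline concrete, the sketch below shows one way such a hover/touch classifier could be assembled: a known sweep is driven through the piezoelectric element, the magnitude spectrum of the recorded response (which encodes the object's resonance) serves as the feature vector, and a classifier separates the "none", "hover", and "touch" conditions. This is a minimal sketch under stated assumptions: the FFT size, the synthetic placeholder recordings, and the choice of an RBF-kernel SVM are all illustrative, since the abstract does not specify the authors' exact pipeline.

    # Minimal sketch of hover/touch discrimination via active acoustic
    # sensing. All concrete values below (FFT size, placeholder data,
    # RBF-kernel SVM) are assumptions for illustration only.
    import numpy as np
    from sklearn.svm import SVC

    N_FFT = 4096  # assumed FFT size for the resonance spectrum

    def spectrum_features(response):
        """Normalized magnitude spectrum of one recorded sweep response.
        A hovering or touching hand damps and shifts the resonant peaks,
        so the spectrum itself works as a feature vector."""
        win = response[:N_FFT] * np.hanning(N_FFT)
        mag = np.abs(np.fft.rfft(win))
        return mag / (np.linalg.norm(mag) + 1e-12)  # normalize out gain drift

    # Hypothetical training data: in practice, each row would be a response
    # recorded under the "none", "hover", or "touch" condition.
    rng = np.random.default_rng(0)
    responses = rng.standard_normal((30, N_FFT))
    labels = np.repeat(["none", "hover", "touch"], 10)

    X = np.vstack([spectrum_features(r) for r in responses])
    clf = SVC(kernel="rbf").fit(X, labels)

    def classify(response):
        """Label a new sweep response as none, hover, or touch."""
        return clf.predict(spectrum_features(response)[None, :])[0]

In a deployed system, classify would be called on each successive sweep response to track the hand state continuously; the per-class accuracies reported above correspond to how often the predicted label matches the true hand state.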

Keywords

Prototyping · Everyday surfaces · Touch activities · Acoustic classification · Machine learning · Frequency analysis · Ultrasonic · Piezoelectric sensor · Proximity sensing

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Masaya Tsuruta (1) (corresponding author)
  • Shuhei Aoyama (1)
  • Arika Yoshida (1)
  • Buntarou Shizuki (1)
  • Jiro Tanaka (1)

  1. University of Tsukuba, Tsukuba, Japan