Proposal of Interaction Using Breath on Tablet Device

  • Makoto Oka
  • Hirohiko Mori
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10904)

Abstract

We propose an interaction in which a user operates an information terminal by blowing on its screen. To realize this, we propose and evaluate a device for detecting breath and an algorithm for identifying the type of breath. Although breath-operated input devices have been studied before, those systems require the user to blow toward a dedicated input sensor rather than at the touch panel itself, as in ordinary touch operation. Our proposed system detects a breath blown toward the screen of an information terminal, allowing the user to perform operations such as selecting and confirming objects displayed on the screen. In this study, we propose a breath interaction that assigns different kinds of breath to different operations of a tablet terminal.
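The abstract does not disclose the detection algorithm itself. As a rough illustration of the general idea only, a common heuristic for sensing a blow with a device microphone is a sustained burst of signal energy. The sketch below is a minimal assumption-based example, not the authors' method: all function names, thresholds, and durations are invented for illustration.

```python
import math
import random

def detect_blow(samples, sample_rate, frame_ms=20,
                rms_threshold=0.1, min_duration_ms=200):
    """Heuristic blow detector (illustrative only, not the paper's algorithm).

    Splits a mono signal (floats in [-1, 1]) into short frames and reports
    True when the per-frame RMS energy stays above `rms_threshold` for at
    least `min_duration_ms` in a row -- the sustained loudness that
    distinguishes a blow from brief ambient noise.
    """
    frame_len = int(sample_rate * frame_ms / 1000)
    needed = min_duration_ms // frame_ms  # consecutive loud frames required
    run = 0
    for start in range(0, len(samples) - frame_len + 1, frame_len):
        frame = samples[start:start + frame_len]
        rms = math.sqrt(sum(x * x for x in frame) / frame_len)
        run = run + 1 if rms > rms_threshold else 0
        if run >= needed:
            return True
    return False

# Synthetic check: one second of near-silence vs. one second of loud
# noise standing in for a blow hitting the microphone.
rng = random.Random(0)
rate = 16000
quiet = [rng.gauss(0, 0.01) for _ in range(rate)]
blow = [rng.gauss(0, 0.3) for _ in range(rate)]
print(detect_blow(quiet, rate))  # False
print(detect_blow(blow, rate))   # True
```

In a real system, mapping such detections to operations like selection and confirmation would further require distinguishing breath types (e.g. by strength or duration), which is what the proposed identification algorithm addresses.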

Keywords

Tablet device · Breath · Interaction · User interface


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Tokyo City University, Tokyo, Japan
