Hands-Free Interface Using Breath Residual Heat

  • Kanghoon Lee
  • Sang Hwa Lee
  • Jong-Il Park
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10904)


Most user interfaces are based on hand gestures or finger touches, but hand-based interfaces cannot accommodate every user situation. In this paper, we propose a hands-free interaction system that uses a thermal camera. Because the proposed interface exploits the user's breath heat, it is especially useful for users who have difficulty using their hands. In addition, a thermal camera is unaffected by background color and lighting conditions, so the interface can be used in a wide range of environments. For hands-free interaction, the user leaves residual heat on the surface of the object to be interacted with, and the thermal camera senses that residual heat. We observed that the residual heat from breath is the most suitable signal for this interaction design. To confirm this, we tested several methods of generating strong residual heat on various materials and verified that the residual heat produced by breathing through a hollow rod (a straw) is the most stable for sensing and interaction. We demonstrate the usefulness of the approach by implementing an interaction system based on a projector-camera system as an application example.
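The sensing step described above — detecting a warm spot that breath leaves on a surface — can be sketched as a simple threshold over a thermal frame. The sketch below is illustrative only, not the authors' implementation: the function name, the 2 °C threshold, and the synthetic frame are all assumptions; a real system would read frames from a thermal camera and track blobs over time.

```python
import numpy as np

def detect_residual_heat(frame, ambient_c, delta_c=2.0):
    """Return the (row, col) centroid of pixels warmer than ambient
    by at least delta_c degrees, or None if no residual heat is found.
    frame: 2-D array of per-pixel temperatures in Celsius (assumed)."""
    mask = frame > (ambient_c + delta_c)
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return (float(rows.mean()), float(cols.mean()))

# Simulated 8x8 thermal frame at 22 C ambient with a warm spot
# left by exhaled breath (values are hypothetical).
frame = np.full((8, 8), 22.0)
frame[3:5, 4:6] = 30.0          # residual heat from breath
touch = detect_residual_heat(frame, ambient_c=22.0)
print(touch)                    # centroid of the warm region
```

In a projector-camera setup, the detected centroid would then be mapped into projector coordinates (e.g., via a precomputed homography) to treat the warm spot as a touch point.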


Keywords: Hands-free interface · Residual heat · Breath heat · Thermal camera



This work was supported by an Institute for Information & communications Technology Promotion (IITP) grant funded by the Korea government (MSIT) (No. 2017-0-01849, Development of Core Technology for Real-Time Image Composition in Unstructured In-outdoor Environment).



Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Hanyang University, Seoul, Korea
  2. INMC, Seoul National University, Seoul, Korea
