A Balloon-Shaped Interface Recognizing Social Touch Interactions
  • Kosuke Nakajima
  • Yuichi Itoh
  • Yusuke Hayashi
  • Kazuaki Ikeda
  • Kazuyuki Fujita
  • Takao Onoye
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8253)


People often communicate with others through social touch interactions such as hugging, rubbing, and punching. We propose a soft interface called “Emoballoon” that can recognize the type of a social touch interaction. The proposed interface consists of a balloon and several sensors, including a barometric pressure sensor inside the balloon; it offers a soft surface and can detect the force of touch input. We construct a prototype of Emoballoon using a simple configuration that exploits the characteristics of a balloon, and evaluate the implemented prototype. The evaluation indicates that our implementation can distinguish seven types of touch interactions with 83.5% accuracy. Finally, we discuss possibilities and future applications of balloon-based interfaces.
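The recognition pipeline the abstract describes — summarizing the balloon's internal pressure signal into features and classifying them into touch types — can be sketched roughly as follows. This is not the authors' code: the paper trains an SVM (via LIBSVM), whereas this stand-in uses a simple nearest-centroid classifier over hypothetical pressure traces so the example needs only the standard library; the gesture labels, trace values, and feature choices here are illustrative assumptions.

```python
# Illustrative sketch: classifying social-touch gestures from a balloon's
# internal barometric pressure signal. The paper uses an SVM (LIBSVM);
# a nearest-centroid classifier stands in here for brevity.
import math

def features(signal):
    """Summarize a pressure trace as (peak, mean, standard deviation)."""
    peak = max(signal)
    mean = sum(signal) / len(signal)
    var = sum((x - mean) ** 2 for x in signal) / len(signal)
    return (peak, mean, math.sqrt(var))

# Hypothetical training traces (pressure deviation from baseline):
# a punch is a short sharp spike, a hug a long moderate rise,
# a rub a low-amplitude oscillation.
training = {
    "punch": [[0, 0, 9, 1, 0, 0, 0, 0], [0, 8, 1, 0, 0, 0, 0, 0]],
    "hug":   [[0, 2, 3, 3, 3, 3, 2, 0], [0, 2, 3, 4, 4, 3, 2, 0]],
    "rub":   [[0, 1, 2, 1, 2, 1, 2, 0], [0, 2, 1, 2, 1, 2, 1, 0]],
}

# Average the feature vectors of each class into a centroid.
centroids = {}
for label, traces in training.items():
    feats = [features(t) for t in traces]
    centroids[label] = tuple(sum(f[i] for f in feats) / len(feats)
                             for i in range(3))

def classify(signal):
    """Assign the label whose centroid is nearest in feature space."""
    f = features(signal)
    return min(centroids, key=lambda label: math.dist(f, centroids[label]))

print(classify([0, 0, 10, 0, 0, 0, 0, 0]))  # sharp spike -> "punch"
```

In the paper's actual setup the feature vector would be computed from the barometric pressure sensor stream and fed to a trained SVM, which is what yields the reported 83.5% accuracy over seven touch types.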


Soft interface · Social touch interaction · Gesture recognition





Copyright information

© Springer International Publishing Switzerland 2013

Authors and Affiliations

  • Kosuke Nakajima (1)
  • Yuichi Itoh (1)
  • Yusuke Hayashi (1)
  • Kazuaki Ikeda (1)
  • Kazuyuki Fujita (1)
  • Takao Onoye (1)
  1. Suita, Japan
