SofTouch: Turning Soft Objects into Touch Interfaces Using Detachable Photo Sensor Modules

  • Naomi Furui
  • Katsuhiro Suzuki
  • Yuta Sugiura
  • Maki Sugimoto

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10507)


We propose a system that non-intrusively turns everyday soft objects, such as cushions, into touch interfaces. The belt-type sensor modules developed in this study, which attach to the outside of a soft object, comprise several photo reflective sensors, each pairing an infrared LED with a photodetector to measure the intensity of light reflected from the object's surface. For a surface with a fixed reflection coefficient, the measured reflection intensity is inversely proportional to the square of the distance to the surface. Our method therefore uses the sensor modules to measure changes in the distance from each sensor to the surface of the soft object, and the touch position is estimated with a Support Vector Machine (SVM). To evaluate the method, we measured classification accuracy when touching nine points on a cushion. Because the system reuses existing everyday soft objects, it can create interfaces that not only blend naturally into the living space but also match the preferences of each user.


Keywords: Soft user interface · Photo reflective sensor · Touch interface
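The sensing pipeline summarized in the abstract has two stages: inverting the inverse-square law to turn each sensor's reflection intensity into a distance, then classifying the resulting distance pattern as a touch position. The following is a minimal Python sketch of those two stages; the calibration constant, sensor counts, and training values are illustrative assumptions (not from the paper), and a nearest-centroid rule stands in for the SVM classifier the authors actually use.

```python
import math

# Hypothetical calibration constant k in intensity = k / d**2, valid for one
# surface with a fixed reflection coefficient (illustrative, not from the paper).
K = 100.0

def distance_from_intensity(intensity, k=K):
    """Invert the inverse-square law: intensity = k / d**2  =>  d = sqrt(k / intensity)."""
    return math.sqrt(k / intensity)

# Pressing each touch point deforms the cushion into a characteristic pattern of
# sensor-to-surface distances. Illustrative training data: 3 sensors, 3 touch points.
TRAINING = {
    "point_A": [1.0, 2.0, 3.0],
    "point_B": [3.0, 1.0, 2.0],
    "point_C": [2.0, 3.0, 1.0],
}

def classify(distances):
    """Nearest-centroid stand-in for the paper's SVM touch-position classifier."""
    return min(
        TRAINING,
        key=lambda label: sum((d - t) ** 2 for d, t in zip(distances, TRAINING[label])),
    )

# Example: raw intensities from 3 sensors -> distances -> estimated touch point.
intensities = [100.0, 25.0, 11.1]
distances = [distance_from_intensity(i) for i in intensities]
print(classify(distances))
```

In the paper the classifier is trained on labeled sensor readings for the nine touch points on the cushion; here the nearest-centroid rule simply illustrates how a distance pattern maps to a position label.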



Copyright information

© IFIP International Federation for Information Processing 2017

Authors and Affiliations

Naomi Furui, Katsuhiro Suzuki, Yuta Sugiura, and Maki Sugimoto
Keio University, Yokohama, Japan
