Embedded Smarthouse Bathroom Entertainment Systems for Improving Quality of Life

  • Shigeyuki Hirai
Part of the Human–Computer Interaction Series book series (HCIS)


The phrase “Smarthouse to improve the smartness of a human’s daily life” has two meanings. One is to improve individual smartness, which represents Quality of Life (QoL); the other is to improve social smartness, which includes human communication and social consumption. This chapter primarily describes entertainment systems that can be embedded in the bathrooms of smarthouses and used in everyday life to improve QoL. The systems include “Bathonify,” a sonification system that reflects the bathing states and vital signs of the bather; “TubTouch,” a bathtub entertainment system that uses embedded touch sensors and a projector to control various equipment and systems; and “Bathcratch,” a DJ scratching music system operated by rubbing and touching the bathtub. Although these systems are rooted in Japanese bathing culture and style, they advance the pleasures of everyday life. In addition, these embedded systems and their techniques advance computer entertainment platforms that can be extended to various places and situations.
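As a rough illustration of the sonification idea behind Bathonify (a hypothetical sketch, not the authors' actual design), bathing-state sensor readings might be mapped to musical parameters along these lines; the sensor names, value ranges, and mappings here are illustrative assumptions:

```python
# Hypothetical sonification sketch: map bathing-state sensor readings
# to musical parameters (MIDI-style pitch, tempo, and velocity).

def sonify(water_temp_c, heart_rate_bpm, motion_level):
    """Map assumed sensor readings to (pitch, tempo, volume).

    water_temp_c:   bath water temperature in Celsius (assumed 30-45)
    heart_rate_bpm: bather's heart rate in beats per minute
    motion_level:   normalized water-motion intensity in [0, 1]
    """
    # Warmer water -> higher pitch, scaled onto MIDI notes 48-84.
    t = max(30.0, min(45.0, water_temp_c))
    pitch = round(48 + (t - 30.0) / 15.0 * 36)

    # Tempo follows the heart rate, clamped to a musical range.
    tempo = max(40, min(180, int(heart_rate_bpm)))

    # Stronger water motion -> louder output (MIDI velocity 0-127).
    volume = round(max(0.0, min(1.0, motion_level)) * 127)

    return pitch, tempo, volume

print(sonify(40.0, 72, 0.5))  # -> (72, 72, 64)
```

The point of such a mapping is that the bather hears a continuous ambient reflection of their own state, rather than reading numeric displays in the bathroom.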


Keywords: Smarthouse · Bathroom · Embedded sensors · Interactive sonification · Media arts



This research was partially supported by Osaka Gas Co., Ltd. and a grant from the Hayao Nakayama Foundation.



Copyright information

© Springer-Verlag London 2014

Authors and Affiliations

  1. Faculty of Computer Science and Engineering, Kyoto Sangyo University, Kyoto, Japan
