The Pulse Breath Water System: Exploring Breathing as an Embodied Interaction for Enhancing the Affective Potential of Virtual Reality

  • Mirjana Prpa
  • Kıvanç Tatar
  • Bernhard E. Riecke
  • Philippe Pasquier
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10280)

Abstract

We introduce Pulse Breath Water, an immersive virtual environment (VE) with affect estimation in sound. The system supports embodied interaction between a user and the environment by mapping the user’s breathing frequencies to the system’s behaviour. In this study we investigate how two different embodied-interaction mappings (metaphoric and “reverse”) might enhance the affective properties of the presented system. We build on previous work in embodied cognition, embodied interaction, and affect estimation in sound by examining the impact of affective audiovisuals and the two interaction mappings on the user’s engagement, affective states, and overall experience. Insights gained through questionnaires and semi-structured interviews are discussed in the context of participants’ lived experience and the limitations of the system to be addressed in future work.
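The core interaction described above can be pictured as a direct versus an inverted mapping from breathing rate to a system control parameter. The sketch below is purely illustrative: the function names, the breathing-rate range, and the idea of a single normalized "activity" parameter are assumptions for explanation, not the authors' actual implementation.

```python
# Hypothetical sketch of the two mappings from the abstract:
# "metaphoric" (faster breathing -> more system activity) and
# "reverse" (faster breathing -> less system activity).
# The 6-30 breaths/min range and the single scalar output are
# illustrative assumptions, not details from the paper.

def normalize_breath_rate(breaths_per_min, lo=6.0, hi=30.0):
    """Clamp a breathing rate to [lo, hi] and scale it to [0, 1]."""
    x = (breaths_per_min - lo) / (hi - lo)
    return max(0.0, min(1.0, x))

def metaphoric_mapping(breaths_per_min):
    """Direct mapping: calmer breathing yields a calmer system."""
    return normalize_breath_rate(breaths_per_min)

def reverse_mapping(breaths_per_min):
    """Inverted mapping: calmer breathing yields a more active system."""
    return 1.0 - normalize_breath_rate(breaths_per_min)
```

Under this reading, the study's comparison amounts to swapping which of the two functions drives the audiovisual behaviour while the breath signal itself stays the same.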

Keywords

Virtual Reality · Virtual Environment · Affective Property · Audio Sample · Immersive Virtual Environment

Acknowledgements

We thank all the study participants for their involvement, and the MovingStories SSHRC research project for their support while working on this piece.

References

  1. Thought Technology Ltd.: ProComp2 - 2 Channel Biofeedback & Neurofeedback System w/BioGraph Infiniti Software. Thought Technology Ltd., April 2015. http://thoughttechnology.com/index.php/procomp2-2-channel-biofeedback-neurofeedback-system-w-biograph-infiniti-software.html
  2. Arroyo-Palacios, J., Romano, D.M.: Exploring the use of a respiratory-computer interface for game interaction. In: 2009 International IEEE Consumer Electronics Society’s Games Innovations Conference, pp. 154–159, August 2009
  3. Asutay, E., Västfjäll, D.: Perception of loudness is influenced by emotion. PLoS ONE 7, 6 (2012)
  4. Beckhaus, S., Kruijff, E.: Unconventional human computer interfaces. In: ACM SIGGRAPH 2004 Course Notes. ACM, New York (2004)
  5. Berglund, B., Nilsson, M.E., Axelsson, Ö.: Soundscape psychophysics in place. IN07-114
  6. Bradley, M.M., Lang, P.J.: Affective reactions to acoustic stimuli. Psychophysiology 37(2), 204–215 (2000)
  7. Brockmyer, J.H., Fox, C.M., Curtiss, K.A., McBroom, E., Burkhart, K.M., Pidruzny, J.N.: The development of the game engagement questionnaire: a measure of engagement in video game-playing. J. Exp. Soc. Psychol. 45(4), 624–634 (2009)
  8. Davies, C.: OSMOSE: notes on being in immersive virtual space. Digital Creativity 9(2), 65–74 (1998)
  9. Dourish, P.: Where the Action Is. MIT Press, Cambridge (2004)
  10. Eerola, T., Vuoskoski, J.K.: A review of music and emotion studies: approaches, emotion models, and stimuli. Music Percept. Interdisc. J. 30(3), 307–340 (2013)
  11. Ekkekakis, P.: Should affective states be considered as distinct entities or as positioned along dimensions? In: The Measurement of Affect, Mood, and Emotion: A Guide for Health-Behavioral Research, pp. 52–72. Cambridge University Press, February 2013
  12. England, D., Randles, M., Fergus, P., Taleb-Bendiab, A.: Towards an advanced framework for whole body interaction. In: Shumaker, R. (ed.) VMR 2009. LNCS, vol. 5622, pp. 32–40. Springer, Heidelberg (2009). doi: 10.1007/978-3-642-02771-0_4
  13. Evreinov, G., Evreinova, T.: “Breath-Joystick” - graphical manipulator for physically disabled users. In: Proceedings of the ICCHP 2000, pp. 193–200 (2000)
  14. Fan, J., Thorogood, M., Pasquier, P.: Automatic soundscape affect recognition using a dimensional approach. J. Audio Eng. Soc. 64(9), 646–653 (2016)
  15. Flach, J.M., Holden, J.G.: The reality of experience: Gibson’s way. Presence: Teleoperators Virtual Environ. 7(1), 90–95 (1998)
  16. Gaver, W.W., Beaver, J., Benford, S.: Ambiguity as a resource for design. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI 2003, pp. 233–240. ACM, New York (2003)
  17. Kandel, E.: Reductionism in Art and Brain Science: Bridging the Two Cultures. Columbia University Press, New York (2016)
  18. Kirsh, D.: Embodied cognition and the magical future of interaction design. ACM Trans. Comput.-Hum. Interact. 20(1), 3:1–3:30 (2013)
  19. Kruijff, E.: Unconventional 3D user interfaces for virtual environments. Doctoral Dissertation, October 2006
  20. Kuzume, K.: Input device for disabled persons using expiration and tooth-touch sound signals. In: Proceedings of the 2010 ACM Symposium on Applied Computing, pp. 1159–1164. ACM (2010)
  21. Macaranas, A., Antle, A.N., Riecke, B.E.: What is intuitive interaction? Balancing users’ performance and satisfaction with natural user interfaces. Interact. Comput. 27(3), 357–370 (2015)
  22. Marteau, T.M., Bekker, H.: The development of a six-item short-form of the state scale of the Spielberger State-Trait Anxiety Inventory (STAI). Br. J. Clin. Psychol. 31(3), 301–306 (1992)
  23. Media Art Net: Gabriel, Ulrike: Breath, 1992/93, December 2016
  24. Neustaedter, C., Sengers, P.: Autobiographical design in HCI research: designing and learning through use-it-yourself. In: Proceedings of the Designing Interactive Systems Conference, pp. 514–523. ACM (2012)
  25. Posner, J., Russell, J.A., Peterson, B.S.: The circumplex model of affect: an integrative approach to affective neuroscience, cognitive development, and psychopathology. Dev. Psychopathol. 17(3), 715–734 (2005)
  26. Robertson, T.: Cooperative work and lived cognition: a taxonomy of embodied actions. In: Proceedings of the Fifth European Conference on Computer Supported Cooperative Work, pp. 205–220. Springer, Netherlands (1997). doi: 10.1007/978-94-015-7372-6_14
  27. Russell, J., Weiss, A., Mendelsohn, G.: Affect grid - a single-item scale of pleasure and arousal. J. Pers. Soc. Psychol. 57(3), 493–502 (1989)
  28. Schiphorst, T.: Breath, skin and clothing: using wearable technologies as an interface into ourselves. Int. J. Perform. Arts Digital Media 2(2), 171–186 (2006)
  29. Shorrock, T.H., MacKay, D.J.C., Ball, C.J.: Efficient communication by breathing. In: Winkler, J., Niranjan, M., Lawrence, N. (eds.) DSMML 2004. LNCS (LNAI), vol. 3635, pp. 88–97. Springer, Heidelberg (2005). doi: 10.1007/11559887_5
  30. Slater, M.: Place illusion and plausibility can lead to realistic behaviour in immersive virtual environments. Philos. Trans. R. Soc. Lond. B Biol. Sci. 364(1535), 3549–3557 (2009)
  31. Slater, M., Sanchez-Vives, M.V.: Enhancing our lives with immersive virtual reality. Front. Robot. AI 3, 6 (2016)
  32. Sonne, T., Jensen, M.M.: ChillFish: a respiration game for children with ADHD. In: Proceedings of the Tenth International Conference on Tangible, Embedded, and Embodied Interaction, TEI 2016, pp. 271–278. ACM (2016)
  33. Tajadura-Jimenez, A., Väljamäe, A., Västfjäll, D.: Self-representation in mediated environments: the experience of emotions modulated by auditory-vibrotactile heartbeat. Cyberpsychol. Behav. 11(1), 33–38 (2008)
  34. Tennent, P., Rowland, D., Marshall, J., Egglestone, S.R., Harrison, A., Jaime, Z., Walker, B., Benford, S.: Breathalising games: understanding the potential of breath control in game interfaces. In: Proceedings of the 8th International Conference on Advances in Computer Entertainment Technology, p. 58. ACM (2011)
  35. Thorogood, M., Pasquier, P.: Impress: a machine learning approach to soundscape affect classification, pp. 256–260
  36. Waterworth, E.L., Häggkvist, M., Jalkanen, K., Olsson, S., Waterworth, J.A., Wimelius, H.: The exploratorium: an environment to explore your feelings. PsychNology J. 1(3), 189–201 (2003)
  37. Wilson, S.: Information Arts: Intersections of Art, Science, and Technology. Leonardo. MIT Press, Cambridge (2002)
  38. Zahorik, P., Jenison, R.L.: Presence as being-in-the-world. Presence 7(1), 78–89 (1998)
  39. Zimmerman, J., Forlizzi, J., Evenson, S.: Research through design as a method for interaction design research in HCI. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 493–502. ACM (2007)

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Mirjana Prpa (1)
  • Kıvanç Tatar (1)
  • Bernhard E. Riecke (1)
  • Philippe Pasquier (1)
  1. School of Interactive Arts and Technology, Simon Fraser University, Surrey, Canada