Real-Time Emotional State Estimator for Adaptive Virtual Reality Stimulation

  • Davor Kukolja
  • Siniša Popović
  • Branimir Dropuljić
  • Marko Horvat
  • Krešimir Ćosić
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5638)

Abstract

The paper presents the design and evaluation of an emotional state estimator, based on artificial neural networks, for physiology-driven adaptive virtual reality (VR) stimulation. Real-time emotional state estimation from physiological signals enables adaptation of the stimulation to the emotional response of each individual. The estimation is first evaluated on artificial subjects, which are convenient during software development and testing of physiology-driven adaptive VR stimulation. The artificial subjects are implemented as parameterized skin conductance and heart rate generators that respond to emotional inputs. These emotional inputs are a temporal sequence of valence/arousal annotations, which quantitatively express emotion along the unpleasant-pleasant and calm-aroused axes. A preliminary evaluation of the emotional state estimation is also performed with a limited set of human subjects. Human physiological signals are acquired during simultaneous presentation of static pictures and sounds from the valence/arousal-annotated International Affective Picture System and International Affective Digitized Sounds databases.
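The role of an artificial subject can be illustrated with a minimal sketch, assuming a simple first-order response model: a toy generator that turns a valence/arousal annotation into synthetic heart-rate and skin-conductance traces. The class name, the parameter values, and the linear target mapping below are illustrative assumptions, not the generators described in the paper.

```python
# Minimal sketch only: a toy "artificial subject" mapping a valence/arousal
# annotation to synthetic heart-rate (HR) and skin-conductance (SC) traces.
# All parameter values and the linear mapping are assumptions for illustration.
import numpy as np


class ToyArtificialSubject:
    def __init__(self, base_hr=70.0, base_scl=2.0, hr_gain=15.0, scl_gain=1.5,
                 tau=3.0, noise=0.05, fs=10.0):
        self.base_hr = base_hr    # resting heart rate [bpm] (assumed)
        self.base_scl = base_scl  # tonic skin conductance level [uS] (assumed)
        self.hr_gain = hr_gain    # HR increase per unit arousal [bpm] (assumed)
        self.scl_gain = scl_gain  # SC increase per unit arousal [uS] (assumed)
        self.tau = tau            # first-order response time constant [s]
        self.noise = noise        # relative measurement noise level
        self.fs = fs              # sampling rate [Hz]

    def respond(self, valence, arousal, duration):
        """Generate HR and SC traces for one stimulus annotated with the given
        valence and arousal (both assumed normalized to [-1, 1])."""
        n = int(duration * self.fs)
        t = np.arange(n) / self.fs
        rise = 1.0 - np.exp(-t / self.tau)   # first-order rise toward target
        hr_target = self.base_hr + self.hr_gain * arousal - 2.0 * valence
        sc_target = self.base_scl + self.scl_gain * max(arousal, 0.0)
        hr = self.base_hr + (hr_target - self.base_hr) * rise
        sc = self.base_scl + (sc_target - self.base_scl) * rise
        hr += self.noise * self.base_hr * np.random.randn(n)  # additive noise
        sc += self.noise * self.base_scl * np.random.randn(n)
        return t, hr, sc


# Example: a pleasant-calm stimulus followed by an unpleasant-aroused one.
subject = ToyArtificialSubject()
for v, a in [(0.8, -0.5), (-0.7, 0.9)]:
    t, hr, sc = subject.respond(valence=v, arousal=a, duration=10.0)
    print(f"v={v:+.1f} a={a:+.1f}  mean HR={hr.mean():.1f} bpm  "
          f"mean SC={sc.mean():.2f} uS")
```

Traces produced by such a generator could, in principle, be used to train and test a neural-network estimator that maps physiological features back to valence/arousal, which is the evaluation setting the abstract describes.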

Keywords

Real-Time Emotional State Estimator · Adaptive Virtual Reality Stimulation · Artificial Neural Network · Stimuli Generation · Physiological Measurements

References

  1. Ćosić, K., Popović, S., Jovanovic, T., Kukolja, D., Slamić, M.: Physiology-Driven Adaptive VR System: Technology and Rationale for PTSD Treatment. Annual Review of CyberTherapy and Telemedicine 5, 179–191 (2007)
  2. Popovic, S., Slamic, M., Cosic, K.: Scenario Self-Adaptation in Virtual Reality Exposure Therapy of Posttraumatic Stress Disorder. In: Roy, M.J. (ed.) Novel Approaches to the Diagnosis and Treatment of Posttraumatic Stress Disorder. NATO Security through Science Series - E: Human and Societal Dynamics, vol. 6, pp. 135–147. IOS Press, Amsterdam (2006)
  3. Schuemie, M.J., van der Mast, C.A.P.G., Krijn, M., Emmelkamp, P.M.G.: Exploratory Design and Evaluation of a User Interface for Virtual Reality Exposure Therapy. In: Westwood, J.D., Hoffman, H.M., Robb, R.A., Stredney, D. (eds.) Medicine Meets Virtual Reality 02/10, pp. 468–474. IOS Press, Amsterdam (2002)
  4. Gerardi, M., Rothbaum, B.O., Ressler, K., Heekin, M., Rizzo, A.: Virtual Reality Exposure Therapy Using a Virtual Iraq: Case Report. Journal of Traumatic Stress 21(2), 209–213 (2008)
  5. Wood, D.P., Murphy, J.A., Center, K.B., Russ, C., McClay, R.N., Reeves, D., Pyne, J., Shilling, R., Hagan, J., Wiederhold, B.K.: Combat Related Post-Traumatic Stress Disorder: A Multiple Case Report using Virtual Reality Graded Exposure Therapy with Physiological Monitoring. Medicine Meets Virtual Reality 16, 556–561 (2008)
  6. Russell, J.A.: A Circumplex Model of Affect. Journal of Personality and Social Psychology 39, 1161–1178 (1980)
  7. Lang, P.J., Bradley, M.M., Cuthbert, B.N.: International Affective Picture System (IAPS): Affective Ratings of Pictures and Instruction Manual. Technical Report A-6. University of Florida, Gainesville, FL (2005)
  8. Bradley, M.M., Lang, P.J.: The International Affective Digitized Sounds, 2nd edn. (IADS-2): Affective Ratings of Sounds and Instruction Manual. Technical Report B-3. University of Florida, Gainesville, FL (2007)
  9. Horvat, M., Popović, S., Bogunović, N., Ćosić, K.: Tagging Multimedia Stimuli with Ontologies. In: MIPRO (submitted, 2009)
  10. Popović, S., Horvat, M., Ćosić, K.: Generator of Audio-Visual Stimuli for Psychotherapy and Psychological Training. In: Kozarić-Kovačić, D. (ed.) 3rd Croatian Symposium on Stress-Related Disorders, p. 40. University Hospital Dubrava, Zagreb (2008) (in Croatian)
  11. Popović, S., Đukić, Z., Križek, V., Kukolja, D., Ćosić, K., Slamić, M.: Control of Emotional Characteristics of Stimuli in Virtual Reality Based Psychotherapy. In: Budin, L., Ribarić, S. (eds.) Proceedings of the 30th Jubilee International Convention MIPRO, CTS & CIS, vol. 3, pp. 91–96. MIPRO, Rijeka (2007)
  12. Bradley, M.M., Lang, P.J.: Emotion and Motivation. In: Cacioppo, J.T., Tassinary, L.G., Bernston, G.G. (eds.) Handbook of Psychophysiology, 3rd edn., pp. 581–607. Cambridge University Press, Cambridge (2007)
  13. Dawson, M.E., Schell, A.M., Filion, D.L.: The Electrodermal System. In: Cacioppo, J.T., Tassinary, L.G., Bernston, G.G. (eds.) Handbook of Psychophysiology, 3rd edn., pp. 159–181. Cambridge University Press, Cambridge (2007)
  14. Lim, C.L., Rennie, C., Barry, R.J., Bahramali, H., Lazzaro, I., Manor, B., Gordon, E.: Decomposing Skin Conductance into Tonic and Phasic Components. International Journal of Psychophysiology 25, 97–109 (1997)
  15. ECGSYN - A Realistic ECG Waveform Generator, http://www.physionet.org/physiotools/ecgsyn/
  16. McSharry, P.E., Clifford, G.D., Tarassenko, L., Smith, L.A.: A Dynamical Model for Generating Synthetic Electrocardiogram Signals. IEEE Trans. Biomed. Eng. 50(3), 289–294 (2003)
  17. Lisetti, C., Nasoz, F.: Using Noninvasive Wearable Computers to Recognize Human Emotions from Physiological Signals. EURASIP Journal on Applied Signal Processing 11, 1672–1687 (2004)
  18. Wagner, J., Kim, J., André, E.: From Physiological Signals to Emotions: Implementing and Comparing Selected Methods for Feature Extraction and Classification. In: IEEE International Conference on Multimedia and Expo, pp. 940–943 (2005)
  19. Rani, P., Liu, C., Sarkar, N., Vanman, E.: An Empirical Study of Machine Learning Techniques for Affect Recognition in Human-Robot Interaction. Pattern Analysis & Applications 9, 58–69 (2006)
  20. Healey, J.: Wearable and Automotive Systems for Affect Recognition from Physiology. Technical Report, Department of Computer Science, MIT, Cambridge, MA (2000)
  21. Kim, K.H., Bang, S.W., Kim, S.R.: Emotion Recognition System using Short-Term Monitoring of Physiological Signals. Medical & Biological Engineering & Computing 42(3), 419–427 (2004)
  22. Haag, A., Goronzy, S., Schaich, P., Williams, J.: Emotion Recognition using Bio-Sensors: First Step Towards an Automatic System. In: Affective Dialogue Systems, Tutorial and Research Workshop, Kloster Irsee, Germany, June 14–16 (2004)
  23. Leon, E., Clarke, G., Callaghan, V., Sepulveda, F.: A User-Independent Real-Time Emotion Recognition System for Software Agents in Domestic Environments. Engineering Applications of Artificial Intelligence 20(3), 337–345 (2007)
  24. Kaya, N., Epps, H.H.: Relationship between Color and Emotion: A Study of College Students. College Student Journal 38(3), 396–405 (2004)

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Davor Kukolja¹
  • Siniša Popović¹
  • Branimir Dropuljić¹
  • Marko Horvat¹
  • Krešimir Ćosić¹

  1. Faculty of Electrical Engineering and Computing, University of Zagreb, Zagreb, Croatia
