A More Complete Picture of Emotion Using Electrocardiogram and Electrodermal Activity to Complement Cognitive Data

  • Danushka Bandara (corresponding author)
  • Stephen Song
  • Leanne Hirshfield
  • Senem Velipasalar
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9743)


We describe a method for emotion classification using ECG and EDA data. Many studies have used heart rate and EDA data to quantify a user's arousal level [1, 2, 3], and researchers have identified a connection between a person's ECG data and the positivity or negativity (valence) of their emotional state [4]. The goal of this work is to extend this idea to the human-computer interaction domain. We explore whether the valence/arousal level of a subject's response to computer-based stimuli is predictable from ECG and EDA, and whether that information can complement recordings of participants' cognitive data to form a more accurate depiction of emotional state. The experiment presented three types of stimuli, both interactive and non-interactive, to 9 subjects while recording their physiological responses via ECG and EDA sensors as well as an fNIRS device. The stimuli were selected from validated methods of inducing emotion, including the DEAP dataset [5], the Multi-Attribute Task Battery [6], and the Tetris video game [7]. Participants' responses were captured using Self-Assessment Manikin [8] surveys, which served as the ground-truth labels. The resulting data were analyzed using machine learning. The results open new avenues of research in combining physiological data to classify emotion.
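The pipeline described above (per-trial ECG/EDA features, SAM ratings binarized into high/low labels, a classifier trained on the result) can be sketched as follows. The feature choices (mean heart rate, mean skin conductance), the midpoint split of the 9-point SAM scale, the nearest-centroid classifier, and the numeric values are all illustrative assumptions, not the paper's actual method or data.

```python
# Hypothetical sketch of the abstract's classification pipeline:
# physiological features per trial -> binarized SAM labels -> classifier.

def ecg_feature(rr_intervals_s):
    """Mean heart rate (bpm) from R-R intervals given in seconds."""
    mean_rr = sum(rr_intervals_s) / len(rr_intervals_s)
    return 60.0 / mean_rr

def eda_feature(eda_uS):
    """Mean skin conductance level (microsiemens) over the trial."""
    return sum(eda_uS) / len(eda_uS)

def binarize_sam(rating, midpoint=5):
    """SAM uses a 9-point scale; split at the midpoint into low (0) / high (1)."""
    return 1 if rating > midpoint else 0

class NearestCentroid:
    """Minimal classifier: pick the class whose feature centroid is closest."""
    def fit(self, X, y):
        self.centroids = {}
        for label in set(y):
            rows = [x for x, lbl in zip(X, y) if lbl == label]
            self.centroids[label] = [sum(col) / len(rows) for col in zip(*rows)]
        return self

    def predict(self, x):
        def sq_dist(c):
            return sum((a - b) ** 2 for a, b in zip(x, c))
        return min(self.centroids, key=lambda lbl: sq_dist(self.centroids[lbl]))

# Synthetic example trials (illustrative values only):
# (R-R intervals in s, EDA samples in uS, SAM arousal rating 1-9)
trials = [
    ([0.90, 0.92, 0.88], [2.1, 2.0, 2.2], 2),  # calm: slow HR, low EDA
    ([0.95, 0.90, 0.93], [1.9, 2.1, 2.0], 3),
    ([0.60, 0.62, 0.58], [6.5, 6.8, 6.4], 8),  # aroused: fast HR, high EDA
    ([0.65, 0.60, 0.63], [6.0, 6.2, 6.1], 7),
]
X = [[ecg_feature(rr), eda_feature(eda)] for rr, eda, _ in trials]
y = [binarize_sam(r) for _, _, r in trials]
clf = NearestCentroid().fit(X, y)
print(clf.predict([60.0 / 0.61, 6.3]))  # fast HR + high EDA -> 1 (high arousal)
```

In practice the feature vector would be richer (heart rate variability measures, EDA peak counts) and the classifier cross-validated, but the label construction from SAM self-reports is the key step the abstract describes.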


Keywords: Electrocardiography · Electrodermal activity · fNIRS · Valence · Arousal · Human-computer interaction


References

  1. Lang, A.: Involuntary attention and physiological arousal evoked by structural features and emotional content in TV commercials. Commun. Res. 17(3), 275–299 (1990). doi:10.1177/009365090017003001
  2. Zillmann, D.: Excitation transfer in communication-mediated aggressive behavior. J. Exp. Soc. Psychol. 7(4), 419–434 (1971)
  3. Bradley, M.M., Codispoti, M., Cuthbert, B.N., Lang, P.J.: Emotion and motivation I: defensive and appetitive reactions in picture processing. Emotion 1(3), 276–298 (2001). doi:10.1037/1528-3542.1.3.276
  4. Brosschot, J.F., Thayer, J.F.: Heart rate response is longer after negative emotions than after positive emotions. Int. J. Psychophysiol. 50(3), 181–187 (2003)
  5. Kim, K.H., Bang, S.W., Kim, S.R.: Emotion recognition system using short-term monitoring of physiological signals. Med. Biol. Eng. Comput. 42(3), 419–427 (2004)
  6. Koelstra, S., Mühl, C., Soleymani, M., Lee, J.-S., Yazdani, A., Ebrahimi, T., Pun, T., Nijholt, A., Patras, I.: DEAP: a database for emotion analysis; using physiological signals. IEEE Trans. Affect. Comput. 3(1), 18–31 (2012). doi:10.1109/T-AFFC.2011.15
  7. Comstock, J.R., Arnegard, R.J.: The Multi-Attribute Task Battery for Human Operator Workload and Strategic Behavior Research. National Aeronautics and Space Administration, Langley Research Center, Hampton (1992)
  8. Chanel, G., Rebetez, C., Bétrancourt, M., Pun, T.: Emotion assessment from physiological signals for adaptation of game difficulty. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 41(6), 1052–1063 (2011)
  9. Bradley, M.M., Lang, P.J.: Measuring emotion: the self-assessment manikin and the semantic differential. J. Behav. Ther. Exp. Psychiatry 25(1), 49–59 (1994)
  10. Plutchik, R.: The nature of emotions: human emotions have deep evolutionary roots, a fact that may explain their complexity and provide tools for clinical practice. Am. Sci. 89(4), 344–350 (2001)
  11. Scherer, K.R.: What are emotions? And how can they be measured? Soc. Sci. Inf. 44(4), 695–729 (2005)
  12. Russell, J.A.: A circumplex model of affect. J. Pers. Soc. Psychol. 39(6), 1161–1178 (1980)
  13. Barrett, L.F., Lindquist, K.A.: The embodiment of emotion. In: Semin, G.R., Smith, E.R. (eds.) Embodied Grounding: Social, Cognitive, Affective, and Neuroscientific Approaches, pp. 237–262. Cambridge University Press, New York (2008)
  14. Andreassi, J.L.: Psychophysiology: Human Behavior and Physiological Response. Psychology Press, New York (2013)
  15. Lane, R.D., McRae, K., Reiman, E.M., Chen, K., Ahern, G.L., Thayer, J.F.: Neural correlates of heart rate variability during emotion. NeuroImage 44(1), 213–222 (2009)
  16. Boucsein, W.: Electrodermal Activity. Springer Science & Business Media, New York (2012)
  17. Potter, R.F., Bolls, P.D.: Psychophysiological Measurement and Meaning: Cognitive and Emotional Processing of Media. Routledge, New York (2012)
  18. Shi, Y., Ruiz, N., Taib, R., Choi, E., Chen, F.: Galvanic skin response (GSR) as an index of cognitive load. In: CHI 2007 Extended Abstracts on Human Factors in Computing Systems, pp. 2651–2656. ACM (2007)
  19. Westerink, J.H., Van Den Broek, E.L., Schut, M.H., Van Herk, J., Tuinenbreijer, K.: Computing emotion awareness through galvanic skin response and facial electromyography. In: Westerink, J.H.D.M., Ouwerkerk, M., Overbeek, T.J.M., Pasveer, W.F., de Ruyter, B. (eds.) Probing Experience, pp. 149–162. Springer, Netherlands (2008)
  20. Zillmann, D.: Television Viewing and Arousal. Television and Social Behavior. U.S. Government Printing Office, Washington (1981)
  21. Kober, H., Barrett, L.F., Joseph, J., Bliss-Moreau, E., Lindquist, K., Wager, T.D.: Functional grouping and cortical–subcortical interactions in emotion: a meta-analysis of neuroimaging studies. NeuroImage 42(2), 998–1031 (2008). doi:10.1016/j.neuroimage.2008.03.059
  22. Dousty, M., Daneshvar, S., Haghjoo, M.: The effects of sedative music, arousal music, and silence on electrocardiography signals. J. Electrocardiol. 44(3), 396.e1–396.e6 (2011)
  23. AcqKnowledge [Computer Software] (2014)
  24. Farrell, R.M., Syed, A., Syed, A., Gutterman, D.D.: Effects of limb electrode placement on the 12- and 16-lead electrocardiogram. J. Electrocardiol. 41(6), 536–545 (2008)
  25. Electrode placement [Online image] (2014). Accessed 10 Feb 2016
  26. Berndt, D.J., Clifford, J.: Using dynamic time warping to find patterns in time series. In: KDD Workshop, vol. 10, no. 16, pp. 359–370 (1994)
  27. Lin, J., Keogh, E., Lonardi, S., Chiu, B.: A symbolic representation of time series, with implications for streaming algorithms. In: Proceedings of the 8th ACM SIGMOD Workshop on Research Issues in Data Mining and Knowledge Discovery, pp. 2–11. ACM (2003)
  28. Comstock Jr., J.R., Arnegard, R.J.: The multi-attribute task battery for human operator workload and strategic behavior research (1992)
  29. McIntyre, R.C.: TetrisCSharp. Computer software (2011)
  30. Selvaraj, J., Murugappan, M., Wan, K., Yaacob, S.: Classification of emotional states from electrocardiogram signals: a non-linear approach based on Hurst. Biomed. Eng. Online 12, 44 (2013)
  31. Schmidt, L.A., Trainor, L.J., Santesso, D.L.: Development of frontal electroencephalogram (EEG) and heart rate (ECG) responses to affective musical stimuli during the first 12 months of post-natal life. Brain Cogn. 52(1), 27–32 (2003)
  32. Zheng, W.L., Dong, B.N., Lu, B.L.: Multimodal emotion recognition using EEG and eye tracking data. In: 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), pp. 5040–5043. IEEE (2014)
  33. Wexler, B.E., Warrenburg, S., Schwartz, G.E., Janer, L.D.: EEG and EMG responses to emotion-evoking stimuli processed without conscious awareness. Neuropsychologia 30(12), 1065–1079 (1992)
  34. Hirshfield, L.M., Bobko, P., Barelka, A., Hirshfield, S.H., Farrington, M.T., Gulbronson, S., Paverman, D.: Using noninvasive brain measurement to explore the psychological effects of computer malfunctions on users during human-computer interactions. Adv. Hum. Comput. Interact. 2014, 2 (2014)
  35. Hirshfield, L., Costa, M., Bandara, D., Bratt, S.: Measuring situational awareness aptitude using functional near-infrared spectroscopy. In: Schmorrow, D.D., Fidopiastis, C.M. (eds.) AC 2015. LNCS, vol. 9183, pp. 244–255. Springer, Heidelberg (2015)
  36. Bandara, D.S.: Emotion-Reading: Affective Analysis Using fNIRS. Accessed 24 Feb 2016
  37. Myrtek, M., Deutschmann-Janicke, E., Strohmaier, H., Zimmermann, W., Lawerenz, S., Brügner, G., Müller, W.: Physical, mental, emotional, and subjective workload components in train drivers. Ergonomics 37(7), 1195–1203 (1994)
  38. Marg, E.: Descartes' Error: emotion, reason, and the human brain. Optom. Vis. Sci. 72(11), 847–848 (1995)
  39. Russell, J.A.: A circumplex model of affect. J. Pers. Soc. Psychol. 39, 1161–1178 (1980)
  40. Chance, B., Zhuang, Z., UnAh, C., Alter, C., Lipton, L.: Cognition-activated low-frequency modulation of light absorption in human brain. Proc. Natl. Acad. Sci. 90(8), 3770–3774 (1993)
  41. Izzetoglu, K., Bunce, S., Onaral, B., Pourrezaei, K., Chance, B.: Functional optical brain imaging using near-infrared during cognitive tasks. Int. J. Hum. Comput. Interact. 17(2), 211–227 (2004)
  42. Atkielski, A.: Schematic diagram of normal sinus rhythm for a human heart as seen on ECG. Digital image. Sinus rhythm labels. Wikimedia Commons (2007)

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Danushka Bandara (1, 2), corresponding author
  • Stephen Song (2)
  • Leanne Hirshfield (2)
  • Senem Velipasalar (1)

  1. Department of Electrical Engineering and Computer Science, Syracuse University, Syracuse, USA
  2. M.I.N.D. Lab, S.I. Newhouse School of Public Communications, Syracuse University, Syracuse, USA
