Exploring the Usage of EEG and Pupil Diameter to Detect Elicited Valence

  • Conference paper
  • In: Intelligent Human Systems Integration (IHSI 2018)

Abstract

Brain signals are a reliable information source because human beings have limited voluntary control over them. We examine EEG readings as a tool for reporting human emotional state, and we examine whether readings from an eye tracker can enhance the results. We conducted an experiment with 25 users, measuring their EEG signals in response to emotional stimuli. All sensors were off-the-shelf devices, so that the method could be tested with inexpensive hardware. We used pleasant and unpleasant video content to elicit emotional responses. Alpha asymmetry index readings and pupil diameter were recorded, along with Self-Assessment Manikin (SAM) ratings. Our results show a significant difference between the responses to video clips eliciting different emotions. This implies that EEG can be a valid way to detect emotional state, especially when combined with an eye tracker. We conclude from our findings that EEG can be used as a platform upon which reliable affect-aware systems and applications can be built.
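
For context on the alpha asymmetry index mentioned in the abstract, the sketch below shows one common way such a measure is computed: the natural log of alpha-band (roughly 8-13 Hz) power over a right frontal electrode minus that over the corresponding left frontal electrode. The channel names (F3/F4), the 128 Hz sampling rate, and the Welch-based band-power estimate are illustrative assumptions, not the authors' actual processing pipeline.

# Minimal sketch of a frontal alpha asymmetry index (assumptions noted above).
import numpy as np
from scipy.signal import welch

ALPHA_BAND = (8.0, 13.0)  # Hz, a typical alpha range

def alpha_band_power(signal, fs, band=ALPHA_BAND):
    """Estimate alpha-band power by integrating a Welch PSD over the band."""
    freqs, psd = welch(signal, fs=fs, nperseg=int(2 * fs))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].sum() * (freqs[1] - freqs[0])

def alpha_asymmetry(left_channel, right_channel, fs):
    """ln(right alpha power) - ln(left alpha power); higher values are
    commonly interpreted as relatively greater left-frontal activation."""
    return np.log(alpha_band_power(right_channel, fs)) - np.log(alpha_band_power(left_channel, fs))

# Usage with synthetic signals standing in for hypothetical F3/F4 recordings.
fs = 128  # Hz, assumed; in the ballpark of consumer EEG headsets
t = np.arange(0, 10, 1.0 / fs)
f3 = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
f4 = 0.8 * np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
print(alpha_asymmetry(f3, f4, fs))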


Notes

  1. https://goo.gl/8J78Gq.
  2. https://www.emotiv.com/product/emotiv-epoc-14-channel-mobile-eeg/.
  3. https://pupil-labs.com/pupil/.
  4. https://www.mathworks.com/products/new_products/release2015a.html.


Acknowledgments

This work was supported by the German Federal Ministry of Education and Research (FeuerWeRR, Grant No. 13N13481), the Amplify project (grant agreement no. 683008), and the German Research Foundation within the SimTech Cluster of Excellence (EXC 310/2).

Author information

Corresponding author

Correspondence to Yasmeen Abdrabou.



Copyright information

© 2018 Springer International Publishing AG

About this paper

Cite this paper

Abdrabou, Y. et al. (2018). Exploring the Usage of EEG and Pupil Diameter to Detect Elicited Valence. In: Karwowski, W., Ahram, T. (eds) Intelligent Human Systems Integration. IHSI 2018. Advances in Intelligent Systems and Computing, vol 722. Springer, Cham. https://doi.org/10.1007/978-3-319-73888-8_45

  • DOI: https://doi.org/10.1007/978-3-319-73888-8_45

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-73887-1

  • Online ISBN: 978-3-319-73888-8

  • eBook Packages: Engineering, Engineering (R0)
