Analyzing Video Information by Monitoring Bioelectric Signals

  • Natalya Filatova
  • Konstantin Sidorov
  • Pavel Shemaev
  • Igor Rebrun
  • Natalya Bodrina
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 875)


The paper considers the problem of creating tools for the automatic analysis of a video information flow by monitoring certain characteristics of the bioelectric signals of an agent (an expert or operator).

The authors use spectral analysis methods to obtain time series that illustrate changes in the power of experimental bioelectric signals (\( Sa \)) of an agent while the agent studies a continuous video stream. Fuzzification of the spectral features makes it possible to move to fuzzy time series that illustrate the agent's evaluation of the information.

The spectra are calculated with a sliding working window over fragments of two types of bioelectric signals, which are recorded synchronously in two autonomous channels.
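The sliding-window step described above can be sketched as follows. This is a minimal illustration in Python/NumPy, not the authors' implementation: the window length, step, triangular membership shapes, and the synthetic two-channel signals are all assumptions made for the example.

```python
import numpy as np

def sliding_power(signal, win_len, step):
    """Total spectral power of a signal in a sliding working window (periodogram)."""
    powers = []
    for start in range(0, len(signal) - win_len + 1, step):
        frame = signal[start:start + win_len]
        spectrum = np.abs(np.fft.rfft(frame)) ** 2 / win_len
        powers.append(spectrum.sum())            # crisp power value for this window
    return np.asarray(powers)

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b (b > a, c > b)."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

def fuzzify(powers):
    """Map the crisp power series to memberships in {low, medium, high}."""
    lo, hi = powers.min(), powers.max()
    mid = (lo + hi) / 2
    return {"low":    tri(powers, lo - 1e-9, lo, mid),
            "medium": tri(powers, lo, mid, hi),
            "high":   tri(powers, mid, hi, hi + 1e-9)}

# Two synchronously recorded channels (synthetic stand-ins for the real signals):
fs = 250                                        # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
ch1 = np.sin(2 * np.pi * 10 * t) * (1 + 0.5 * np.sin(2 * np.pi * 0.2 * t))
ch2 = np.random.default_rng(0).normal(size=t.size)

p1 = sliding_power(ch1, win_len=fs, step=fs // 2)   # 1 s window, 0.5 s step
f1 = fuzzify(p1)                                    # fuzzy time series per power level
```

Each channel yields one crisp power series, and fuzzification turns it into the fuzzy time series the abstract refers to; the same pipeline would be applied to the second channel.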

Based on fuzzy evaluations of the spectral features and the Mamdani fuzzy inference algorithm, the video analysis algorithm classifies video fragments according to the sign of the agent's emotional reaction. The Tsukamoto algorithm localizes the time markers that determine the beginning and the end of each fragment.
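The two inference steps can be illustrated with toy implementations. The rule base below is invented for the example (the paper's actual rules are not given in the abstract): a Mamdani step classifies the sign of the reaction on [-1, 1], and a Tsukamoto-style weighted average combines crisp rule outputs, here used for a time marker.

```python
import numpy as np

def tri(x, a, b, c):
    """Scalar triangular membership; degenerate edges (a == b or b == c) allowed."""
    left = (x - a) / (b - a) if b > a else 1.0
    right = (c - x) / (c - b) if c > b else 1.0
    return max(0.0, min(left, right))

def mamdani_sign(mu_eeg_high, mu_emg_high):
    """Mamdani inference over two illustrative rules:
       R1: IF EEG power is high AND EMG power is high THEN reaction is positive
       R2: IF EEG power is NOT high                  THEN reaction is negative
    """
    z = np.linspace(-1.0, 1.0, 201)
    pos = np.array([tri(v, 0.0, 1.0, 1.0) for v in z])    # consequent "positive"
    neg = np.array([tri(v, -1.0, -1.0, 0.0) for v in z])  # consequent "negative"
    w1 = min(mu_eeg_high, mu_emg_high)                    # min as fuzzy AND
    w2 = 1.0 - mu_eeg_high
    agg = np.maximum(np.minimum(w1, pos), np.minimum(w2, neg))  # clip + max-aggregate
    if agg.sum() == 0:
        return 0.0
    return float((z * agg).sum() / agg.sum())             # centroid defuzzification

def tsukamoto_marker(weights, times):
    """Tsukamoto-style output: weighted average of crisp per-rule outputs."""
    w, t = np.asarray(weights), np.asarray(times)
    return float((w * t).sum() / w.sum())
```

A positive defuzzified value indicates a positive reaction; scanning the defuzzified series over the windows and collecting Tsukamoto markers where the sign changes would delimit fragments, under the stated assumptions about the rule base.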


Keywords: Signal analysis · Fuzzy set · Fuzzy signs · Human emotions · Spectral analysis



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Tver State Technical University, Tver, Russia
