
Accessing movies based on emotional impact

Original Article, published in Multimedia Systems.

Abstract

Emotions play a central role in our daily lives, influencing how we think and act, our health, and our sense of well-being, and film is the art form par excellence that engages our affective, perceptual, and intellectual activity, holding the potential for significant impact. Video is becoming a dominant and pervasive medium, and online video a growing entertainment activity on the web and on iTV, largely driven by technological developments and the trend toward media convergence. Moreover, improved techniques for gathering emotional information about videos, whether through content analysis or through users' implicit feedback via physiological signals, complemented by manual labeling, are revealing new ways to explore the emotional dimension of videos, films, and TV series, and open new perspectives for enriching and personalizing video access. In this work, we reflect on the power that emotions have in our lives and on the emotional impact of movies, and we address this emotional dimension in the way we classify and access movies by exploring and evaluating the design of iFelt in its different ways of classifying, accessing, browsing, and visualizing movies based on their emotional impact.
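The abstract refers to classifying movies by their emotional impact, drawing on information gathered through content analysis or viewers' physiological signals. As a minimal, hypothetical sketch of the general idea (the function names, labels, and thresholds below are illustrative assumptions, not part of the authors' iFelt system), per-scene valence and arousal scores can be mapped onto the four quadrants of Russell's circumplex model of affect, a dimensional emotion model commonly used in affective video analysis:

```python
# Illustrative sketch, NOT the authors' iFelt implementation: map a movie's
# valence/arousal scores (as might come from content analysis or viewers'
# physiological signals) onto the four quadrants of Russell's circumplex
# model of affect.
from collections import Counter
from typing import Iterable, Tuple

def classify_emotion(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) pair in [-1, 1] x [-1, 1] to a quadrant label."""
    if valence >= 0 and arousal >= 0:
        return "happy/excited"    # pleasant, energetic
    if valence < 0 and arousal >= 0:
        return "angry/afraid"     # unpleasant, energetic
    if valence < 0:
        return "sad/bored"        # unpleasant, calm
    return "calm/relaxed"         # pleasant, calm

def dominant_emotion(scene_scores: Iterable[Tuple[float, float]]) -> str:
    """Label a whole movie by its most frequent per-scene quadrant."""
    counts = Counter(classify_emotion(v, a) for v, a in scene_scores)
    return counts.most_common(1)[0][0]

scenes = [(0.7, 0.6), (0.4, 0.2), (-0.3, 0.8), (0.6, 0.5)]
print(dominant_emotion(scenes))  # -> happy/excited
```

Such a dominant-emotion label is one plausible way to index a movie for emotion-based browsing; richer schemes could weight scenes by duration or intensity rather than counting them equally.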


[Figures 1–11 are not included in this preview.]



Acknowledgments

This work is partially supported by FCT through LASIGE Multiannual Funding and VIRUS research project (PTDC/EIA–EIA/101012/2008).

Author information

Corresponding author

Correspondence to Teresa Chambel.


About this article

Cite this article

Oliveira, E., Martins, P. & Chambel, T. Accessing movies based on emotional impact. Multimedia Systems 19, 559–576 (2013). https://doi.org/10.1007/s00530-013-0303-7

