Conveying Emotion with Moving Images: Relationship between Movement and Emotion

  • Conference paper
Affective Computing and Intelligent Interaction (ACII 2011)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 6974)

Abstract

We investigated the relationship between movement and the emotion conveyed by moving images. We developed a software system for generating moving images in which movement is specified with moving effects consisting of a few elements. We prepared eight movements from the Vertex Noise moving effect, which consists of the three elements of speed, density, and strength, by giving each element different values, and combined them with still images and sound data to generate moving images intended to convey emotions. Subjects looked at the moving images without sound and reported whether they felt certain emotions. The results showed that a higher density value affects the conveyance of any emotion with moving images, and that strength distinguishes the conveyance of anger from that of fear and sadness. Anger was the most recognizable emotion, while fear and sadness were difficult to distinguish from movement alone.
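The abstract specifies movement through a "Vertex Noise" effect parameterised by three elements: speed, density, and strength. The following is a minimal Python sketch of how such a three-element vertex displacement, and the eight movement conditions derived from it, might be expressed; the function name, the sinusoidal pseudo-noise, and the example parameter values are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a "Vertex Noise" moving effect: mesh vertices of a
# still image are displaced over time by noise whose behaviour is controlled
# by speed, density, and strength. Names and formulas are illustrative only.
import numpy as np

def vertex_noise(vertices, t, speed=1.0, density=4.0, strength=0.1):
    """Displace an (N, 2) array of vertex positions at time t.

    speed    -- how fast the noise pattern evolves over time
    density  -- spatial frequency of the noise (how fine-grained it is)
    strength -- amplitude of the displacement
    """
    phase = t * speed
    # Simple pseudo-noise from sinusoids; a real effect would more likely use
    # Perlin or simplex noise evaluated per vertex.
    dx = np.sin(density * vertices[:, 1] + phase) * strength
    dy = np.cos(density * vertices[:, 0] + phase * 1.3) * strength
    return vertices + np.stack([dx, dy], axis=1)

# Eight movement conditions obtained by giving each element two values,
# analogous to the eight movements described in the abstract.
conditions = [
    {"speed": s, "density": d, "strength": k}
    for s in (0.5, 2.0) for d in (2.0, 8.0) for k in (0.05, 0.2)
]

# Example: displace a 10x10 vertex grid under the first condition.
grid = np.stack(np.meshgrid(np.linspace(0, 1, 10),
                            np.linspace(0, 1, 10)), axis=-1).reshape(-1, 2)
displaced = vertex_noise(grid, t=0.5, **conditions[0])
```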


Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Hiraga, R., Takahashi, K. (2011). Conveying Emotion with Moving Images: Relationship between Movement and Emotion. In: D’Mello, S., Graesser, A., Schuller, B., Martin, JC. (eds) Affective Computing and Intelligent Interaction. ACII 2011. Lecture Notes in Computer Science, vol 6974. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-24600-5_59

  • DOI: https://doi.org/10.1007/978-3-642-24600-5_59

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-24599-2

  • Online ISBN: 978-3-642-24600-5

  • eBook Packages: Computer Science, Computer Science (R0)
