Emotion Sharing with the Emotional Digital Picture Frame

  • Kyoung Shin Park
  • Yongjoo Cho
  • Minyoung Kim
  • Ki-Young Seo
  • Dongkeun Kim
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8008)

Abstract

This paper presents the design and implementation of an emotional digital picture frame system that lets a group of users share their emotions through photographs annotated with their own emotional expressions. The system detects user emotions from physiological sensor signals in real time and dynamically changes the audio-visual elements of photographs in response to the user's emotional state. It also allows these emotions to be shared with users in remote locations. In addition, an emotional rule authoring tool enables users to define how the audio-visual elements should express each emotion. In particular, the rendering elements of a photograph can appear differently when another user's emotion is received.
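
To make the described flow concrete, below is a minimal sketch, assuming a rule is simply a mapping from a detected emotion label to audio-visual rendering parameters. All names (EmotionRule, classify_emotion, render_photo), label sets, and threshold values are hypothetical illustrations, not the authors' implementation.

# Minimal sketch of the emotion-to-rendering flow described above (hypothetical names).
from dataclasses import dataclass

@dataclass
class EmotionRule:
    """A user-authored mapping from an emotion label to audio-visual parameters."""
    emotion: str        # assumed label set, e.g. "joy" or "sadness"
    color_tint: str     # tint applied to the photograph
    brightness: float   # 0.0 (dark) to 1.0 (bright)
    sound_clip: str     # short audio cue played with the photo

# Example rules a user might create with the rule authoring tool.
RULES = {
    "joy":     EmotionRule("joy", "#FFD54F", 0.9, "chime.wav"),
    "sadness": EmotionRule("sadness", "#90A4AE", 0.4, "rain.wav"),
}

def classify_emotion(heart_rate: float, skin_conductance: float) -> str:
    """Toy stand-in for the real-time physiological classifier (placeholder thresholds)."""
    return "joy" if heart_rate > 80 and skin_conductance > 5.0 else "sadness"

def render_photo(photo_path: str, rule: EmotionRule) -> None:
    """Placeholder for updating the frame's audio-visual elements from a rule."""
    print(f"Showing {photo_path} tinted {rule.color_tint} at "
          f"brightness {rule.brightness}, playing {rule.sound_clip}")

# A remote user's emotion label could be received over the network and rendered
# with the local user's own rules, so the same emotion can look different on each frame.
emotion = classify_emotion(heart_rate=88.0, skin_conductance=6.2)
render_photo("family.jpg", RULES[emotion])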

Keywords

Emotional Digital Picture Frame · Emotional Intelligent Contents · Emotional Rule Authoring Tool

Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Kyoung Shin Park (1)
  • Yongjoo Cho (2)
  • Minyoung Kim (3)
  • Ki-Young Seo (4)
  • Dongkeun Kim (2)
  1. Department of Multimedia Engineering, Dankook University, Korea
  2. Division of Digital Media, Sangmyung University, Korea
  3. Department of Computer Science, Sangmyung University, Korea
  4. Department of Computer Science, Dankook University, Korea
