
Enhancing Emotion Recognition in VIPs with Haptic Feedback

  • Hendrik P. Buimer
  • Marian Bittner
  • Tjerk Kostelijk
  • Thea M. van der Geest
  • Richard J. A. van Wezel
  • Yan Zhao
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 618)

Abstract

The rise of smart technologies has created new opportunities to support blind and visually impaired persons (VIPs). One of the biggest problems identified in our previous research on the difficulties VIPs face during activities of daily life was recognizing persons and their facial expressions. In this study, we developed a system that detects faces, recognizes the emotions they express, and conveys those emotions through vibrotactile feedback. The prototype was tested to determine whether vibrotactile feedback through a haptic belt can enhance social interactions for VIPs.

The system consisted of commercially available technologies: a Logitech C920 webcam mounted on a cap, a Microsoft Surface Pro 4 carried in a mesh backpack, an Elitac tactile belt worn around the waist, and the VicarVision FaceReader software, which recognizes facial expressions.
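
At a high level, the processing chain is a simple loop: grab a frame from the head-mounted camera, classify the facial expression, and translate the dominant emotion into a belt vibration. The Python sketch below illustrates that loop; classify_emotion and HapticBelt are hypothetical placeholders, since neither FaceReader nor the Elitac belt exposes a public Python API.

    # Minimal sketch of the capture -> classify -> vibrate loop.
    # classify_emotion() and HapticBelt are hypothetical placeholders;
    # the actual prototype used VicarVision FaceReader and an Elitac belt.
    import time

    import cv2  # OpenCV, for webcam capture


    def classify_emotion(frame):
        """Hypothetical stand-in for the FaceReader classifier.

        Returns one of the six basic emotions, or None when no face
        is detected in the frame.
        """
        return None  # a real classifier would inspect `frame`


    class HapticBelt:
        """Hypothetical driver for a waist-worn tactile belt."""

        def vibrate(self, emotion):
            print(f"belt: signalling '{emotion}'")


    def run(camera_index=0, interval=0.5):
        cap = cv2.VideoCapture(camera_index)  # e.g. the cap-mounted C920
        belt = HapticBelt()
        try:
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                emotion = classify_emotion(frame)
                if emotion is not None:
                    belt.vibrate(emotion)
                time.sleep(interval)  # throttle the feedback rate
        finally:
            cap.release()


    if __name__ == "__main__":
        run()

Throttling the loop avoids continuous buzzing; the prototype's actual update rate is not reported in the abstract.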

In preliminary tests with the system, both visually impaired and sighted persons were presented with sets of stimuli consisting of actors displaying six emotions (joy, surprise, anger, sadness, fear, and disgust), derived from the validated Amsterdam Dynamic Facial Expression Set and the Warsaw Set of Emotional Facial Expression Pictures, with matching audio in the form of nonlinguistic affect bursts. Subjects had to determine the emotions expressed in the videos, first without haptic feedback and then, after a training period, with it.
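
For the feedback to be learnable within a short training period, each of the six emotions needs a clearly distinguishable belt signal, for instance its own tactor position and pulse rhythm. The mapping below is purely illustrative: the abstract does not specify the encoding the prototype used.

    # Illustrative encoding of the six basic emotions as belt signals.
    # Tactor indices run around the waist; pulses are (on, off) durations
    # in milliseconds. This mapping is an assumption, not the encoding
    # reported by the authors.
    EMOTION_CODES = {
        "joy":      {"tactor": 0, "pulse_ms": (200, 100), "repeats": 2},
        "surprise": {"tactor": 1, "pulse_ms": (100, 100), "repeats": 3},
        "anger":    {"tactor": 2, "pulse_ms": (400, 200), "repeats": 2},
        "sadness":  {"tactor": 3, "pulse_ms": (600, 300), "repeats": 1},
        "fear":     {"tactor": 4, "pulse_ms": (100, 50),  "repeats": 4},
        "disgust":  {"tactor": 5, "pulse_ms": (300, 300), "repeats": 2},
    }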

An exit survey was conducted to gain insight into users' opinions on the perceived usefulness and benefits of the emotional feedback, and their willingness to use the prototype as assistive technology in daily life.

Haptic feedback about facial expressions may improve the ability of VIPs to determine emotions expressed by others and, as a result, increase the confidence of VIPs during social interactions. More studies are needed to determine whether this is a viable method to convey information and enhance social interactions in the daily life of VIPs.

Keywords

Sensory substitution · Wearables · User-centered design

References

  1. Van der Geest, T.M., Buimer, H.P.: User-centered priority setting for accessible devices and applications. In: Mensch und Computer 2015. De Gruyter Oldenbourg, Stuttgart (2015)
  2. Krishna, S., et al.: A systematic requirements analysis and development of an assistive device to enhance the social interaction of people who are blind or visually impaired. In: Computer Vision Applications for the Visually Impaired, Marseille, France (2008)
  3. Bach-y-Rita, P., Kercel, S.W.: Sensory substitution and the human-machine interface. Trends Cogn. Sci. 7(12), 541–546 (2003)
  4. Bach-y-Rita, P.: Sensory plasticity: applications to a vision substitution system. Acta Neurol. Scand. 43, 417–426 (1967)
  5. McDaniel, T., et al.: Using a haptic belt to convey non-verbal communication cues during social interactions to individuals who are blind. In: HAVE 2008, Ottawa (2008)
  6. McDaniel, T., et al.: Heartbeats: a methodology to convey interpersonal distance through touch. In: CHI 2010, Atlanta, Georgia, USA (2010)
  7. Ekman, P.: An argument for basic emotions. Cogn. Emot. 6(3/4), 169–200 (1992)
  8. Bishop, C.M.: Neural Networks for Pattern Recognition. Clarendon Press, Oxford (1995)
  9. Cootes, T., Taylor, C.: Statistical Models of Appearance for Computer Vision (2000)
  10. Viola, P., Jones, M.: Robust real-time face detection. Int. J. Comput. Vis. 57(2), 137–154 (2004)
  11. van Kuilenburg, H., Wiering, M.A., den Uyl, M.: A model based method for automatic facial expression recognition. In: Gama, J., Camacho, R., Brazdil, P.B., Jorge, A.M., Torgo, L. (eds.) ECML 2005. LNCS (LNAI), vol. 3720, pp. 194–205. Springer, Heidelberg (2005)
  12. Den Uyl, M.J., van Kuilenburg, H.: The FaceReader: online facial expression recognition. In: Measuring Behavior 2005, Wageningen, The Netherlands (2005)
  13. Van Kuilenburg, H., et al.: Advances in face and gesture analysis. In: Measuring Behavior 2008, Maastricht, The Netherlands (2008)
  14. Van der Schalk, J., et al.: Moving faces, looking places: validation of the Amsterdam Dynamic Facial Expression Set (ADFES). Emotion 11(4), 907–920 (2011)
  15. Olszanowski, M., et al.: Warsaw set of emotional facial expression pictures: a validation study of facial display photographs. Front. Psychol. 5, 1516 (2015)
  16. Hawk, S.T., et al.: “Worth a thousand words”: absolute and relative decoding of nonlinguistic affect vocalizations. Emotion 9(3), 293–305 (2009)
  17. Lima, C.F., Castro, S.L., Scott, S.K.: When voices get emotional: a corpus of nonverbal vocalizations for research on emotion processing. Behav. Res. Methods 45, 1234–1245 (2013)

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Hendrik P. Buimer (1) (corresponding author)
  • Marian Bittner (1)
  • Tjerk Kostelijk (2)
  • Thea M. van der Geest (3)
  • Richard J. A. van Wezel (1, 4)
  • Yan Zhao (1)

  1. Department of Biomedical Signals and Systems, MIRA Institute, University of Twente, Enschede, The Netherlands
  2. VicarVision, Amsterdam, The Netherlands
  3. Department of Media, Communication and Organization, University of Twente, Enschede, The Netherlands
  4. Biophysics, Donders Institute, Radboud University Nijmegen, Nijmegen, The Netherlands
