Enhancing Emotion Recognition in VIPs with Haptic Feedback
The rise of smart technologies has created new opportunities to support blind and visually impaired persons (VIPs). In our previous research on the difficulties VIPs encounter during activities of daily life, one of the biggest problems identified was recognizing persons and their facial expressions. In this study we developed a system that detects faces, recognizes their emotions, and provides vibrotactile feedback about the emotions expressed. The prototype was tested to determine whether vibrotactile feedback through a haptic belt can enhance social interactions for VIPs.
The system consisted of commercially available components: a Logitech C920 webcam mounted on a cap, a Microsoft Surface Pro 4 carried in a mesh backpack, an Elitac tactile belt worn around the waist, and the VicarVision FaceReader software, which recognizes facial expressions.
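The pipeline implied by this setup (camera frame → emotion classification → belt vibration) can be sketched as follows. This is a minimal illustration only: the mapping table, function names, and the confidence threshold are assumptions for the sketch, not the authors' actual implementation or the FaceReader/Elitac APIs.

```python
# Hypothetical sketch of the classify -> vibrate step.
# EMOTION_TO_MOTOR and haptic_command are illustrative assumptions,
# not part of the FaceReader or Elitac software.

# The six basic emotions from the stimulus sets, each mapped to a
# distinct motor position on the waist-worn tactile belt.
EMOTION_TO_MOTOR = {
    "joy": 0,
    "surprise": 1,
    "anger": 2,
    "sadness": 3,
    "fear": 4,
    "disgust": 5,
}

def haptic_command(emotion, confidence, threshold=0.6):
    """Translate a classified emotion into a (motor, intensity) command.

    Returns None when the classifier's confidence is below the
    threshold, so the belt stays silent rather than giving
    potentially misleading feedback.
    """
    if emotion not in EMOTION_TO_MOTOR or confidence < threshold:
        return None
    # Scale vibration intensity (0-255) with classifier confidence.
    intensity = int(255 * confidence)
    return (EMOTION_TO_MOTOR[emotion], intensity)
```

Mapping each emotion to its own motor position lets the wearer distinguish emotions by location on the waist, while intensity can encode how certain the classification is.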
In preliminary tests with the system, both visually impaired and sighted persons were presented with sets of stimuli in which actors displayed six emotions (joy, surprise, anger, sadness, fear, and disgust), derived from the validated Amsterdam Dynamic Facial Expression Set and the Warsaw Set of Emotional Facial Expression Pictures, with matching audio in the form of nonlinguistic affect bursts. Subjects had to determine the emotions expressed in the videos without haptic feedback and, after a training period, with it.
An exit survey was conducted to gain insight into users' opinions on the perceived usefulness and benefits of the emotional feedback, and into their willingness to use the prototype as assistive technology in daily life.
Haptic feedback about facial expressions may improve the ability of VIPs to determine the emotions expressed by others and, as a result, increase their confidence during social interactions. More studies are needed to determine whether this is a viable method to convey information and enhance social interactions in the daily life of VIPs.
Keywords: Sensory substitution · Wearables · User-centered design