
Deep Learning for Real Time Facial Expression Recognition in Social Robots

  • Ariel Ruiz-Garcia
  • Nicola Webb
  • Vasile Palade
  • Mark Eastwood
  • Mark Elshaw
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11305)

Abstract

Human-robot interaction is a rapidly growing topic of interest in today's society. The development of real-time emotion recognition will further improve the relationship between humans and social robots. However, contemporary real-time emotion recognition in unconstrained environments has yet to reach the accuracy levels achieved on controlled static datasets. In this work, we propose a Deep Convolutional Neural Network (CNN), pre-trained as a Stacked Convolutional Autoencoder (SCAE) in a greedy layer-wise unsupervised manner, for emotion recognition from facial expression images taken by a NAO robot. The SCAE model is trained to learn an illumination-invariant, down-sampled feature vector. The weights of the encoder element are then used to initialize the CNN model, which is fine-tuned for classification. We train the model on a corpus composed of gamma-corrected versions of the CK+, JAFFE, FEEDTUM and KDEF datasets. The emotion recognition model produces a state-of-the-art accuracy rate of 99.14% on this corpus. We also show that the proposed training approach significantly improves the CNN's generalisation ability, by over 30%, on non-uniform data collected with the NAO robot in unconstrained environments.
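
The training pipeline summarised above (greedy layer-wise pre-training of a stacked convolutional autoencoder, then transferring the encoder weights into a CNN classifier for supervised fine-tuning) can be illustrated with a short sketch. The PyTorch snippet below is a minimal illustration under assumed layer sizes, image resolution and hyper-parameters; it is not the authors' exact architecture, and the gamma-corrected face corpus is stood in by random tensors.

    # Minimal PyTorch sketch: greedy layer-wise SCAE pre-training, then
    # encoder-weight transfer into a CNN classifier (assumed sizes, not the
    # paper's exact architecture).
    import torch
    import torch.nn as nn

    class ConvAutoencoderLayer(nn.Module):
        """One encoder/decoder pair, trained in isolation (greedy layer-wise)."""
        def __init__(self, in_ch, out_ch):
            super().__init__()
            self.encoder = nn.Sequential(nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU())
            self.decoder = nn.ConvTranspose2d(out_ch, in_ch, 3, padding=1)

        def forward(self, x):
            return self.decoder(self.encoder(x))

    def pretrain_layer(layer, data, epochs=5, lr=1e-3):
        """Train a single autoencoder layer to reconstruct its own input."""
        opt = torch.optim.Adam(layer.parameters(), lr=lr)
        loss_fn = nn.MSELoss()
        for _ in range(epochs):
            for x in data:
                loss = loss_fn(layer(x), x)
                opt.zero_grad()
                loss.backward()
                opt.step()
        return layer

    # Stand-in for the gamma-corrected face images; a real corpus could apply
    # gamma correction as img.pow(1.0 / gamma) for several gamma values.
    batches = [torch.rand(8, 1, 96, 96) for _ in range(4)]

    # Greedy layer-wise pre-training: each new layer reconstructs the output
    # of the previously trained (frozen) encoder stack.
    channels = [1, 32, 64]  # grayscale input -> two conv layers (assumed)
    encoders = []
    for in_ch, out_ch in zip(channels[:-1], channels[1:]):
        layer = pretrain_layer(ConvAutoencoderLayer(in_ch, out_ch), batches)
        with torch.no_grad():
            batches = [layer.encoder(x) for x in batches]
        encoders.append(layer.encoder)

    # Fine-tuning stage: initialise a classifier CNN with the pre-trained
    # encoders and append a classification head for the emotion classes.
    classifier = nn.Sequential(
        *encoders,
        nn.AdaptiveAvgPool2d(1),
        nn.Flatten(),
        nn.Linear(channels[-1], 7),  # 7 basic emotion classes (assumed)
    )
    # classifier is then trained end-to-end with cross-entropy on the labelled corpus.
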

Keywords

Deep convolutional neural networks · Emotion recognition · Greedy layer-wise training · Social robots · Stacked convolutional autoencoders


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Ariel Ruiz-Garcia
  • Nicola Webb
  • Vasile Palade
  • Mark Eastwood
  • Mark Elshaw

  1. School of Computing, Electronics and Mathematics, Faculty of Engineering, Environment and Computing, Coventry University, Coventry, UK
