MuDERI: Multimodal Database for Emotion Recognition Among Intellectually Disabled Individuals

  • Jainendra Shukla
  • Miguel Barreda-Ángeles
  • Joan Oliver
  • Domènec Puig
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9979)

Abstract

Empathic interaction with social robots is a crucial requirement for delivering effective cognitive stimulation to individuals with Intellectual Disability (ID), yet progress has been hindered by the absence of any suitable database. Within the REHABIBOTICS project, we present the first multimodal database of individuals with ID recorded in nearly real-world settings for the analysis of human affective states. MuDERI is an annotated multimodal database of audiovisual recordings, RGB-D videos, and physiological signals from 12 participants, recorded as emotions were elicited using personalized real-world objects and/or activities. The database is publicly available.

Keywords

Socially assistive robotics · Intellectual disability · Robot assisted therapy · SAR · RAT · ID

Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  • Jainendra Shukla (1, 3)
  • Miguel Barreda-Ángeles (2)
  • Joan Oliver (1)
  • Domènec Puig (3)
  1. Instituto de Robótica para la Dependencia, Sitges, Spain
  2. Eurecat, Technology Centre of Catalonia, Barcelona, Spain
  3. Intelligent Robotics and Computer Vision Group, Universitat Rovira i Virgili, Tarragona, Spain
