Dialogue Design for a Robot-Based Face-Mirroring Game to Engage Autistic Children with Emotional Expressions

  • Pauline Chevalier
  • Jamy J. Li
  • Eloise Ainger
  • Alyssa M. Alcorn
  • Snezana Babovic
  • Vicky Charisi
  • Suncica Petrovic
  • Bob R. Schadenberg
  • Elizabeth Pellicano
  • Vanessa Evers
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10652)


We present design strategies for human–robot interaction with school-aged autistic children who have limited receptive language. Applying these strategies within the DE-ENIGMA project (a large EU project addressing emotion recognition in autistic children) supported the development of a new facial-expression imitation activity, in which the robot imitates the child's face to encourage the child to notice facial expressions during a play-based game. A usability case study with 15 typically-developing children aged 4–6 at an English-language school in the Netherlands was conducted to assess the feasibility of the setup and to inform design revisions before introducing the robot to autistic children.
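At a high level, the mirroring activity described above can be thought of as a per-frame loop: estimate the child's facial expression, then command the robot to reproduce it. The sketch below is a minimal, hypothetical illustration of that loop; the function names, action-unit keys, and thresholds are assumptions for exposition, not the DE-ENIGMA implementation.

```python
# Hypothetical sketch of a face-mirroring loop: classify the child's
# expression from coarse action-unit intensities, then have the robot
# imitate it. All names and thresholds here are illustrative.

def classify_expression(action_units):
    """Map facial action-unit intensities (0.0-1.0) to a coarse label.

    action_units: dict with illustrative keys such as 'smile',
    'brow_raise', and 'brow_lower'; missing keys default to 0.0.
    """
    if action_units.get("smile", 0.0) > 0.5:
        return "happy"
    if action_units.get("brow_lower", 0.0) > 0.5:
        return "angry"
    if action_units.get("brow_raise", 0.0) > 0.5:
        return "surprised"
    return "neutral"


def mirror_step(frame_action_units, send_to_robot):
    """One iteration of the mirroring game: classify, then imitate.

    send_to_robot is a callback that would trigger the robot's
    matching face animation for the given label.
    """
    label = classify_expression(frame_action_units)
    send_to_robot(label)
    return label
```

In a real system the action-unit estimates would come from an automated facial-analysis pipeline running on camera frames, and the callback would drive the robot's facial actuators or animation presets.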


Keywords: Robot · Child interaction · Prototype design · Mirroring · Imitation · Autism



This publication has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No. 688835 (DE-ENIGMA).



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Pauline Chevalier (1)
  • Jamy J. Li (1)
  • Eloise Ainger (2)
  • Alyssa M. Alcorn (2)
  • Snezana Babovic (3)
  • Vicky Charisi (1)
  • Suncica Petrovic (3)
  • Bob R. Schadenberg (1)
  • Elizabeth Pellicano (2)
  • Vanessa Evers (1)

  1. Human Media Interaction, University of Twente, Enschede, The Netherlands
  2. Centre for Research in Autism and Education, UCL Institute of Education, London, UK
  3. Serbian Society of Autism, Belgrade, Serbia
