Identifying Engagement from Joint Kinematics Data for Robot Therapy Prompt Interventions for Children with Autism Spectrum Disorder

  • Bi Ge
  • Hae Won Park
  • Ayanna M. Howard
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9979)


Prompts are used by therapists to help children with autism spectrum disorder learn and acquire desirable skills and behaviors. As social robots are increasingly translated into similar therapy settings, a critical part of ensuring the effectiveness of these robot therapy systems is providing them with the ability to detect the engagement/disengagement states of the child, so that prompts can be delivered at the right time. In this paper, we examine features related to body movement that can be used to define engagement levels, and we develop a model based on these features for identifying engagement/disengagement states. The model was validated in a pilot study with child participants. Results show that our engagement model can achieve a recognition rate of 97%.


Robot therapy · Special needs · Kinematic assessment
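The approach described in the abstract, training a classifier on body-movement features to label engagement vs. disengagement, can be illustrated with a minimal sketch. This is not the authors' implementation: the feature set (e.g. mean joint velocity, torso lean, head-turn rate, hand-to-robot distance) and the synthetic data below are hypothetical placeholders, and the random-forest model is one plausible choice in scikit-learn, which the paper cites.

```python
# Hypothetical sketch: engagement/disengagement classification from
# per-window kinematic features using scikit-learn.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in for kinematic feature windows; each row holds four
# placeholder features (e.g. joint velocity, torso lean, head-turn rate,
# hand-to-robot distance). Engaged and disengaged windows are drawn from
# different distributions so the example is learnable.
n = 400
engaged = rng.normal(loc=[1.0, 0.2, 0.5, 0.3], scale=0.2, size=(n, 4))
disengaged = rng.normal(loc=[0.3, 0.8, 1.2, 1.0], scale=0.2, size=(n, 4))
X = np.vstack([engaged, disengaged])
y = np.array([1] * n + [0] * n)  # 1 = engaged, 0 = disengaged

# Hold out a test split, fit the classifier, and score recognition rate.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
```

On real data, the feature-extraction step (computing these statistics from joint trajectories) would replace the synthetic sampling above, and cross-validation over child participants would replace the single train/test split.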



Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, Georgia
