
Collaborative HRI and Machine Learning for Constructing Personalised Physical Exercise Databases

  • Daniel Delgado Bellamy
  • Praminda Caleb-Solly
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11649)

Abstract

Recent demographics indicate a growing population of older adults with increasingly complex care-related needs, and a shrinking care workforce with limited resources to support them. As a result, a large number of research initiatives are investigating the potential of intelligent robots in the domestic environment to augment the support that care-givers can provide and to improve older adults’ well-being, particularly by motivating them to stay fit and healthy through exercise. In this paper, we propose a robot-based coaching system which encourages collaboration with the user to collect person-specific, exercise-related movement data. The aim is to personalise exercise sessions and provide directed feedback that helps the user improve their performance. The way each individual performs specific movements will depend on their personal ability and range of motion, so it is important for a coaching system to recognise these movements and tailor its feedback accordingly. We show how a machine learning technique, a Nearest Neighbour classifier enhanced with a confidence metric, is used to build a personalised database of 3D skeletal tracking data. This approach, combined with collaborative Human-Robot Interaction to collect the data, could enable robust and adaptable exercise performance tracking by a collaborative robot coach, which uses the information to provide personalised feedback.
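
To make the approach concrete, the sketch below is a minimal Python illustration (not the authors' implementation) of a nearest-neighbour classifier with a simple confidence metric over flattened 3D skeleton frames: confident predictions are added to the personalised database automatically, while low-confidence frames are deferred to the collaborative step in which the robot would ask the user to confirm the movement. The class and method names, the distance-ratio confidence measure, the 0.7 threshold and the Kinect-style joint count are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (illustrative assumptions throughout): a 1-NN classifier over
# flattened 3D skeleton frames with a distance-ratio confidence metric.
import numpy as np

class PersonalisedExerciseDB:
    """Per-user store of labelled skeleton frames with a 1-NN classifier."""

    def __init__(self, confidence_threshold=0.7):
        self.samples = []                      # flattened skeleton frames
        self.labels = []                       # exercise label per frame
        self.confidence_threshold = confidence_threshold

    def add_sample(self, skeleton, label):
        """Store a confirmed frame, e.g. an (n_joints, 3) array of positions."""
        self.samples.append(np.asarray(skeleton, dtype=float).ravel())
        self.labels.append(label)

    def classify(self, skeleton):
        """Return (label, confidence); confidence derived from a distance ratio."""
        if not self.samples:
            return None, 0.0
        query = np.asarray(skeleton, dtype=float).ravel()
        dists = np.array([np.linalg.norm(query - s) for s in self.samples])
        nearest = int(np.argmin(dists))
        label, best = self.labels[nearest], dists[nearest]
        # Confidence: how much closer the match is than the nearest sample of
        # any *other* class (1.0 when only one class has been observed).
        other = [d for d, l in zip(dists, self.labels) if l != label]
        conf = 1.0 if not other else min(other) / (best + min(other) + 1e-9)
        return label, conf

    def observe(self, skeleton, label_hint=None):
        """Collaborative step: accept confident predictions, otherwise defer.

        Low-confidence frames are where the robot would ask the user to
        confirm the movement; here a label_hint (e.g. the exercise currently
        being coached) stands in for that confirmation.
        """
        label, conf = self.classify(skeleton)
        if label is not None and conf >= self.confidence_threshold:
            self.add_sample(skeleton, label)   # grow the personalised database
            return label, conf
        if label_hint is not None:
            self.add_sample(skeleton, label_hint)
            return label_hint, conf
        return None, conf


if __name__ == "__main__":
    db = PersonalisedExerciseDB()
    rng = np.random.default_rng(0)
    db.add_sample(rng.random((25, 3)), "arm_raise")    # 25 Kinect-style joints
    print(db.observe(rng.random((25, 3)), label_hint="arm_raise"))
```

In this sketch the confidence threshold decides when the system trusts its own recognition versus when it asks the user, which is one plausible way to realise the collaborative data-collection loop described in the abstract.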

Keywords

Human-Robot Interaction · Robot coaching · Assistive robots

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Bristol Robotics Laboratory, University of the West of England, Bristol, UK