
Machine Learning-Enabled Human Activity Recognition System for Humanoid Robot


Part of the Studies in Computational Intelligence book series (SCI, volume 960)

Abstract

Human activity recognition (HAR) systems play an important role in providing precise and timely information about people's activities and behaviors. HAR is applicable in many fields, including surveillance, human–computer interaction, health care, entertainment and robotics. There are two main streams of HAR systems: vision-based and sensor-based. Vision-based systems use cameras to capture images or videos from which human behavior is recognized, but capturing data with cameras is difficult due to limited coverage area, background noise, viewpoint, lighting and appearance variations. Sensor-based systems, on the other hand, use readily available wearable devices and smartphones to capture the data for HAR. In this context, it is worthwhile to explore an AI-enabled HAR application for predicting different human activities, such as walking, talking, standing and sitting, as well as more targeted activities like those performed in a room or on a factory floor. The sensor data may be recorded remotely, for example via video, radar or other wireless means; alternatively, it may be recorded directly on the subject, for example by wearing custom hardware or carrying smartphones equipped with accelerometers and gyroscopes.
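The sensor-based pipeline described above can be sketched in a few lines. The following Python example is a minimal illustration, not the chapter's actual method: raw accelerometer/gyroscope signals are split into fixed-length windows, simple statistical features are extracted per window, and a random-forest classifier is trained. The activity labels, window length and synthetic signals are assumptions standing in for a real smartphone IMU recording.

# Minimal sketch of sensor-based HAR: window 6-axis IMU signals,
# extract simple statistical features, train a classifier.
# Synthetic data stands in for a real accelerometer/gyroscope stream;
# the label set, window length and feature choice are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
ACTIVITIES = ["walking", "standing", "sitting"]   # hypothetical label set
WINDOW = 128                                      # samples per window (~2.56 s at 50 Hz)

def make_windows(n_per_class=200):
    """Generate synthetic 6-axis IMU windows (3-axis accel + 3-axis gyro)."""
    X, y = [], []
    for label, _act in enumerate(ACTIVITIES):
        for _ in range(n_per_class):
            # crude per-activity signature: different noise scale and a drift term
            base = rng.normal(0, 0.2 * (label + 1), size=(WINDOW, 6))
            base[:, 0] += np.sin(np.linspace(0, 4 * np.pi, WINDOW)) * label
            X.append(base)
            y.append(label)
    return np.stack(X), np.array(y)

def extract_features(windows):
    """Per-axis mean, std, min and max of each window -> flat feature vector."""
    feats = [windows.mean(axis=1), windows.std(axis=1),
             windows.min(axis=1), windows.max(axis=1)]
    return np.concatenate(feats, axis=1)

X_raw, y = make_windows()
X = extract_features(X_raw)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))

In practice the synthetic generator would be replaced by windows cut from a labeled recording (e.g. a public smartphone HAR dataset), and the hand-crafted features could be swapped for a learned representation such as a convolutional or recurrent neural network.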

Keywords

  • Artificial intelligence
  • Humanoid robot
  • Human activity recognition
  • Deep learning
  • Neural network




Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this chapter


Cite this chapter

Biswas, S., Islam, S.R. (2021). Machine Learning-Enabled Human Activity Recognition System for Humanoid Robot. In: Bianchini, M., Simic, M., Ghosh, A., Shaw, R.N. (eds) Machine Learning for Robotics Applications. Studies in Computational Intelligence, vol 960. Springer, Singapore. https://doi.org/10.1007/978-981-16-0598-7_2
