Abstract
Human activity recognition (HAR) systems play an impactful role in providing precise and timely information on people's activities and behaviors. HAR is applicable in many diverse fields such as surveillance, human–computer interaction, health care, entertainment and robotics. There are two main streams of HAR systems: vision-based and sensor-based. Vision-based systems use cameras to capture images or videos from which human behavior is recognized, but capturing data with cameras is difficult due to limited coverage area, background noise, viewpoint, lighting and appearance variations. Sensor-based systems, on the other hand, use readily available wearable devices and smartphones to capture the data for HAR. In this context, an AI-enabled HAR application can be explored for predicting different human activities, such as walking, talking, standing and sitting. These may even be more targeted activities, such as those performed in a room or on a factory floor. The sensor data may be recorded remotely, for example by video, radar or other wireless methods, or directly on the subject, for example with custom hardware or smartphones equipped with accelerometers and gyroscopes.
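As a minimal illustration of the sensor-based stream described above, the sketch below windows tri-axial accelerometer data, extracts simple statistical features per window and assigns each window to an activity with a nearest-centroid rule. It is illustrative only: the synthetic signals, window size, feature set and activity labels are assumptions for the sketch, not the method presented in this chapter.

```python
import numpy as np

def extract_features(window):
    """Mean and standard deviation per accelerometer axis (6 features)."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def make_windows(signal, size=50):
    """Split a (T, 3) tri-axial signal into non-overlapping (size, 3) windows."""
    n = len(signal) // size
    return [signal[i * size:(i + 1) * size] for i in range(n)]

rng = np.random.default_rng(0)
# Synthetic stand-ins: "standing" is low-variance noise, "walking" oscillates.
t = np.arange(500)
standing = rng.normal(0.0, 0.05, (500, 3))
walking = np.sin(t / 5.0)[:, None] + rng.normal(0.0, 0.2, (500, 3))

# Build a labeled feature set from the windowed signals.
X, y = [], []
for label, sig in (("standing", standing), ("walking", walking)):
    for w in make_windows(sig):
        X.append(extract_features(w))
        y.append(label)
X = np.array(X)

# Nearest-centroid classifier: one mean feature vector per activity.
centroids = {lab: X[[i for i, l in enumerate(y) if l == lab]].mean(axis=0)
             for lab in ("standing", "walking")}

def predict(window):
    f = extract_features(window)
    return min(centroids, key=lambda lab: np.linalg.norm(f - centroids[lab]))

print(predict(rng.normal(0.0, 0.05, (50, 3))))  # low-variance window -> standing
```

In practice the hand-crafted mean/std features and the centroid rule would be replaced by the deep learning models the chapter's keywords point to, with features learned directly from the raw accelerometer and gyroscope streams.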
Keywords
- Artificial intelligence
- Humanoid robot
- Human activity recognition
- Deep learning
- Neural network
Copyright information
© 2021 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this chapter
Biswas, S., Islam, S.R. (2021). Machine Learning-Enabled Human Activity Recognition System for Humanoid Robot. In: Bianchini, M., Simic, M., Ghosh, A., Shaw, R.N. (eds) Machine Learning for Robotics Applications. Studies in Computational Intelligence, vol 960. Springer, Singapore. https://doi.org/10.1007/978-981-16-0598-7_2
Print ISBN: 978-981-16-0597-0
Online ISBN: 978-981-16-0598-7
eBook Packages: Intelligent Technologies and Robotics (R0)