Human Activity Recognition with Convolutional Neural Networks

  • Antonio Bevilacqua (corresponding author)
  • Kyle MacDonald
  • Aamina Rangarej
  • Venessa Widjaya
  • Brian Caulfield
  • Tahar Kechadi
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11053)

Abstract

The problem of automatically identifying the physical activities performed by human subjects is referred to as Human Activity Recognition (HAR). Several techniques exist for measuring motion characteristics during these activities; among them, Inertial Measurement Units (IMUs) hold a cornerstone position, as they are flexible to use, low in cost, and have a limited privacy impact. With inertial sensors, it is possible to sample measures such as the acceleration and angular velocity of a body and use them to learn models capable of assigning activities to their correct classes. In this paper, we propose the use of Convolutional Neural Networks (CNNs) to classify human activities. Our models use raw data obtained from a set of inertial sensors. We explore several combinations of activities and sensors, showing how motion signals can be adapted to feed CNNs using different network architectures. We also compare the performance of different groups of sensors, investigating the classification potential of single, double, and triple sensor systems. The experimental results, obtained on a dataset of 16 lower-limb activities collected from a group of participants with five different sensors, are very promising.
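
As a rough illustration of the approach described in the abstract, the sketch below builds a small 1D CNN in TensorFlow/Keras that classifies fixed-length windows of raw multichannel IMU signals (acceleration and angular velocity) into 16 activity classes. This is not the authors' architecture: the window length, channel count, layer sizes, and training settings are illustrative assumptions, and the data used here is random dummy data standing in for segmented sensor windows.

# Minimal sketch (assumed setup, not the paper's exact model): a 1D CNN over
# windows of raw IMU signals, 16 output classes as in the abstract.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

WINDOW_LEN = 128   # samples per window (assumed)
N_CHANNELS = 30    # e.g. 5 IMUs x (3-axis accel + 3-axis gyro) (assumed)
N_CLASSES = 16     # lower-limb activities, as described in the abstract

def build_model():
    """Builds a small 1D CNN that maps IMU windows to activity classes."""
    model = models.Sequential([
        layers.Input(shape=(WINDOW_LEN, N_CHANNELS)),
        layers.Conv1D(32, kernel_size=5, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.Conv1D(64, kernel_size=5, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(N_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Dummy windows and integer activity labels in place of real IMU data.
    x = np.random.randn(256, WINDOW_LEN, N_CHANNELS).astype("float32")
    y = np.random.randint(0, N_CLASSES, size=256)
    model = build_model()
    model.fit(x, y, epochs=1, batch_size=32, verbose=0)
    print(model.predict(x[:4]).shape)  # (4, 16): per-window class probabilities

In practice, the channel dimension would be assembled by stacking the synchronized axes of the selected sensors, so comparing single, double, and triple sensor systems amounts to varying N_CHANNELS while keeping the rest of the pipeline unchanged.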

Keywords

Human activity recognition · CNN · Deep learning · Classification · IMU

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Antonio Bevilacqua (1), corresponding author
  • Kyle MacDonald (2)
  • Aamina Rangarej (2)
  • Venessa Widjaya (2)
  • Brian Caulfield (1)
  • Tahar Kechadi (1)

  1. Insight Centre for Data Analytics, UCD, Dublin, Ireland
  2. School of Public Health, Physiotherapy and Sports Science, UCD, Dublin, Ireland
