
Performance-enhanced real-time lifestyle tracking model based on human activity recognition (PERT-HAR) model through smartphones

  • Published in: The Journal of Supercomputing

Abstract

Human activity recognition (HAR) is the process of identifying, representing, and categorizing human actions in an automated system through training and learning. Tracking systems capture and interpret human actions, ranging from the identification of simple movements to the comprehension of habits and routines. HAR has therefore found use in areas such as health care (with a special focus on the care of elderly patients), safety and surveillance, and applications designed for smart homes. HAR is enabled by sensing and visual devices, and sensors fall into several classes: sensors that can be worn, sensors attached to a target, and sensors embedded in the background environment. Automated learning methodologies for HAR are handcrafted, deep learning based, or a combination of both. Handcrafted models can be local or holistic recognition models, such as RGB, 3D mapping and skeleton-data models, while deep learning models are categorized into generative models such as long short-term memory (LSTM) networks, discriminative models such as convolutional neural networks (CNNs), or a synthesis of the two. Several datasets are available for undertaking HAR analysis and representation. The hierarchy of processes in HAR comprises data gathering, preliminary processing, feature derivation, and classification based on trained models. The proposed study considers the role of smartphones in HAR, with a particular interest in keeping tabs on the lifestyle of subjects. Smartphones act as HAR devices through their inbuilt sensors and custom-made applications, and the merits of both handcrafted and deep learning models are considered in framing a model that enables lifestyle tracking in real time.

This performance-enhanced real-time tracking human activity recognition (PERT-HAR) model is economical and effective in the accurate identification and representation of subjects' actions, and thereby provides more accurate data for real-time investigation and remedial measures. The model achieves an accuracy of 97–99% in a properly controlled environment.
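The processing hierarchy described in the abstract (data gathering, preliminary processing, feature derivation, and classification on trained models) can be sketched for smartphone accelerometer data. The sketch below is illustrative only, not the paper's implementation: the window size, overlap, and handcrafted features (mean, standard deviation, energy, zero crossings) are assumed values chosen for demonstration, and the simulated signal stands in for a real sensor stream.

```python
import numpy as np

def segment_windows(signal, window_size=128, overlap=0.5):
    """Preliminary processing: slide a fixed-size window over a 1-D stream."""
    step = int(window_size * (1 - overlap))
    return np.array([signal[i:i + window_size]
                     for i in range(0, len(signal) - window_size + 1, step)])

def extract_features(windows):
    """Feature derivation: handcrafted per-window statistics."""
    return np.column_stack([
        windows.mean(axis=1),                               # mean amplitude
        windows.std(axis=1),                                # variability
        (windows ** 2).mean(axis=1),                        # signal energy
        (np.diff(np.sign(windows), axis=1) != 0).sum(axis=1),  # zero crossings
    ])

# Data gathering stand-in: 512 samples of simulated accelerometer output.
rng = np.random.default_rng(0)
stream = np.sin(np.linspace(0, 20 * np.pi, 512)) + 0.1 * rng.standard_normal(512)

windows = segment_windows(stream)     # 7 windows of 128 samples at 50% overlap
features = extract_features(windows)  # 7 x 4 feature matrix for a classifier
```

The resulting feature matrix would feed the final classification stage (e.g. an LSTM or CNN trained on labeled activity windows, as the abstract surveys).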





Author information


Correspondence to A. Alice Nithya.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Ishwarya, K., Alice Nithya, A. Performance-enhanced real-time lifestyle tracking model based on human activity recognition (PERT-HAR) model through smartphones. J Supercomput 78, 5241–5268 (2022). https://doi.org/10.1007/s11227-021-04065-z

