
Sensor-Based Human Activity and Behavior Computing

Chapter
Part of the Intelligent Systems Reference Library book series (ISRL, volume 207)

Abstract

A major goal of human activity and behavior recognition (HAR) is to recognize activities and behaviors from series of action data collected from different subjects under different environmental conditions. Wearables, smart devices, and vision-based systems enable the collection of human action and behavior data for healthcare, rehabilitation, elderly care, and monitoring. This chapter surveys the state of the art in sensor-based activity and behavior recognition. First, we present a general HAR architecture, describing the primary data sources and pre-processing techniques. We then describe robust feature extraction and selection strategies, with a comparative analysis of statistical and deep learning-based models. In addition, we provide information about more than 100 benchmark datasets and repositories in this domain. The chapter concludes with a discussion of major issues and challenges, highlighting open problems that prospective research needs to address.
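To make the pipeline described above concrete, the following is a minimal illustrative sketch (not the chapter's method) of a typical sensor-based HAR workflow: segmenting a raw tri-axial accelerometer stream into sliding windows, extracting simple statistical features, and training a classifier. It assumes NumPy and scikit-learn are available; the synthetic signals, window parameters, and function names are all hypothetical choices for demonstration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def sliding_windows(signal, win_len, step):
    """Segment a (samples, channels) signal into overlapping windows."""
    starts = range(0, len(signal) - win_len + 1, step)
    return np.stack([signal[s:s + win_len] for s in starts])

def statistical_features(windows):
    """Per-window, per-channel mean, std, min, max -> feature matrix."""
    feats = [windows.mean(axis=1), windows.std(axis=1),
             windows.min(axis=1), windows.max(axis=1)]
    return np.concatenate(feats, axis=1)

# Synthetic tri-axial accelerometer traces for two activities (illustrative only)
rng = np.random.default_rng(0)
t = np.arange(5000) / 50.0                       # 50 Hz sampling, 100 s
walking = np.column_stack([np.sin(2 * np.pi * 2 * t + p) for p in (0, 1, 2)])
walking += 0.3 * rng.standard_normal(walking.shape)   # periodic + noise
sitting = 0.05 * rng.standard_normal(walking.shape)   # near-static

# 128-sample windows with 50% overlap, as commonly used in HAR studies
X_walk = statistical_features(sliding_windows(walking, 128, 64))
X_sit = statistical_features(sliding_windows(sitting, 128, 64))
X = np.vstack([X_walk, X_sit])
y = np.array([1] * len(X_walk) + [0] * len(X_sit))

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
```

In practice, the hand-crafted feature step is replaced by learned representations in the deep learning-based models compared later in the chapter, but the windowing front-end remains largely the same.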

References

  1. 1.
    Wang, H., Zhao, J., Li, J., Tian, L., Tu, P., Cao, T., An, Y., Wang, K., Li, S.: Wearable sensor-based human activity recognition using hybrid deep learning techniques. Secur. Commun. Netw. 2020 (2020)Google Scholar
  2. 2.
    Dang, L.M., Min, K., Wang, H., Piran, Md.J., Lee, C.H., Moon, H.: Sensor-based and vision-based human activity recognition: a comprehensive survey. Pattern Recog. 107561 (2020)Google Scholar
  3. 3.
    Ahad, M.A.R., Antar, A.D., Ahmed, M.: IoT Sensor-Based Activity Recognition. Springer (2020)Google Scholar
  4. 4.
    Ahad, M.A.R., Antar, A.D., Shahid, O.: Vision-based action understanding for assistive healthcare: a short review. In: CVPR Workshops, pp. 1–11 (2019)Google Scholar
  5. 5.
    Abdel-Salam, R., Mostafa, R., Hadhood, M.: Human activity recognition using wearable sensors: review, challenges, evaluation benchmark. arXiv preprint arXiv:2101.01665 (2021)
  6. 6.
    Abdul Lateef Haroon, P.S., et al.: Human activity recognition using machine learning approach. J. Robot. Control (JRC) 2(5), 395–399 (2021)Google Scholar
  7. 7.
    Zhou, X., Liang, W., Kevin, I., Wang, K., Wang, H., Yang, L.T., Jin, Q.: Deep-learning-enhanced human activity recognition for internet of healthcare things. IEEE Internet of Things J. 7(7), 6429–6438 (2020)Google Scholar
  8. 8.
    Meng, L., Zhang, A., Chen, C., Wang, X., Jiang, X., Tao, L., Fan, J., Wu, X., Dai, C., Zhang, Y., et al.: Exploration of human activity recognition using a single sensor for stroke survivors and able-bodied people. Sensors 21(3), 799 (2021)CrossRefGoogle Scholar
  9. 9.
    Sankar, S., Srinivasan, P., Saravanakumar, R.: Internet of things based ambient assisted living for elderly people health monitoring. Res. J. Pharm. Technol. 11(9), 3900–3904 (2018)CrossRefGoogle Scholar
  10. 10.
    Zhang, W., Caixia, S., He, C.: Rehabilitation exercise recognition and evaluation based on smart sensors with deep learning framework. IEEE Access 8, 77561–77571 (2020)CrossRefGoogle Scholar
  11. 11.
    Schrader, L., Toro, A.V., Konietzny, S., Rüping, S., Schäpers, B., Steinböck, M., Krewer, C., Müller, F., Güttler, J., Bock, T.: Advanced sensing and human activity recognition in early intervention and rehabilitation of elderly people. J. Population Ageing 1–27 (2020)Google Scholar
  12. 12.
    Irvine, N., Nugent, C., Zhang, S., Wang, H., Ng, W.W.Y.: Neural network ensembles for sensor-based human activity recognition within smart environments. Sensors 20(1), 216 (2020)CrossRefGoogle Scholar
  13. 13.
    Fahad, L.G., Tahir, S.F.: Activity recognition and anomaly detection in smart homes. Neurocomputing 423, 362–372 (2021)CrossRefGoogle Scholar
  14. 14.
    Batchuluun, G., Kim, J.H., Hong, H.G., Kang, J.K., Park, K.R.: Fuzzy system based human behavior recognition by combining behavior prediction and recognition. Expert Syst. Appl. 81, 108–133 (2017)CrossRefGoogle Scholar
  15. 15.
    Pirbhulal, S., Wu, W., Muhammad, K., Mehmood, I., Li, G.: de Albuquerque VHC: mobility enabled security for optimizing IoT based intelligent applications. IEEE Netw. 34(2), 72–77 (2020)CrossRefGoogle Scholar
  16. 16.
    Abdulla, A.I., Abdulraheem, A.S., Salih, A.A., Sadeeq, M.A.M., Ahmed, A.J., Ferzor, B.M., Sardar, O.S., Mohammed, S.I.: Internet of things and smart home security. Technol. Rep. Kansai Univ. 62(5), 2465–2476 (2020)Google Scholar
  17. 17.
    Shi, J., Zuo, D., Zhang, Z.: An energy-efficient human activity recognition system based on smartphones. In: 2020 7th International Conference on Soft Computing & Machine Intelligence (ISCMI), pp. 177–181. IEEE (2020)Google Scholar
  18. 18.
    Rawat, K.: Human activity recognition based on energy efficient schemes. Master’s thesis, University of Twente (2020)Google Scholar
  19. 19.
    Tarafdar, P., Bose, I.: Recognition of human activities for wellness management using a smartphone and a smartwatch: a boosting approach. Decis. Support Syst. 140 (2021)Google Scholar
  20. 20.
    Ishii, S., Yokokubo, A., Luimula, M., Lopez, G.: Exersense: physical exercise recognition and counting algorithm from wearables robust to positioning. Sensors 21(1), 91 (2021)CrossRefGoogle Scholar
  21. 21.
    Martindale, C.F., Christlein, V., Klumpp, P., Eskofier, B.M.: Wearables-based multi-task gait and activity segmentation using recurrent neural networks. Neurocomputing 432, 250–261 (2021)CrossRefGoogle Scholar
  22. 22.
    Ngo, T.T., Ahad, M.A.R., Antar, A.D., Ahmed, M., Muramatsu, D., Makihara, Y., Yagi, Y., Inoue, S., Hossain, T., Hattori, Y.: OU-ISIR wearable sensor-based gait challenge: age and gender. In: 2019 International Conference on Biometrics (ICB), pp. 1–6. IEEE (2019)Google Scholar
  23. 23.
    Ahad, M.A.R., Ngo, T.T., Antar, A.D., Ahmed, M., Hossain, T., Muramatsu, D., Makihara, Y., Inoue, S., Yagi, Y.: Wearable sensor-based gait analysis for age and gender estimation. Sensors 20(8), 2424 (2020)Google Scholar
  24. 24.
    Antar, A.D., Ahmed, M., Ishrak, M.S., Ahad, M.A.R.: A comparative approach to classification of locomotion and transportation modes using smartphone sensor data. In: Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers, pp. 1497–1502 (2018)Google Scholar
  25. 25.
    Ahmed, M., Antar, A.D., Hossain, T., Inoue, S., Ahad, M.A.R.: POIDEN: position and orientation independent deep ensemble network for the classification of locomotion and transportation modes. In: Adjunct Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers, pp. 674–679 (2019)Google Scholar
  26. 26.
    Friedrich, B., Lübbe, C., Hein, A.: Analyzing the importance of sensors for mode of transportation classification. Sensors 21(1), 176 (2021)CrossRefGoogle Scholar
  27. 27.
    Wang, L., Gjoreski, H., Ciliberto, M., Lago, P., Murao, K., Okita, T., Roggen, D.: Summary of the Sussex-Huawei locomotion-transportation recognition challenge 2020. In: Adjunct Proceedings of the 2020 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2020 ACM International Symposium on Wearable Computers, pp. 351–358 (2020)Google Scholar
  28. 28.
    Liang, S.H.L., Saeedi, S., Ojagh, S., Honarparvar, S., Kiaei, S., Jahromi, M.M., Squires, J.: An interoperable architecture for the internet of Covid-19 things (IoCT) using open geospatial standards-case study: workplace reopening. Sensors 21(1), 50 (2021)CrossRefGoogle Scholar
  29. 29.
    Magesh, S., Niveditha, V.R., Rajakumar, P.S., Natrayan, L., et al.: Pervasive computing in the context of Covid-19 prediction with AI-based algorithms. Int. J. Pervasive Comput. Commun. (2020)Google Scholar
  30. 30.
    Zhang, H., Cai, Y., Zhang, H., Leung, C.: A hybrid framework for smart and safe working environments in the era of Covid-19. Int. J. Inf. Technol. 26(1) (2020)Google Scholar
  31. 31.
    Smith, P.D., Bedford, A.: Automatic classification of locomotion in sport: a case study from elite netball. Int. J. Comput. Sci. Sport 19(2), 1–20 (2020)CrossRefGoogle Scholar
  32. 32.
    Franco, A., Magnani, A., Maio, D.: A multimodal approach for human activity recognition based on skeleton and RGB data. Pattern Recogn. Lett. 131, 293–299 (2020)CrossRefGoogle Scholar
  33. 33.
    Shaikh, M.B., Chai, D.: RGB-D data-based action recognition: a review (2021)Google Scholar
  34. 34.
    Tavakoli, M., Carriere, J., Torabi, A.: Robotics, smart wearable technologies, and autonomous intelligent systems for healthcare during the Covid-19 pandemic: an analysis of the state of the art and future vision. Adv. Intell. Syst. 2(7), 2000071 (2020)CrossRefGoogle Scholar
  35. 35.
    Formica, D., Schena, E.: Smart sensors for healthcare and medical applications (2021)Google Scholar
  36. 36.
    Demrozi, F., Pravadelli, G., Bihorac, A., Rashidi, P.: Human activity recognition using inertial, physiological and environmental sensors: a comprehensive survey. IEEE Access (2020)Google Scholar
  37. 37.
    Aggarwal, J.K., Xia, L.: Human activity recognition from 3D data: a review. Pattern Recogn. Lett. 48, 70–80 (2014)CrossRefGoogle Scholar
  38. 38.
    Weinland, D., Özuysal, M., Fua, P.: Making action recognition robust to occlusions and viewpoint changes. In: European Conference on Computer Vision, pp. 635–648. Springer (2010)Google Scholar
  39. 39.
    Ahad, M.A.R.: Computer Vision and Action Recognition: A Guide for Image Processing and Computer Vision Community for Action Understanding, vol. 5. Springer (2011)Google Scholar
  40. 40.
    Holte, M.B., Moeslund, T.B., Nikolaidis, N., Pitas, I.: 3D human action recognition for multi-view camera systems. In: 2011 International Conference on 3D Imaging, Modeling, Processing, Visualization and Transmission, pp. 342–349. IEEE (2011)Google Scholar
  41. 41.
    Chen, M., Hauptmann, A.: Mosift: recognizing human actions in surveillance videos (2009)Google Scholar
  42. 42.
    Donahue, J., Hendricks, L.A., Guadarrama, S., Rohrbach, M., Venugopalan, S., Saenko, K., Darrell, T.: Long-term recurrent convolutional networks for visual recognition and description. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 2625–2634 (2015)Google Scholar
  43. 43.
    Poppe, R.: A survey on vision-based human action recognition. Image Vis. Comput. 28(6), 976–990 (2010)CrossRefGoogle Scholar
  44. 44.
    Liu, L., Peng, Y., Liu, M., Huang, Z.: Sensor-based human activity recognition system with a multilayered model using time series shapelets. Knowl.-Based Syst. 90, 138–152 (2015)CrossRefGoogle Scholar
  45. 45.
    Kanjo, E., Younis, E.M.G., Ang, C.S.: Deep learning analysis of mobile physiological, environmental and location sensor data for emotion detection. Inf. Fusion 49, 46–56 (2019)CrossRefGoogle Scholar
  46. 46.
    Mathie, M.: Monitoring and interpreting human movement patterns using a triaxial accelerometer. University of New South Wales Sydney (2003)Google Scholar
  47. 47.
    Huang, M., Zhao, G., Wang, L., Yang, F.: A pervasive simplified method for human movement pattern assessing. In: 2010 IEEE 16th International Conference on Parallel and Distributed Systems (ICPADS), pp. 625–628. IEEE (2010)Google Scholar
  48. 48.
    Liu, R., Zhou, J., Liu, M., Hou, X.: A wearable acceleration sensor system for gait recognition. In: 2nd IEEE Conference on Industrial Electronics and Applications, ICIEA 2007, pp. 2654–2659. IEEE (2007)Google Scholar
  49. 49.
    Wen, T., Wang, L., Gu, J., Huang, B.: An acceleration-based control framework for interactive gaming. In: Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBC 2009, pp. 2388–2391. IEEE (2009)Google Scholar
  50. 50.
    Antar, A.D., Ahmed, M., Ahad, M.A.R.: Challenges in sensor-based human activity recognition and a comparative analysis of benchmark datasets: a review. In: 2019 Joint 8th International Conference on Informatics, Electronics & Vision (ICIEV) and 2019 3rd International Conference on Imaging, Vision & Pattern Recognition (icIVPR), pp. 134–139. IEEE (2019)Google Scholar
  51. 51.
    Ahmed, M., Antar, A.D., Ahad, M.A.R.: An approach to classify human activities in real-time from smartphone sensor data. In: 2019 Joint 8th International Conference on Informatics, Electronics & Vision (ICIEV) and 2019 3rd International Conference on Imaging, Vision & Pattern Recognition (icIVPR), pp. 140–145. IEEE (2019)Google Scholar
  52. 52.
    Zhang, M., Sawchuk, A.A.: A feature selection-based framework for human activity recognition using wearable multimodal sensors. In: BodyNets, pp. 92–98 (2011)Google Scholar
  53. 53.
    Wang, A., Chen, G., Xi, W., Liu, L., An, N., Chang, C.-Y.: Towards human activity recognition: a hierarchical feature selection framework. Sensors 18(11), 3629 (2018)CrossRefGoogle Scholar
  54. 54.
    Wang, Z., Wu, D., Chen, J., Ghoneim, A., Hossain, M.A.: A triaxial accelerometer-based human activity recognition via EEMD-based features and game-theory-based feature selection. IEEE Sens. J. 16(9), 3198–3207 (2016)CrossRefGoogle Scholar
  55. 55.
    Zhang, L., Wu, X., Luo, D.: Real-time activity recognition on smartphones using deep neural networks. In: 2015 IEEE 12th International Conference on Ubiquitous Intelligence and Computing and 2015 IEEE 12th International Conference on Autonomic and Trusted Computing and 2015 IEEE 15th International Conference on Scalable Computing and Communications and Its Associated Workshops (UIC-ATC-ScalCom), pp. 1236–1242. IEEE (2015)Google Scholar
  56. 56.
    Bengio, Y.: Deep learning of representations: looking forward. In: International Conference on Statistical Language and Speech Processing, pp. 1–37. Springer (2013)Google Scholar
  57. 57.
    Cook, D., Feuz, K.D., Krishnan, N.C.: Transfer learning for activity recognition: a survey. Knowl. Inf. Syst. 36(3), 537–556 (2013)CrossRefGoogle Scholar
  58. 58.
    Ijjina, E.P., Chalavadi, K.M.: Human action recognition in RGB-D videos using motion sequence information and deep learning. Pattern Recogn. 72, 504–516 (2017)CrossRefGoogle Scholar
  59. 59.
    Seyfioğlu, M.S., Özbayoğlu, A.M., Gürbüz, S.Z.: Deep convolutional autoencoder for radar-based classification of similar aided and unaided human activities. IEEE Trans. Aerosp. Electron. Syst. 54(4), 1709–1723 (2018)CrossRefGoogle Scholar
  60. 60.
    Nguyen, T.N., Lee, S., Nguyen-Xuan, H., Lee, J.: A novel analysis-prediction approach for geometrically nonlinear problems using group method of data handling. Comput. Methods Appl. Mech. Eng. 354, 506–526 (2019)MathSciNetzbMATHCrossRefGoogle Scholar
  61. 61.
    Hossain, H.M.S., Al Hafiz Khan, Md.A., Roy, N.:. Active learning enabled activity recognition. Pervasive Mob. Comput. 38, 312–330 (2017)Google Scholar
  62. 62.
    Grzeszick, R., Lenk, J.M., Rueda, F.M., Fink, G.A., Feldhorst, S., ten Hompel, M.: Deep neural network based human activity recognition for the order picking process. In: Proceedings of the 4th International Workshop on Sensor-Based Activity Recognition and Interaction, pp. 1–6 (2017)Google Scholar
  63. 63.
    Gil-Martin, M., San-Segundo, R., Fernandez-Martinez, F., Ferreiros-López, J.: Improving physical activity recognition using a new deep learning architecture and post-processing techniques. Eng. Appl. Artif. Intell. 92 (2020)Google Scholar
  64. 64.
    Kwon, H., Tong, C., Haresamudram, H., Gao, Y., Abowd, G.D., Lane, N.D., Ploetz, T.: IMUTube: automatic extraction of virtual on-body accelerometry from video for human activity recognition. Proc. ACM Interact. Mob. Wearable Ubiquit. Technol. 4(3), 1–29 (2020)CrossRefGoogle Scholar
  65. 65.
    Rafiuddin, N., Khan, Y.U., Farooq, O.: Feature extraction and classification of EEG for automatic seizure detection. In: 2011 International Conference on Multimedia, Signal Processing and Communication Technologies (IMPACT), pp. 184–187. IEEE (2011)Google Scholar
  66. 66.
    Hossain, T., Goto, H., Ahad, M.A.R., Inoue, S.: A study on sensor-based activity recognition having missing data. In: 2018 Joint 7th International Conference on Informatics, Electronics & Vision (ICIEV) and 2018 2nd International Conference on Imaging, Vision & Pattern Recognition (icIVPR), pp. 556–561. IEEE (2018)Google Scholar
  67. 67.
    Saha, S.S., Rahman, S., Rasna, M.J., Zahid, T.B., Mahfuzul Islam, A.K.M., Ahad, M.A.R.: Feature extraction, performance analysis and system design using the du mobility dataset. IEEE Access 6, 44776–44786 (2018)Google Scholar
  68. 68.
    Saha, S.S., Rahman, S., Rasna, M.J., Hossain, T., Inoue, S., Ahad, M.A.R.: Supervised and neural classifiers for locomotion analysis. In: Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers, pp. 1563–1570 (2018)Google Scholar
  69. 69.
    Hossain, T., Islam, Md.S., Ahad, M.A.R., Inoue, S.: Human activity recognition using earable device. In: Adjunct Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers, pp. 81–84 (2019)Google Scholar
  70. 70.
    Hossain, T., Ahad, M.A.R., Tazin, T., Inoue, S.: Activity recognition by using lorawan sensor. In: Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers, pp. 58–61 (2018)Google Scholar
  71. 71.
    Veltink, P.H., Bussmann, H.B.J., De Vries, W., Martens, W.L.J., Van Lummel, R.C.: Detection of static and dynamic activities using uniaxial accelerometers. IEEE Trans. Rehabil. Eng. 4(4), 375–385 (1996)Google Scholar
  72. 72.
    Pirttikangas, S., Fujinami, K., Nakajima, T.: Feature selection and activity recognition from wearable sensors. In: International Symposium on Ubiquitious Computing Systems, pp. 516–527. Springer (2006)Google Scholar
  73. 73.
    Lavanya, B., Gayathri, G.S.: Exploration and deduction of sensor-based human activity recognition system of smart-phone data. In: 2017 IEEE International Conference on Computational Intelligence and Computing Research (ICCIC), pp. 1–5. IEEE (2017)Google Scholar
  74. 74.
    Bulling, A., Blanke, U., Schiele, B.: A tutorial on human activity recognition using body-worn inertial sensors. ACM Comput. Surv. (CSUR) 46(3), 1–33 (2014)CrossRefGoogle Scholar
  75. 75.
    Casale, P., Pujol, O., Radeva, P.: Human activity recognition from accelerometer data using a wearable device. In: Iberian Conference on Pattern Recognition and Image Analysis, pp. 289–296. Springer (2011)Google Scholar
  76. 76.
    Foerster, F., Smeja, M., Fahrenberg, J.: Detection of posture and motion by accelerometry: a validation study in ambulatory monitoring. Comput. Hum. Behav. 15, 571–583 (1999)CrossRefGoogle Scholar
  77. 77.
    Preece, S.J., Goulermas, J.Y., Kenney, L.P.J., Howard, D., Meijer, K., Crompton, R.: Physiological MeasurementGoogle Scholar
  78. 78.
    Englehart, K., Hudgins, B., Parker, P., Stevenson, M.: Time-frequency representation for classification of the transient myoelectric signal. In: Proceedings of the 20th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 20, Biomedical Engineering Towards the Year 2000 and Beyond (Cat. No.98CH36286), vol. 5, pp. 2627–2630 (1998)Google Scholar
  79. 79.
    Bao, L., Intille, S.S.: Activity recognition from user-annotated acceleration data. In: Pervasive Computing, pp. 158–175. Springer, Heidelberg (2004)Google Scholar
  80. 80.
    Nyan, M.N., Tay, F.E.H., Seah, K.H.W., Sitoh, Y.Y.: Classification of gait patterns in the time-frequency domain. J. Biomech. 39(14), 2647–2656 (2006)CrossRefGoogle Scholar
  81. 81.
    Najafi, B., Aminian, K., Paraschiv-Ionescu, A., Loew, F., Bula, C.J., Robert, P.: Ambulatory system for human motion analysis using a kinematic sensor: monitoring of daily physical activity in the elderly. IEEE Trans. Biomed. Eng. 50(6), 711–723 (2003)CrossRefGoogle Scholar
  82. 82.
    Sekine, M., Tamura, T., Togawa, T., Fukui, Y.: Classification of waist-acceleration signals in a continuous walking record. Med. Eng. Phys. 22(4), 285–291 (2000)CrossRefGoogle Scholar
  83. 83.
    Mantyjarvi, J., Himberg, J., Seppanen, T.: Recognizing human motion with multiple acceleration sensors. In: IEEE International Conference on Systems, Man, and Cybernetics, vol. 2, pp. 747–752. IEEE (2001)Google Scholar
  84. 84.
    Hira, Z.M., Gillies, D.F.: A review of feature selection and feature extraction methods applied on microarray data. Adv. Bioinform. 2015 2015Google Scholar
  85. 85.
    Guyon, I., Elisseeff, A.: An introduction to variable and feature selection. J. Mach. Learn. Res. 3(Mar), 1157–1182 (2003)zbMATHGoogle Scholar
  86. 86.
    Miao, J., Niu, L.: A survey on feature selection. Procedia Comput. Sci. 91, 919–926 (2016)CrossRefGoogle Scholar
  87. 87.
    Forman, G.: An extensive empirical study of feature selection metrics for text classification. J. Mach. Learn. Res. 3(Mar), 1289–1305 (2003)zbMATHGoogle Scholar
  88. 88.
    Chen, X., Jeong, J.C.: Enhanced recursive feature elimination. In: Sixth International Conference on Machine Learning and Applications (ICMLA 2007), pp. 429–435. IEEE (2007)Google Scholar
  89. 89.
    Talenti, L., Luck, M., Yartseva, A., Argy, N., Houzé, S., Damon, C.: L1 logistic regression as a feature selection step for training stable classification trees for the prediction of severity criteria in imported malaria. arXiv preprint arXiv:1511.06663 (2015)
  90. 90.
    van der Maaten, L., Hinton, G.: Visualizing data using t-SNE. J. Mach. Learn. Res. 9(Nov), 2579–2605 (2008)zbMATHGoogle Scholar
  91. 91.
    McInnes, L., Healy, J., Melville, J.: UMAP: uniform manifold approximation and projection for dimension reduction. arXiv preprint arXiv:1802.03426 (2018)
  92. 92.
    Ali, M.U., Ahmed, S., Ferzund, J., Mehmood, A., Rehman, A.: Using PCA and factor analysis for dimensionality reduction of bio-informatics data. arXiv preprint arXiv:1707.07189 (2017)
  93. 93.
    Shi, H., Yin, B., Zhang, X., Kang, Y., Lei, Y.: A landmark selection method for l-isomap based on greedy algorithm and its application. In: 2015 54th IEEE Conference on Decision and Control (CDC), pp. 7371–7376. IEEE (2015)Google Scholar
  94. 94.
    Hall, M.A.: Correlation-based feature selection for machine learning (1999)Google Scholar
  95. 95.
    Baranauskas, J.A., Netto, O.P., Nozawa, S.R., Macedo, A.A.: A tree-based algorithm for attribute selection. Appl. Intell. 48(4), 821–833 (2018)CrossRefGoogle Scholar
  96. 96.
    He, Z., Jin, L.: Activity recognition from acceleration data based on discrete consine transform and SVM. In: IEEE International Conference on Systems, Man and Cybernetics, SMC 2009, pp. 5041–5044. IEEE (2009)Google Scholar
  97. 97.
    He, Z.-Y., Jin, L.-W.: Activity recognition from acceleration data using AR model representation and SVM. In: International Conference on Machine Learning and Cybernetics, vol. 4, pp. 2245–2250. IEEE (2008)Google Scholar
  98. 98.
    The aware home. http://awarehome.imtc.gatech.edu. Accessed 17 Mar 2021
  99. 99.
    Ahad, M.A.R.: Motion History Images for Action Recognition and Understanding. Springer (2012)Google Scholar
  100. 100.
    Hossain, T., Islam, Md.S., Ahad, M.A.R., Inoue, S.: Human activity recognition using earable device. In: Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers, pp. 81–84. ACM (2019)Google Scholar
  101. 101.
    Hossain, T., Ahad, M.A.R., Tazin, T., Inoue, S.: Activity recognition by using lorawan sensor. In: 2018 ACM International Joint Conference on Pervasive and Ubiquitous Computing and the 2018 International Symposium on Wearable Computers (UbiComp/ISWC) (2018)Google Scholar
  102. 102.
    Ahmed, M., Antar, A.D., Ahad, M.A.R.: An approach to classify human activities in real-time from smartphone sensor data. In: 2019 Joint 8th International Conference on Informatics, Electronics Vision (ICIEV) and 2019 3rd International Conference on Imaging, Vision Pattern Recognition (icIVPR), pp. 140–145 (2019)Google Scholar
  103. 103.
    Zheng, Y., Wong, W.-K., Guan, X., Trost, S.: Physical activity recognition from accelerometer data using a multi-scale ensemble method. In: Twenty-Fifth Annual Conference on Innovative Applications of Artificial Intelligence. IAAI (2013)Google Scholar
  104. 104.
    Reyes-Ortiz, J.-L., Oneto, L., Ghio, A., Sama, A., Anguita, D., Parra, X.: Human activity recognition on smartphones with awareness of basic activities and postural transitions. In: International Conference on Artificial Neural Networks, pp. 177–184 (2014)Google Scholar
  105. 105.
    Bahle, G., Gruenerbl, A., Lukowicz, P., Bignotti, E., Zeni, M., Giunchiglia, F.: Recognizing hospital care activities with a coat pocket worn smartphone. In: 6th International Conference on Mobile Computing, Applications and Services (MobiCASE), pp. 175–181. IEEE (2014)Google Scholar
  106. 106.
    Kawsar, F., Min, C., Mathur, A., Montanari, A.: Earables for personal-scale behavior analytics. IEEE Pervasive Comput. 17(3), 83–89 (2018)CrossRefGoogle Scholar
  107. 107.
    Tapia, E.M., Marmasse, N., Intille, S.S., Larson, K.: Mites: wireless portable sensors for studying behavior. In: Proceedings of Extended Abstracts Ubicomp 2004: Ubiquitous Computing (2004)Google Scholar
  108. 108.
    Mica2dot wireless microsensor mote. https://www.willow.co.uk/mpr5x0-_mica2dot_series.php. Accessed 22 Mar 2019
  109. 109.
    Kling, R.M., et al.: Intel mote: an enhanced sensor network node. In: International Workshop on Advanced Sensors, Structural Health Monitoring, and Smart Structures, pp. 12–17 (2003)Google Scholar
  110. 110.
  111. 111.
    De-La-Hoz-Franco, E., Ariza-Colpas, P., Quero, J.M., Espinilla, M.: Sensor-based datasets for human activity recognition-a systematic review of literature. IEEE Access 6, 59192–59210 (2018)CrossRefGoogle Scholar
  112. 112.
    Lichman. UCI machine learning repository (2013). http://archive.ics.uci.edu/ml. Accessed 14 Feb 2021
  113. 113.
    Blunck, H., Bhattacharya, S., Stisen, A., Prentow, T.S., Kjærgaard, M.B., Dey, A., Jensen, M.M., Sonne, T.: Activity recognition on smart devices: dealing with diversity in the wild. GetMobile: Mob. Comput. Commun. 20(1), 34–38 (2016)Google Scholar
  114. 114.
    Torres, R.L.S., Ranasinghe, D.C., Shi, Q., Sample, A.P.: Sensor enabled wearable RFID technology for mitigating the risk of falls near beds. In: 2013 IEEE International Conference on RFID (RFID), pp. 191–198. IEEE (2013)Google Scholar
  115. 115.
    Palumbo, F., Gallicchio, C., Pucci, R., Micheli, A.: Human activity recognition using multisensor data fusion based on reservoir computing. J. Ambient Intell. Smart Environ. 8(2), 87–107 (2016)CrossRefGoogle Scholar
  116. 116.
    Anguita, D., Ghio, A., Oneto, L., Parra, X., Reyes-Ortiz, J.L.: A public domain dataset for human activity recognition using smartphones. In: ESANN (2013)Google Scholar
  117. 117.
    Reyes-Ortiz, J.-L., Oneto, L., Samà, A., Parra, X., Anguita, D.: Transition-aware human activity recognition using smartphones. Neurocomputing 171, 754–767 (2016)CrossRefGoogle Scholar
  118. 118.
    Casale, P., Pujol, O., Radeva, P.: Personalization and user verification in wearable systems using biometric walking patterns. Pers. Ubiquit. Comput. 16(5), 563–580 (2012)CrossRefGoogle Scholar
  119. 119.
    Chavarriaga, R., Sagha, H., Calatroni, A., Digumarti, S.T., Tröster, G., Millán, J.D.R., Roggen, D.: The opportunity challenge: a benchmark database for on-body sensor-based activity recognition. Pattern Recogn. Lett. 34(15), 2033–2042 (2013)Google Scholar
  120. 120.
    Ordónez, F.J., de Toledo, P., Sanchis, A.: Activity recognition using hybrid generative/discriminative models on home environments using binary sensors. Sensors 13(5), 5460–5477 (2013)CrossRefGoogle Scholar
  121. 121.
    Baños, O., Damas, M., Pomares, H., Rojas, I., Tóth, M.A., Amft, O.: A benchmark dataset to evaluate sensor displacement in activity recognition. In: Proceedings of the 2012 ACM Conference on Ubiquitous Computing, pp. 1026–1035. ACM (2012)Google Scholar
  122. 122.
    Reiss, A., Stricker, D.: Introducing a new benchmarked dataset for activity monitoring. In: 2012 16th International Symposium on Wearable Computers (ISWC), pp. 108–109. IEEE (2012)Google Scholar
  123. 123.
    Altun, K., Barshan, B., Tunçel, O.: Comparative study on classifying human activities with miniature inertial and magnetic sensors. Pattern Recogn. 43(10), 3605–3620 (2010)zbMATHCrossRefGoogle Scholar
  124. 124.
    Bacciu, D., Barsocchi, P., Chessa, S., Gallicchio, C., Micheli, A.: An experimental characterization of reservoir computing in ambient assisted living applications. Neural Comput. Appl. 24(6), 1451–1464 (2014)CrossRefGoogle Scholar
  125. 125.
    Banos, O., Garcia, R., Holgado-Terriza, J.A., Damas, M., Pomares, H., Rojas, I., Saez, A., Villalonga, C.: mhealthdroid: a novel framework for agile development of mobile health applications. In: International Workshop on Ambient Assisted Living, pp. 91–98. Springer (2014)Google Scholar
  126. 126.
    Weiss, G.M., Yoneda, K., Hayajneh, T.: Smartphone and smartwatch-based biometrics using activities of daily living. IEEE Access 7, 133190–133202 (2019)CrossRefGoogle Scholar
  127. 127.
    Schmidt, P., Reiss, A., Duerichen, R., Marberger, C., Van Laerhoven, K.: Introducing WESAD, a multimodal dataset for wearable stress and affect detection. In: Proceedings of the 2018 International Conference on Multimodal Interaction, pp. 400–408. ACM (2018)
  128.
    Özdemir, A., Barshan, B.: Detecting falls with wearable sensors using machine learning techniques. Sensors 14(6), 10691–10708 (2014)
  129.
    Shoaib, M., Scholten, H., Havinga, P.J.M., Incel, O.D.: A hierarchical lazy smoking detection algorithm using smartwatch sensors. In: 2016 IEEE 18th International Conference on e-Health Networking, Applications and Services (Healthcom), pp. 1–6. IEEE (2016)
  130.
    Shoaib, M., Bosch, S., Incel, O.D., Scholten, H., Havinga, P.J.M.: Complex human activity recognition using smartphone and wrist-worn motion sensors. Sensors 16(4), 426 (2016)
  131.
    Shoaib, M., Scholten, H., Havinga, P.J.M.: Towards physical activity recognition using smartphone sensors. In: 2013 IEEE 10th International Conference on Ubiquitous Intelligence and Computing and 10th International Conference on Autonomic and Trusted Computing (UIC/ATC), pp. 80–87. IEEE (2013)
  132.
    Shoaib, M., Bosch, S., Incel, O.D., Scholten, H., Havinga, P.J.M.: Fusion of smartphone motion sensors for physical activity recognition. Sensors 14(6), 10146–10176 (2014)
  133.
    HASC2010 corpus. http://hasc.jp. Accessed 27 Mar 2019
  134.
    Kawaguchi, N., Yang, Y., Yang, T., Ogawa, N., Iwasaki, Y., Kaji, K., Terada, T., Murao, K., Inoue, S., Kawahara, Y., et al.: HASC2011corpus: towards the common ground of human activity recognition. In: Proceedings of the 13th International Conference on Ubiquitous Computing, pp. 571–572. ACM (2011)
  135.
    Kawaguchi, N., Watanabe, H., Yang, T., Ogawa, N., Iwasaki, Y., Kaji, K., Terada, T., Murao, K., Hada, H., Inoue, S., et al.: HASC2012corpus: large scale human activity corpus and its application. In: Proceedings of the Second International Workshop of Mobile Sensing: From Smartphones and Wearables to Big Data, pp. 10–14 (2012)
  136.
    Kaji, K., Watanabe, H., Ban, R., Kawaguchi, N.: HASC-IPSC: indoor pedestrian sensing corpus with a balance of gender and age for indoor positioning and floor-plan generation researches. In: Proceedings of the 2013 ACM Conference on Pervasive and Ubiquitous Computing Adjunct Publication, pp. 605–610. ACM (2013)
  137.
    Ichino, H., Kaji, K., Sakurada, K., Hiroi, K., Kawaguchi, N.: HASC-PAC2016: large scale human pedestrian activity corpus and its baseline recognition. In: Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct, pp. 705–714. ACM (2016)
  138.
    Matsuyama, H., Hiroi, K., Kaji, K., Yonezawa, T., Kawaguchi, N.: Ballroom dance step type recognition by random forest using video and wearable sensor. In: Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers, pp. 774–780. ACM (2019)
  139.
    Chen, Y., Keogh, E., Hu, B., Begum, N., Bagnall, A., Mueen, A., Batista, G.: The UCR time series classification archive (2015). www.cs.ucr.edu/~eamonn/time_series_data
  140.
    Goldberger, A.L.: PhysioBank, PhysioToolkit, and PhysioNet: components of a new research resource for complex physiologic signals. Circulation 101(23), e215–e220 (2000)
  141.
    Kotz, D., Henderson, T.: CRAWDAD: a community resource for archiving wireless data at Dartmouth. IEEE Pervasive Comput. 4(4), 12–14 (2005)
  142.
    Bachlin, M., Plotnik, M., Roggen, D., Maidan, I., Hausdorff, J.M., Giladi, N., Troster, G.: Wearable assistant for Parkinson’s disease patients with the freezing of gait symptom. IEEE Trans. Inf. Technol. Biomed. 14(2), 436–446 (2010)
  143.
    Inoue, S., Ueda, N., Nohara, Y., Nakashima, N.: Recognizing and understanding nursing activities for a whole day with a big dataset. J. Inf. Process. 24(6), 853–866 (2016)
  144.
    Inoue, S., Lago, P., Hossain, T., Mairittha, T., Mairittha, N.: Integrating activity recognition and nursing care records: the system, deployment, and a verification study. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, vol. 3, Article 86 (2019)
  145.
    Predicting Parkinson’s disease progression with smartphone data. https://www.kaggle.com/c/3300/download/Participant%20Description.xls. Accessed 27 Mar 2019
  146.
    Tian, Y., Zhang, J., Chen, L., Geng, Y., Wang, X.: Selective ensemble based on extreme learning machine for sensor-based human activity recognition. Sensors 19(16), 3468 (2019)
  147.
    Forster, K., Roggen, D., Troster, G.: Unsupervised classifier self-calibration through repeated context occurrences: is there robustness against sensor displacement to gain? In: International Symposium on Wearable Computers, ISWC 2009, pp. 77–84. IEEE (2009)
  148.
    Bächlin, M., Förster, K., Tröster, G.: SwimMaster: a wearable assistant for swimmer. In: Proceedings of the 11th International Conference on Ubiquitous Computing, pp. 215–224. ACM (2009)
  149.
    Crowd-Sourced Fitbit Datasets (2016). Accessed 27 Mar 2019
  150.
    Takata, M., Nakamura, Y., Fujimoto, M., Arakawa, Y., Yasumoto, K.: Investigating the effect of sensor position for training type recognition in a body weight training support system. In: Proceedings of the 2018 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2018 ACM International Symposium on Wearable Computers, pp. 1–5. ACM (2018)
  151.
    Tapia, E.M., Intille, S.S., Lopez, L., Larson, K.: The design of a portable kit of wireless sensors for naturalistic data collection. In: International Conference on Pervasive Computing, pp. 117–134. Springer (2006)
  152.
    De la Torre, F., Hodgins, J., Bargteil, A., Martin, X., Macey, J., Collado, A., Beltran, P.: Guide to the Carnegie Mellon University multimodal activity (CMU-MMAC) database. Robotics Institute, p. 135 (2008)
  153.
    Chen, L., Nugent, C.D., Biswas, J., Hoey, J.: Activity Recognition in Pervasive Intelligent Environments, vol. 4. Springer (2011)
  154.
    Alemdar, H., Ertan, H., Incel, O.D., Ersoy, C.: ARAS human activity datasets in multiple homes with multiple residents. In: Proceedings of the 7th International Conference on Pervasive Computing Technologies for Healthcare, pp. 232–235. ICST (Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering) (2013)
  155.
    Gani, Md.O., Saha, A.K., Ahsan, G.M.T., Ahamed, S.I., Smith, R.O.: A novel framework to recognize complex human activity. In: 2017 IEEE 41st Annual Computer Software and Applications Conference (COMPSAC), pp. 948–956. IEEE (2017)
  156.
    Cook, D.J.: Learning setting-generalized activity models for smart spaces. IEEE Intell. Syst. 27(1), 32–38 (2012)
  157.
    Tapia, E.M., Intille, S.S., Larson, K.: Activity recognition in the home using simple and ubiquitous sensors. In: International Conference on Pervasive Computing, pp. 158–175. Springer (2004)
  158.
    Huynh, T., Fritz, M., Schiele, B.: Discovery of activity patterns using topic models. In: Proceedings of the 10th International Conference on Ubiquitous Computing, pp. 10–19. ACM (2008)
  159.
    Activity classification. https://www.kaggle.com. Accessed 28 Mar 2019
  160.
    Eagle, N., Pentland, A.S.: Reality mining: sensing complex social systems. Pers. Ubiquit. Comput. 10(4), 255–268 (2006)
  161.
    Laurila, J.K., Gatica-Perez, D., Aad, I., Bornet, O., Do, T.-M.-T., Dousse, O., Eberle, J., Miettinen, M., et al.: The mobile data challenge: big data for mobile computing research. In: Pervasive Computing, number EPFL-CONF-192489 (2012)
  162.
    Wagner, D.T., Rice, A., Beresford, A.R.: Device analyzer: large-scale mobile data collection. ACM SIGMETRICS Perform. Eval. Rev. 41(4), 53–56 (2014)
  163.
    Rawassizadeh, R., Tomitsch, M., Nourizadeh, M., Momeni, E., Peery, A., Ulanova, L., Pazzani, M.: Energy-efficient integration of continuous context sensing and prediction into smartwatches. Sensors 15(9), 22616–22645 (2015)
  164.
    Gjoreski, H., Ciliberto, M., Wang, L., Morales, F.J.O., Mekki, S., Valentin, S., Roggen, D.: The University of Sussex-Huawei locomotion and transportation dataset for multimodal analytics with mobile devices. IEEE Access (2018)
  165.
    Yang, J.: Toward physical activity diary: motion recognition using simple acceleration features with mobile phones. In: Proceedings of the 1st International Workshop on Interactive Multimedia for Consumer Electronics, pp. 1–10. ACM (2009)
  166.
    Zheng, Y., Xie, X., Ma, W.-Y.: GeoLife: a collaborative social networking service among user, location and trajectory. IEEE Data Eng. Bull. 33(2), 32–39 (2010)
  167.
    Wang, S., Chen, C., Ma, J.: Accelerometer based transportation mode recognition on mobile phones. In: 2010 Asia-Pacific Conference on Wearable Computing Systems (APWCS), pp. 44–46. IEEE (2010)
  168.
    Reddy, S., Mun, M., Burke, J., Estrin, D., Hansen, M., Srivastava, M.: Using mobile phones to determine transportation modes. ACM Trans. Sens. Netw. (TOSN) 6(2), 13 (2010)
  169.
    Siirtola, P., Röning, J.: Recognizing human activities user-independently on smartphones based on accelerometer data. IJIMAI 1(5), 38–45 (2012)
  170.
    Hemminki, S., Nurmi, P., Tarkoma, S.: Accelerometer-based transportation mode detection on smartphones. In: Proceedings of the 11th ACM Conference on Embedded Networked Sensor Systems, p. 13. ACM (2013)
  171.
    Zhang, Z., Poslad, S.: A new post correction algorithm (POCOA) for improved transportation mode recognition. In: 2013 IEEE International Conference on Systems, Man, and Cybernetics (SMC), pp. 1512–1518. IEEE (2013)
  172.
    Xia, H., Qiao, Y., Jian, J., Chang, Y.: Using smart phone sensors to detect transportation modes. Sensors 14(11), 20843–20865 (2014)
  173.
    Widhalm, P., Nitsche, P., Brändle, N.: Transport mode detection with realistic smartphone sensor data. In: 2012 21st International Conference on Pattern Recognition (ICPR), pp. 573–576. IEEE (2012)
  174.
    Jahangiri, A., Rakha, H.A.: Applying machine learning techniques to transportation mode recognition using mobile phone sensor data. IEEE Trans. Intell. Transp. Syst. 16(5), 2406–2417 (2015)
  175.
    Xing, S., Caceres, H., Tong, H., He, Q.: Online travel mode identification using smartphones with battery saving considerations. IEEE Trans. Intell. Transp. Syst. 17(10), 2921–2934 (2016)
  176.
    Yu, M.-C., Yu, T., Wang, S.-C., Lin, C.-J., Chang, E.Y.: Big data small footprint: the design of a low-power classifier for detecting transportation modes. Proc. VLDB Endow. 7(13), 1429–1440 (2014)
  177.
    Gjoreski, H., Ciliberto, M., Wang, L., Morales, F.J.O., Mekki, S., Valentin, S., Roggen, D.: The University of Sussex-Huawei locomotion and transportation dataset for multimodal analytics with mobile devices. IEEE Access 6, 42592–42604 (2018)
  178.
    Carpineti, C., Lomonaco, V., Bedogni, L., Di Felice, M., Bononi, L.: Custom dual transportation mode detection by smartphone devices exploiting sensor diversity. In: 2018 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops), pp. 367–372. IEEE (2018)
  179.
    Islam, Z.Z., Tazwar, S.M., Islam, Md.Z., Serikawa, S., Ahad, M.A.R.: Automatic fall detection system of unsupervised elderly people using smartphone. In: Annual Conference on Artificial Intelligence. IEEE (2017)
  180.
    Frank, K., Nadales, M.J.V., Robertson, P., Pfeifer, T.: Bayesian recognition of motion related activities with inertial sensors. In: Proceedings of the 12th ACM International Conference Adjunct Papers on Ubiquitous Computing-Adjunct, pp. 445–446. ACM (2010)
  181.
    Vavoulas, G., Pediaditis, M., Spanakis, E.G., Tsiknakis, M.: The MobiFall dataset: an initial evaluation of fall detection algorithms using smartphones. In: 2013 IEEE 13th International Conference on Bioinformatics and Bioengineering (BIBE), pp. 1–4. IEEE (2013)
  182.
    Kwolek, B., Kepski, M.: Human fall detection on embedded platform using depth maps and wireless accelerometer. Comput. Methods Programs Biomed. 117(3), 489–501 (2014)
  183.
    Gasparrini, S., Cippitelli, E., Spinsante, S., Gambi, E.: A depth-based fall detection system using a Kinect® sensor. Sensors 14(2), 2756–2775 (2014)
  184.
    Ojetola, O., Gaura, E., Brusey, J.: Data set for fall events and daily activities from inertial sensors. In: Proceedings of the 6th ACM Multimedia Systems Conference, pp. 243–248. ACM (2015)
  185.
    Vilarinho, T., Farshchian, B., Bajer, D.G., Dahl, O.H., Egge, I., Hegdal, S.S., Lønes, A., Slettevold, J.N., Weggersen, S.M.: A combined smartphone and smartwatch fall detection system. In: 2015 IEEE International Conference on Computer and Information Technology; Ubiquitous Computing and Communications; Dependable, Autonomic and Secure Computing; Pervasive Intelligence and Computing (CIT/IUCC/DASC/PICOM), pp. 1443–1448. IEEE (2015)
  186.
    Micucci, D., Mobilio, M., Napoletano, P.: UniMiB SHAR: a dataset for human activity recognition using acceleration data from smartphones. Appl. Sci. 7(10), 1101 (2017)
  187.
    Casilari, E., Santoyo-Ramón, J.A., Cano-García, J.M.: Analysis of a smartphone-based architecture with multiple mobility sensors for fall detection. PLoS ONE 11(12), e0168069 (2016)
  188.
    Vavoulas, G., Chatzaki, C., Malliotakis, T., Pediaditis, M., Tsiknakis, M.: The MobiAct dataset: recognition of activities of daily living using smartphones. In: ICT4AgeingWell, pp. 143–151 (2016)
  189.
    Sucerquia, A., López, J.D., Vargas-Bonilla, J.F.: SisFall: a fall and movement dataset. Sensors 17(1), 198 (2017)
  190.
    Zhang, M., Sawchuk, A.A.: USC-HAD: a daily activity dataset for ubiquitous activity recognition using wearable sensors. In: Proceedings of the 2012 ACM Conference on Ubiquitous Computing, pp. 1036–1043. ACM (2012)
  191.
    Yang, A.Y., Jafari, R., Sastry, S.S., Bajcsy, R.: Distributed recognition of human actions using wearable motion sensor networks. J. Ambient Intell. Smart Environ. 1(2), 103–115 (2009)
  192.
    Stiefmeier, T., Roggen, D., Troster, G.: Fusion of string-matched templates for continuous activity recognition. In: 2007 11th IEEE International Symposium on Wearable Computers, pp. 41–44. IEEE (2007)
  193.
    Wirz, M., Roggen, D., Troster, G.: Decentralized detection of group formations from wearable acceleration sensors. In: International Conference on Computational Science and Engineering, CSE 2009, vol. 4, pp. 952–959. IEEE (2009)
  194.
    Saha, S.S., Rahman, S., Rasna, M.J., Mahfuzul Islam, A.K.M., Ahad, M.A.R.: DU-MD: an open-source human action dataset for ubiquitous wearable sensors. In: Joint 7th International Conference on Informatics, Electronics & Vision; 2nd International Conference on Imaging, Vision & Pattern Recognition (2018)
  195.
    Chereshnev, R., Kertész-Farkas, A.: HuGaDB: human gait database for activity recognition from wearable inertial sensor networks. In: International Conference on Analysis of Images, Social Networks and Texts, pp. 131–141. Springer (2017)
  196.
    Ngo, T.T., Makihara, Y., Nagahara, H., Mukaigawa, Y., Yagi, Y.: The largest inertial sensor-based gait database and performance evaluation of gait-based personal authentication. Pattern Recogn. 47(1), 228–237 (2014)
  197.
    Ahad, M.A.R., Ngo, T.T., Antar, A.D., Ahmed, M., Hossain, T., Muramatsu, D., Makihara, Y., Inoue, S., Yagi, Y.: Wearable sensor-based gait analysis for age and gender estimation (2020)
  198.
    Chen, C., Jafari, R., Kehtarnavaz, N.: UTD-MHAD: a multimodal dataset for human action recognition utilizing a depth camera and a wearable inertial sensor. In: 2015 IEEE International Conference on Image Processing (ICIP), pp. 168–172. IEEE (2015)
  199.
    Ngo, T.T., Ahad, M.A.R., Antar, A.D., Ahmed, M., Muramatsu, D., Makihara, Y., Yagi, Y., Inoue, S., Hossain, T., Hattori, Y.: OU-ISIR wearable sensor-based gait challenge: age and gender. In: Proceedings of the 12th IAPR International Conference on Biometrics, ICB (2019)
  200.
    Kwapisz, J.R., Weiss, G.M., Moore, S.A.: Activity recognition using cell phone accelerometers. ACM SIGKDD Explor. Newsl. 12(2), 74–82 (2011)
  201.
    Faye, S., Louveton, N., Jafarnejad, S., Kryvchenko, R., Engel, T.: An open dataset for human activity analysis using smart devices (2017)
  202.
    Nickerson, R.S.: Binary-classification reaction time: a review of some studies of human information-processing capabilities. Psychonomic Monograph Supplements (1972)
  203.
    Unler, A., Murat, A.: A discrete particle swarm optimization method for feature selection in binary classification problems. Eur. J. Oper. Res. 206(3), 528–539 (2010)
  204.
    Zhu, W., Lan, C., Xing, J., Zeng, W., Li, Y., Shen, L., Xie, X.: Co-occurrence feature learning for skeleton based action recognition using regularized deep LSTM networks. In: Thirtieth AAAI Conference on Artificial Intelligence (2016)
  205.
    Li, S., Jiang, T., Huang, T., Tian, Y.: Global co-occurrence feature learning and active coordinate system conversion for skeleton-based action recognition. In: The IEEE Winter Conference on Applications of Computer Vision, pp. 586–594 (2020)
  206.
    Zhao, R., Wang, K., Su, H., Ji, Q.: Bayesian graph convolution LSTM for skeleton based action recognition. In: Proceedings of the IEEE International Conference on Computer Vision, pp. 6882–6892 (2019)
  207.
    Si, C., Chen, W., Wang, W., Wang, L., Tan, T.: An attention enhanced graph convolutional LSTM network for skeleton-based action recognition. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 1227–1236 (2019)
  208.
    Liu, J., Shahroudy, A., Xu, D., Kot, A.C., Wang, G.: Skeleton-based action recognition using spatio-temporal LSTM network with trust gates. IEEE Trans. Pattern Anal. Mach. Intell. 40(12), 3007–3021 (2017)
  209.
    Si, C., Jing, Y., Wang, W., Wang, L., Tan, T.: Skeleton-based action recognition with spatial reasoning and temporal stack learning. In: Proceedings of the European Conference on Computer Vision (ECCV), pp. 103–118 (2018)
  210.
    Huang, J., Xiang, X., Gong, X., Zhang, B., et al.: Long-short graph memory network for skeleton-based action recognition. In: The IEEE Winter Conference on Applications of Computer Vision, pp. 645–652 (2020)
  211.
    Ke, Q., Bennamoun, M., An, S., Sohel, F., Boussaid, F.: Learning clip representations for skeleton-based 3D action recognition. IEEE Trans. Image Process. 27(6), 2842–2855 (2018)
  212.
    Luvizon, D., Picard, D., Tabia, H.: Multi-task deep learning for real-time 3D human pose estimation and action recognition. IEEE Trans. Pattern Anal. Mach. Intell. (2020)
  213.
    Morais, R., Le, V., Tran, T., Saha, B., Mansour, M., Venkatesh, S.: Learning regularity in skeleton trajectories for anomaly detection in videos. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 11996–12004 (2019)
  214.
    Zhang, P., Lan, C., Xing, J., Zeng, W., Xue, J., Zheng, N.: View adaptive recurrent neural networks for high performance human action recognition from skeleton data. In: Proceedings of the IEEE International Conference on Computer Vision, pp. 2117–2126 (2017)
  215.
    Lee, I., Kim, D., Kang, S., Lee, S.: Ensemble deep learning for skeleton-based action recognition using temporal sliding LSTM networks. In: Proceedings of the IEEE International Conference on Computer Vision, pp. 1012–1020 (2017)
  216.
    Tang, Y., Tian, Y., Lu, J., Li, P., Zhou, J.: Deep progressive reinforcement learning for skeleton-based action recognition. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 5323–5332 (2018)
  217.
    Shi, L., Zhang, Y., Cheng, J., Lu, H.: Skeleton-based action recognition with directed graph neural networks. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 7912–7921 (2019)
  218.
    Cho, S., Maqbool, M., Liu, F., Foroosh, H.: Self-attention network for skeleton-based human action recognition. In: The IEEE Winter Conference on Applications of Computer Vision, pp. 635–644 (2020)
  219.
    Baek, S., Kim, K.I., Kim, T.-K.: Augmented skeleton space transfer for depth-based hand pose estimation. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 8330–8339 (2018)
  220.
    Cai, J., Jiang, N., Han, X., Jia, K., Lu, J.: JOLO-GCN: mining joint-centered light-weight information for skeleton-based action recognition. In: Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, pp. 2735–2744 (2021)
  221.
    Ahad, M.A.R., Ngo, T.T., Antar, A.D., Ahmed, M., Hossain, T., Muramatsu, D., Makihara, Y., Inoue, S., Yagi, Y.: Wearable sensor-based gait analysis for age and gender estimation. Sensors 20(8) (2020)
  222.
    Wang, L., Gjoreski, H., Ciliberto, M., Lago, P., Murao, K., Okita, T., Roggen, D.: Summary of the Sussex-Huawei locomotion-transportation recognition challenge 2020. In: Tentori, M., Weibel, N., Van Laerhoven, K., Abowd, G.D., Salim, F.D. (eds.) UbiComp/ISWC 2020: 2020 ACM International Joint Conference on Pervasive and Ubiquitous Computing and 2020 ACM International Symposium on Wearable Computers, Virtual Event, Mexico, 12–17 September 2020, pp. 351–358. ACM (2020)
  223.
    Alia, S.S., Lago, P., Takeda, S., Adachi, K., Benaissa, B., Ahad, M.A.R., Inoue, S.: Summary of the cooking activity recognition challenge. In: Human Activity Recognition Challenge, pp. 1–13. Springer (2021)
  224.
    Alia, S.S., Lago, P., Adachi, K., Hossain, T., Goto, H., Okita, T., Inoue, S.: Summary of the 2nd nurse care activity recognition challenge using lab and field data. In: Adjunct Proceedings of the 2020 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2020 ACM International Symposium on Wearable Computers, pp. 378–383 (2020)
  225.
    Komukai, K., Ohmura, R.: Optimizing of the number and placements of wearable IMUs for automatic rehabilitation recording. In: Kawaguchi, N., Nishio, N., Roggen, D., Inoue, S., Pirttikangas, S., van Laerhoven, K. (eds.) Human Activity Sensing. Springer Series in Adaptive Environments. Springer, Cham (2019)
  226.
    Scholl, P.M., van Laerhoven, K.: Identifying sensors via statistical analysis of body-worn inertial sensor data. In: Kawaguchi, N., Nishio, N., Roggen, D., Inoue, S., Pirttikangas, S., van Laerhoven, K. (eds.) Human Activity Sensing. Springer Series in Adaptive Environments. Springer, Cham (2019)

Copyright information

© Springer Nature Switzerland AG 2021

Authors and Affiliations

  1. Department of Electrical and Electronic Engineering, University of Dhaka, Dhaka, Bangladesh
  2. Department of CSE, University of Michigan, Ann Arbor, USA
  3. Department of Intelligent Media, Osaka University, Suita, Japan
  4. Department of Information Systems, University of Maryland Baltimore County, Baltimore, USA