Data Fusion for Human Activity Recognition Based on RF Sensing and IMU Sensor

  • Conference paper
Body Area Networks. Smart IoT and Big Data for Intelligent Health Management (BODYNETS 2021)

Abstract

This paper proposes a new data fusion method that uses a designed construction matrix to fuse sensor and USRP data for human activity recognition. Inertial Measurement Unit (IMU) sensors and Universal Software-defined Radio Peripherals (USRPs) are used to collect human activity signals separately. To avoid incompatibility between the different collection devices, such as the inconsistent time axes caused by differing sampling frequencies, Principal Component Analysis is applied to the fused data to reduce its dimensionality along the time axis, extracting a time-independent \(5 \times 5\) feature matrix that represents each activity. The work explores data fusion across multiple devices while maintaining recognition accuracy, and the technique can be extended to fuse signals from other types of hardware.
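The abstract describes the pipeline only at a high level. The short Python sketch below illustrates one plausible reading of it and is an assumption rather than the authors' implementation: the function name, the array shapes, the resampling step, and the covariance summary are all hypothetical. It aligns the two streams on a common time axis, stacks them into a construction matrix, reduces the fused channels to five principal components, and summarises them as a time-independent \(5 \times 5\) feature matrix.

```python
# Minimal sketch of the described IMU + USRP fusion pipeline.
# All names, shapes, and the covariance summary are assumptions, not the paper's code.
import numpy as np
from scipy.signal import resample
from sklearn.decomposition import PCA


def fuse_and_extract(imu: np.ndarray, usrp: np.ndarray, n_samples: int = 500) -> np.ndarray:
    """Fuse an IMU stream and a USRP stream into a 5x5 time-independent feature matrix.

    imu  : array of shape (T_imu, C_imu), sampled at the IMU rate.
    usrp : array of shape (T_usrp, C_usrp), sampled at the USRP rate.
    Requires C_imu + C_usrp >= 5.
    """
    # Resample both streams to a common length so their time axes line up,
    # sidestepping the different sampling frequencies of the two devices.
    imu_rs = resample(imu, n_samples, axis=0)
    usrp_rs = resample(usrp, n_samples, axis=0)

    # "Construction matrix": the aligned channels stacked column-wise.
    construction = np.hstack([imu_rs, usrp_rs])      # (n_samples, C_imu + C_usrp)

    # PCA reduces the fused channels to 5 principal components.
    components = PCA(n_components=5).fit_transform(construction)  # (n_samples, 5)

    # Summarising the components by their covariance over time removes the
    # time axis entirely and yields the 5x5 feature matrix.
    return np.cov(components, rowvar=False)          # (5, 5)


# Example with synthetic data: a 6-axis IMU at 100 Hz and 2-channel USRP data at 1 kHz.
imu = np.random.randn(300, 6)    # 3 s of IMU data
usrp = np.random.randn(3000, 2)  # 3 s of USRP data
print(fuse_and_extract(imu, usrp).shape)  # (5, 5)
```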


Acknowledgements

Zheqi Yu is funded by a joint industrial scholarship (Ref: 308987) between the University of Glasgow and Transreport London Ltd. The authors would also like to thank Francesco Fioranelli and Haobo Li for providing the human activities dataset.

Author information

Corresponding author

Correspondence to Zheqi Yu.

Copyright information

© 2022 ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering

About this paper

Cite this paper

Yu, Z. et al. (2022). Data Fusion for Human Activity Recognition Based on RF Sensing and IMU Sensor. In: Ur Rehman, M., Zoha, A. (eds) Body Area Networks. Smart IoT and Big Data for Intelligent Health Management. BODYNETS 2021. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 420. Springer, Cham. https://doi.org/10.1007/978-3-030-95593-9_1

  • DOI: https://doi.org/10.1007/978-3-030-95593-9_1

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-95592-2

  • Online ISBN: 978-3-030-95593-9
