
Improving Wearable Activity Recognition via Fusion of Multiple Equally-Sized Data Subwindows

  • Oresti Banos
  • Juan-Manuel Galvez
  • Miguel Damas
  • Alberto Guillen
  • Luis-Javier Herrera
  • Hector Pomares
  • Ignacio Rojas
  • Claudia Villalonga
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11506)

Abstract

The automatic recognition of physical activities typically involves a series of signal processing and machine learning steps that transform raw sensor data into activity labels. One crucial step is the segmentation, or windowing, of the sensor data stream, as it has a clear impact on the eventual accuracy of the activity recogniser. While prior studies have proposed specific window sizes that generally achieve good recognition results, in this work we explore the potential of fusing multiple equally-sized subwindows to improve recognition capabilities. We tested our approach with eight different subwindow sizes on a widely used activity recognition dataset. The results show that recognition performance can be increased by up to 15% when fusing equally-sized subwindows compared to using a classical single window.
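The idea can be illustrated with a minimal sketch (not taken from the paper): a data window is split into equally-sized subwindows, each subwindow is classified independently, and the per-subwindow decisions are fused into a single activity label. The feature set (per-channel mean and standard deviation), the 1-NN classifier, the majority-vote fusion rule, and all sizes and synthetic data below are illustrative assumptions, not the authors' exact configuration.

```python
# Sketch of equally-sized subwindow fusion for activity recognition.
# Assumptions (hypothetical, not from the paper): simple statistical features,
# a 1-NN classifier, and majority voting as the fusion rule.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def extract_features(window):
    """Per-channel mean and standard deviation (illustrative feature set)."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def split_subwindows(window, n_sub):
    """Split a window (samples x channels) into n_sub equally-sized subwindows."""
    usable = (window.shape[0] // n_sub) * n_sub
    return np.split(window[:usable], n_sub)

def predict_fused(clf, window, n_sub):
    """Classify each subwindow independently and fuse the decisions by majority vote."""
    votes = [clf.predict(extract_features(sw).reshape(1, -1))[0]
             for sw in split_subwindows(window, n_sub)]
    values, counts = np.unique(votes, return_counts=True)
    return values[np.argmax(counts)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_sub, sub_len, n_channels = 4, 50, 3            # e.g. 4 subwindows of 1 s at 50 Hz
    # Synthetic training data: one feature vector per labelled subwindow.
    X_train = rng.normal(size=(200, 2 * n_channels))
    y_train = rng.integers(0, 5, size=200)           # 5 hypothetical activity labels
    clf = KNeighborsClassifier(n_neighbors=1).fit(X_train, y_train)
    test_window = rng.normal(size=(n_sub * sub_len, n_channels))
    print("fused prediction:", predict_fused(clf, test_window, n_sub))
```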

Keywords

Activity recognition · Segmentation · Data window · Data fusion · Wearable sensors

Acknowledgments

This work was partially supported by the Spanish Ministry of Economy and Competitiveness (MINECO) Projects TIN2015-71873-R and TIN2015-67020-P, together with the European Regional Development Fund (FEDER). This work was also partially funded by the “User Behaviour Sensing, Modelling and Analysis” contract OTRI-UGR-4071.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Research Center for Information and Communication Technologies, University of Granada, Granada, Spain
  2. School of Engineering and Technology, Universidad Internacional de La Rioja, Logroño, Spain
