Improving Wearable Activity Recognition via Fusion of Multiple Equally-Sized Data Subwindows
The automatic recognition of physical activities typically involves a series of signal processing and machine learning steps that transform raw sensor data into activity labels. One crucial step concerns the segmentation, or windowing, of the sensor data stream, as it directly affects the eventual accuracy of the activity recogniser. While prior studies have proposed specific window sizes that generally achieve good recognition results, in this work we explore the potential of fusing multiple equally-sized subwindows to improve recognition capabilities. We tested our approach with eight different subwindow sizes on a widely-used activity recognition dataset. The results show that recognition performance can be improved by up to 15% when fusing equally-sized subwindows compared to using a classical single window.
Keywords: Activity recognition · Segmentation · Data window · Data fusion · Wearable sensors
This work was partially supported by the Spanish Ministry of Economy and Competitiveness (MINECO) Projects TIN2015-71873-R and TIN2015-67020-P together with the European Fund for Regional Development (FEDER). This work was also partially funded by the “User Behaviour Sensing, Modelling and Analysis” contract OTRI-UGR-4071.
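The subwindow-fusion idea described in the abstract can be sketched as follows: a data window is split into equally-sized subwindows, each subwindow is classified independently, and the per-subwindow labels are fused into a single decision. This is a minimal illustrative sketch; the `dummy_classifier` and the majority-voting fusion rule are assumptions for demonstration, not the paper's actual classifier or fusion scheme.

```python
import numpy as np
from collections import Counter

def fuse_subwindow_predictions(window, n_subwindows, classify):
    """Split a sensor-data window into equally-sized subwindows,
    classify each one independently, and fuse the resulting labels.
    Majority voting is used here as an illustrative fusion rule;
    the paper's actual fusion scheme may differ."""
    subwindows = np.array_split(window, n_subwindows)
    labels = [classify(sw) for sw in subwindows]
    # Return the most frequent label across subwindows.
    return Counter(labels).most_common(1)[0][0]

# Hypothetical toy classifier: labels a subwindow by the sign of its
# mean acceleration (purely illustrative, not from the paper).
def dummy_classifier(subwindow):
    return "active" if subwindow.mean() > 0 else "idle"

# One window of 8 accelerometer samples, split into 4 subwindows.
window = np.array([0.9, 1.1, -0.2, 0.8, 1.0, -0.1, 0.7, 1.2])
print(fuse_subwindow_predictions(window, 4, dummy_classifier))
```

With equally-sized subwindows, each segment contributes one vote, so a single noisy segment is less likely to flip the window-level label than with a single classical window.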