Multiwindow Fusion for Wearable Activity Recognition
The recognition of human activity has been extensively investigated over the last decades. Typically, wearable sensors are used to register body motion signals, which are analyzed through a sequence of signal processing and machine learning steps to recognize the activity performed by the user. One of the most important steps is signal segmentation, which is mainly performed through windowing approaches. In fact, it has been shown that the choice of window size directly conditions the performance of the recognition system. Thus, instead of being restricted to a single window configuration, this work proposes the use of multiple recognition systems operating on multiple window sizes. The suggested model employs a weighted decision fusion mechanism to fairly leverage the potential of each recognition system according to the target activity set. This novel technique is benchmarked on a well-known activity recognition dataset. The results show a significant performance improvement over standard systems operating on a single window size.
Keywords: Activity recognition · Segmentation · Windowing · Wearable sensors · Ensemble methods · Data fusion
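The fusion scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the window sizes, class scores, and weights below are invented placeholders, and the weighting rule (e.g., proportional to each system's validation accuracy on the target activity set) is an assumption.

```python
import numpy as np

# Hypothetical class-score vectors produced by three recognition systems,
# each operating on a different window size (values are illustrative only).
scores = {
    1.0: np.array([0.2, 0.5, 0.3]),  # scores from the 1 s window system
    2.0: np.array([0.1, 0.7, 0.2]),  # scores from the 2 s window system
    4.0: np.array([0.4, 0.4, 0.2]),  # scores from the 4 s window system
}

# Assumed per-system weights, e.g., derived from validation performance.
weights = {1.0: 0.8, 2.0: 0.9, 4.0: 0.6}

def fuse(scores, weights):
    """Weighted sum of class-score vectors; returns the winning class index."""
    total = sum(weights[w] * s for w, s in scores.items())
    return int(np.argmax(total))

print(fuse(scores, weights))  # index of the activity with the highest fused score
```

With these placeholder values, the second class wins because two of the three systems favor it and those systems carry higher weights, which is the intended behavior of a weighted decision fusion stage.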