aHead: Considering the Head Position in a Multi-sensory Setup of Wearables to Recognize Everyday Activities with Intelligent Sensor Fusions

  • Marian Haescher
  • John Trimpop
  • Denys J. C. Matthies
  • Gerald Bieber
  • Bodo Urban
  • Thomas Kirste
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9170)

Abstract

In this paper we examine the feasibility of Human Activity Recognition (HAR) based on head-mounted sensors, both as stand-alone sensors and as part of a wearable multi-sensor network. To prove the feasibility of such a setting, an interactive online HAR system has been implemented that enables multi-sensory activity recognition using a hierarchical sensor fusion. Our system incorporates three sensor positions distributed over the body: head (smart glasses), wrist (smartwatch), and hip (smartphone). We are able to reliably distinguish seven daily activities: resting, being active, walking, running, jumping, cycling, and office work. The results of our field study with 14 participants clearly indicate that the head position is suitable for HAR. Moreover, we demonstrate an intelligent multi-sensory fusion concept that increases the recognition performance to up to 86.13% (recall). Furthermore, we found the head to exhibit very distinctive movement patterns for activities of daily living.
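The abstract does not detail the hierarchical fusion scheme itself, so as a minimal illustration of the general idea, the sketch below combines per-sensor activity predictions from the three body positions (head, wrist, hip) by a weighted majority vote. The function name, the weighting scheme, and the label set ordering are assumptions for illustration, not the authors' implementation.

```python
from collections import Counter

# The seven daily activities distinguished in the paper.
ACTIVITIES = ["resting", "being active", "walking", "running",
              "jumping", "cycling", "office work"]

def fuse_predictions(sensor_predictions, weights=None):
    """Combine per-sensor activity labels by (weighted) majority vote.

    sensor_predictions: dict mapping a sensor position (e.g. 'head',
        'wrist', 'hip') to that sensor's predicted activity label.
    weights: optional dict of per-sensor vote weights; defaults to 1.0,
        i.e. a plain majority vote.
    """
    votes = Counter()
    for sensor, label in sensor_predictions.items():
        votes[label] += (weights or {}).get(sensor, 1.0)
    # Return the label with the highest accumulated vote weight.
    label, _ = votes.most_common(1)[0]
    return label

# Example: wrist and hip agree on 'walking'; the head sensor dissents.
fused = fuse_predictions({"head": "being active",
                          "wrist": "walking",
                          "hip": "walking"})
```

In a hierarchical variant, such a vote could also be applied in stages, e.g. first fusing limb-mounted sensors and then reconciling the result with the head sensor.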

Keywords

Human activity recognition (HAR) · Human-computer interaction · Pattern recognition · Multi-sensory · Wearable computing · Mobile assistance

Acknowledgements

This research has been supported by the German Federal State of Mecklenburg-Western Pomerania and the European Social Fund; grant ESF/IV-BM-B35-0006/12.


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Marian Haescher (1)
  • John Trimpop (1)
  • Denys J. C. Matthies (1)
  • Gerald Bieber (1)
  • Bodo Urban (1)
  • Thomas Kirste (2)
  1. Fraunhofer IGD, Rostock, Germany
  2. University of Rostock, Rostock, Germany