Multimodal Classification of Activities of Daily Living Inside Smart Homes

  • Vit Libal
  • Bhuvana Ramabhadran
  • Nadia Mana
  • Fabio Pianesi
  • Paul Chippendale
  • Oswald Lanz
  • Gerasimos Potamianos
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5518)

Abstract

Smart homes for the aging population have recently started attracting the attention of the research community. One of the problems of interest is that of monitoring the activities of daily living (ADLs) of the elderly, aiming at their protection and well-being. In this work, we present our initial efforts to automatically recognize ADLs using multimodal input from audio-visual sensors. For this purpose, and as part of Integrated Project Netcarity, far-field microphones and cameras have been installed inside an apartment and used to collect a corpus of ADLs, acted by multiple subjects. The resulting data streams are processed to generate perception-based acoustic features, as well as human location coordinates that are employed as visual features. The extracted features are then presented to Gaussian mixture models for their classification into a set of predefined ADLs. Our experimental results show that both acoustic and visual features are useful in ADL classification, but performance of the latter deteriorates when subject tracking becomes inaccurate. Furthermore, joint audio-visual classification by simple concatenative feature fusion significantly outperforms both unimodal classifiers.
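The scheme described in the abstract, training one Gaussian mixture model per activity class on concatenated audio-visual feature vectors and classifying by maximum likelihood, can be sketched as follows. The activity labels, feature dimensionalities, synthetic data, and use of scikit-learn are illustrative assumptions, not details taken from the paper.

```python
# Sketch of per-class GMM classification with concatenative
# audio-visual feature fusion. All names and dimensions below
# are hypothetical stand-ins for the paper's actual features.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
ACTIVITIES = ["eating", "cleaning", "phone_call"]  # hypothetical ADL labels
N_AUDIO, N_VISUAL = 13, 2  # e.g. acoustic features + (x, y) location

def fuse(audio, visual):
    """Concatenative feature fusion: stack audio and visual vectors."""
    return np.hstack([audio, visual])

# Train one GMM per activity on its fused training vectors
# (synthetic data with class-dependent means stands in for real features).
models = {}
for k, act in enumerate(ACTIVITIES):
    audio = rng.normal(loc=k, size=(200, N_AUDIO))
    visual = rng.normal(loc=k, size=(200, N_VISUAL))
    gmm = GaussianMixture(n_components=4, covariance_type="diag",
                          random_state=0)
    gmm.fit(fuse(audio, visual))
    models[act] = gmm

def classify(audio, visual):
    """Pick the activity whose GMM gives the highest total log-likelihood
    over the sequence of fused feature vectors."""
    x = fuse(audio, visual)
    scores = {act: m.score_samples(x).sum() for act, m in models.items()}
    return max(scores, key=scores.get)

# A test sequence drawn around the second class's mean.
test_a = rng.normal(loc=1, size=(50, N_AUDIO))
test_v = rng.normal(loc=1, size=(50, N_VISUAL))
print(classify(test_a, test_v))
```

Unimodal baselines follow the same pattern with only the audio or only the visual block; the fusion step above is the "simple concatenative feature fusion" the abstract refers to.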

Keywords

Visual Feature, Gaussian Mixture Model, Smart Home, Visual Tracking, Microphone Array

References

  1. Netcarity – Ambient Technology to Support Older People at Home, http://www.netcarity.org
  2. Temko, A., Malkin, R., Zieger, C., Macho, D., Nadeu, C., Omologo, M.: CLEAR evaluation of acoustic event detection and classification systems. In: Stiefelhagen, R., Garofolo, J. (eds.) CLEAR 2006. LNCS, vol. 4122, pp. 311–322. Springer, Heidelberg (2007)
  3. Stiefelhagen, R., Bernardin, K., Bowers, R., Travis Rose, R., Michel, M., Garofolo, J.: The CLEAR 2007 evaluation. In: Stiefelhagen, R., Bowers, R., Fiscus, J. (eds.) RT 2007 and CLEAR 2007. LNCS, vol. 4625, pp. 3–34. Springer, Heidelberg (2008)
  4. Grassi, M., Lombardi, A., Rescio, G., Malcovati, P., Leone, A., Diraco, G., Distante, C., Siciliano, P., Malfatti, M., Gonzo, L., Libal, V., Huang, J., Potamianos, G.: A hardware-software framework for high-reliability people fall detection. In: Proc. IEEE Conf. on Sensors, Lecce, Italy, pp. 1328–1331 (2008)
  5. Fleury, A., Vacher, M., Glasson, H., Serignat, J.-F., Noury, N.: Data fusion in health smart home: Preliminary individual evaluation of two families of sensors. In: Proc. Int. Conf. of the Int. Soc. for Gerontechnology, Pisa, Italy (2008)
  6. Huang, J., Zhuang, X., Libal, V., Potamianos, G.: Long-time span acoustic activity analysis from far-field sensors in smart homes. In: Proc. Int. Conf. Acoustics, Speech, and Signal Process. (ICASSP), Taipei, Taiwan (2009)
  7. Wojek, C., Nickel, K., Stiefelhagen, R.: Activity recognition and room level tracking in an office environment. In: Proc. IEEE Int. Conf. on Multisensor Fusion and Integration for Intelligent Systems (MFI), Heidelberg, Germany (2006)
  8. Cappelletti, A., Lepri, B., Mana, N., Pianesi, F., Zancanaro, M.: A multimodal data collection of daily activities in a real instrumented apartment. In: Proc. Works. on Multimodal Corpora: From Models of Natural Interaction to Systems and Applications – Held in Conjunction with the 6th Language Resources and Evaluation Conf. (LREC), Marrakech, Morocco (2008)
  9. Lanz, O., Chippendale, P., Brunelli, R.: An appearance-based particle filter for visual tracking in smart rooms. In: Stiefelhagen, R., Bowers, R., Fiscus, J.G. (eds.) RT 2007 and CLEAR 2007. LNCS, vol. 4625, pp. 57–69. Springer, Heidelberg (2008)

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Vit Libal (1)
  • Bhuvana Ramabhadran (1)
  • Nadia Mana (2)
  • Fabio Pianesi (2)
  • Paul Chippendale (2)
  • Oswald Lanz (2)
  • Gerasimos Potamianos (3)
  1. IBM Thomas J. Watson Research Center, Yorktown Heights, New York, U.S.A.
  2. Fondazione Bruno Kessler (FBK), Trento, Italy
  3. Institute of Computer Science (ICS), FORTH, Heraklion, Greece