
Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 5518)

Included in the following conference series: International Work-Conference on Artificial Neural Networks (IWANN)

Abstract

Smart homes for the aging population have recently started attracting the attention of the research community. One problem of interest is that of monitoring the activities of daily living (ADLs) of the elderly, aiming at their protection and well-being. In this work, we present our initial efforts to automatically recognize ADLs using multimodal input from audio-visual sensors. For this purpose, and as part of the Integrated Project Netcarity, far-field microphones and cameras have been installed inside an apartment and used to collect a corpus of ADLs, acted out by multiple subjects. The resulting data streams are processed to generate perception-based acoustic features, as well as human location coordinates that are employed as visual features. The extracted features are then presented to Gaussian mixture models for classification into a set of predefined ADLs. Our experimental results show that both acoustic and visual features are useful for ADL classification, but the performance of the latter deteriorates when subject tracking becomes inaccurate. Furthermore, joint audio-visual classification by simple concatenative feature fusion significantly outperforms both unimodal classifiers.
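The classification pipeline summarized in the abstract (per-class Gaussian mixture models over audio-visual feature vectors, with concatenative fusion) can be illustrated with a short sketch. The code below is not the authors' implementation: it assumes scikit-learn's GaussianMixture, hypothetical per-frame acoustic and location (visual) feature arrays, and diagonal covariances, and shows per-class GMM training, maximum-likelihood ADL classification, and simple feature concatenation.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fuse(audio_feats, visual_feats):
    """Concatenative fusion: stack per-frame acoustic and visual vectors.

    audio_feats:  (n_frames, audio_dim) array, e.g. perception-based features
    visual_feats: (n_frames, visual_dim) array, e.g. subject (x, y) coordinates
    """
    return np.hstack([audio_feats, visual_feats])

def train_gmm_classifiers(features_by_adl, n_components=4):
    """Fit one GMM per ADL class on fused audio-visual feature frames."""
    models = {}
    for adl, feats in features_by_adl.items():
        gmm = GaussianMixture(n_components=n_components, covariance_type="diag")
        gmm.fit(feats)  # feats: (n_frames, audio_dim + visual_dim)
        models[adl] = gmm
    return models

def classify(models, segment):
    """Assign the ADL whose GMM gives the highest total log-likelihood."""
    scores = {adl: gmm.score_samples(segment).sum() for adl, gmm in models.items()}
    return max(scores, key=scores.get)

# Hypothetical usage (data names and shapes are illustrative only):
# models = train_gmm_classifiers({"cooking": X_cook, "cleaning": X_clean})
# label = classify(models, fuse(audio_segment, visual_segment))
```

Unimodal baselines follow the same recipe by training and scoring on the acoustic or visual features alone instead of the fused vectors.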


References

  1. Netcarity – Ambient Technology to Support Older People at Home, http://www.netcarity.org

  2. Temko, A., Malkin, R., Zieger, C., Macho, D., Nadeu, C., Omologo, M.: CLEAR evaluation of acoustic event detection and classification systems. In: Stiefelhagen, R., Garofolo, J. (eds.) CLEAR 2006. LNCS, vol. 4122, pp. 311–322. Springer, Heidelberg (2007)


  3. Stiefelhagen, R., Bernardin, K., Bowers, R., Travis Rose, R., Michel, M., Garofolo, J.: The CLEAR 2007 evaluation. In: Stiefelhagen, R., Bowers, R., Fiscus, J. (eds.) RT 2007 and CLEAR 2007. LNCS, vol. 4625, pp. 3–34. Springer, Heidelberg (2008)


  4. Grassi, M., Lombardi, A., Rescio, G., Malcovati, P., Leone, A., Diraco, G., Distante, C., Siciliano, P., Malfatti, M., Gonzo, L., Libal, V., Huang, J., Potamianos, G.: A hardware-software framework for high-reliability people fall detection. In: Proc. IEEE Conf. on Sensors, Lecce, Italy, pp. 1328–1331 (2008)


  5. Fleury, A., Vacher, M., Glasson, H., Serignat, J.-F., Noury, N.: Data fusion in health smart home: Preliminary individual evaluation of two families of sensors. In: Proc. Int. Conf. of the Int. Soc. for Gerontechnology, Pisa, Italy (2008)


  6. Huang, J., Zhuang, X., Libal, V., Potamianos, G.: Long-time span acoustic activity analysis from far-field sensors in smart homes. In: Proc. Int. Conf. on Acoustics, Speech, and Signal Processing (ICASSP), Taipei, Taiwan (2009)


  7. Wojek, C., Nickel, K., Stiefelhagen, R.: Activity recognition and room level tracking in an office environment. In: Proc. IEEE Int. Conf. on Multisensor Fusion and Integration for Intelligent Systems (MFI), Heidelberg, Germany (2006)


  8. Cappelletti, A., Lepri, B., Mana, N., Pianesi, F., Zancanaro, M.: A multimodal data collection of daily activities in a real instrumented apartment. In: Proc. Works. on Multimodal Corpora: From Models of Natural Interaction to Systems and Applications – Held in Conjunction with the 6th Language Resources and Evaluation Conf. (LREC), Marrakech, Morocco (2008)


  9. Lanz, O., Chippendale, P., Brunelli, R.: An appearance-based particle filter for visual tracking in smart rooms. In: Stiefelhagen, R., Bowers, R., Fiscus, J.G. (eds.) RT 2007 and CLEAR 2007. LNCS, vol. 4625, pp. 57–69. Springer, Heidelberg (2008)




Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Libal, V. et al. (2009). Multimodal Classification of Activities of Daily Living Inside Smart Homes. In: Omatu, S., et al. Distributed Computing, Artificial Intelligence, Bioinformatics, Soft Computing, and Ambient Assisted Living. IWANN 2009. Lecture Notes in Computer Science, vol 5518. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-02481-8_103


  • DOI: https://doi.org/10.1007/978-3-642-02481-8_103

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-02480-1

  • Online ISBN: 978-3-642-02481-8

  • eBook Packages: Computer Science, Computer Science (R0)
