Personal and Ubiquitous Computing, Volume 7, Issue 5, pp 263–274

Wearable sensing to annotate meeting recordings

  • Nicky Kern
  • Bernt Schiele
  • Holger Junker
  • Paul Lukowicz
  • Gerhard Tröster
Original Article


We propose the use of wearable computers and sensor systems to generate personal contextual annotations in audio-visual recordings of meetings. In this paper, we argue that such annotations are essential for the effective retrieval of relevant information from large audio-visual databases. The paper proposes several useful annotations that can be derived from cheap and unobtrusive sensors, describes a hardware platform designed for accelerometer-based activity recognition, outlines approaches for extracting the annotations, and presents initial experimental results.


Keywords: Activity recognition · Meeting annotations · Speaker segmentation · Wearable sensing



Copyright information

© Springer-Verlag London Limited 2003

Authors and Affiliations

  • Nicky Kern (1)
  • Bernt Schiele (1)
  • Holger Junker (2)
  • Paul Lukowicz (2)
  • Gerhard Tröster (2)

  1. Perceptual Computing and Computer Vision, Swiss Federal Institute of Technology (ETH), Zurich, Switzerland
  2. Wearable Computing Lab, Swiss Federal Institute of Technology (ETH), Zurich, Switzerland
