
Personalized Life Log Media System in Ubiquitous Environment

  • Ig-Jae Kim
  • Sang Chul Ahn
  • Hyoung-Gon Kim
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4412)

Abstract

In this paper, we propose a new system for the storage and retrieval of personal life log media in a ubiquitous environment. Personal life log media are gathered from intelligent gadgets connected over a wireless network. Our intelligent gadgets consist of wearable gadgets and environment gadgets. Wearable gadgets include an audiovisual device, GPS, a 3D accelerometer, and physiological reaction sensors. Environment gadgets are smart sensors attached to everyday objects such as cups, chairs, and doors. The user captures a multimedia stream with the wearable intelligent gadgets and also obtains the surrounding environmental information from the environment gadgets as personal life log media. These life log media (LLM) are logged on the LLM server in real time. On the LLM server, a learning-based activity analysis engine processes the logged data and automatically creates metadata for retrieval. With the proposed system, the user can log personalized life log media and retrieve them at any time. For more intuitive retrieval, the client provides an intuitive spatiotemporal graphical user interface. Finally, the proposed system offers a user-centered service with individual activity registration and classification for each user.
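To make the data flow concrete, the following is a minimal Python sketch (not part of the paper) of how life log media records from the gadgets might be logged on an LLM server and annotated with activity metadata for later retrieval. All names here (LifeLogRecord, LLMServer, the threshold-based classifier) are illustrative assumptions standing in for the learning-based activity analysis engine described above.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List, Optional, Tuple

@dataclass
class LifeLogRecord:
    """One life log media (LLM) entry captured by a wearable or environment gadget."""
    timestamp: datetime
    gadget_id: str                       # e.g. "wearable-cam" or "env-cup" (illustrative IDs)
    location: Tuple[float, float]        # (latitude, longitude) from the GPS gadget
    sensor_data: Dict[str, float]        # e.g. accelerometer magnitude, heart rate
    media_uri: str = ""                  # reference to the audiovisual stream segment
    metadata: Dict[str, str] = field(default_factory=dict)  # filled in by the server

class LLMServer:
    """Stores incoming records and attaches activity metadata for later retrieval."""

    def __init__(self) -> None:
        self.records: List[LifeLogRecord] = []

    def log(self, record: LifeLogRecord) -> None:
        # Stand-in for the learning-based activity analysis engine:
        # annotate the record with an activity label before storing it.
        record.metadata["activity"] = self._classify(record)
        self.records.append(record)

    def _classify(self, record: LifeLogRecord) -> str:
        # A trivial threshold on accelerometer magnitude replaces the trained
        # classifier; real activity analysis would be learned per user.
        accel = record.sensor_data.get("accel_magnitude", 0.0)
        return "moving" if accel > 1.5 else "stationary"

    def retrieve(self, start: datetime, end: datetime,
                 activity: Optional[str] = None) -> List[LifeLogRecord]:
        """Temporal query with an optional activity filter (the GUI would add a spatial filter)."""
        return [r for r in self.records
                if start <= r.timestamp <= end
                and (activity is None or r.metadata.get("activity") == activity)]

# Example: log one record and retrieve the morning's "moving" segments.
server = LLMServer()
server.log(LifeLogRecord(datetime(2007, 1, 1, 9, 0), "wearable-cam",
                         (37.60, 127.04), {"accel_magnitude": 2.1}))
morning = server.retrieve(datetime(2007, 1, 1, 8, 0), datetime(2007, 1, 1, 12, 0),
                          activity="moving")
```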

Keywords

Life log system · Spatiotemporal interface · Activity analysis



Copyright information

© Springer Berlin Heidelberg 2007

Authors and Affiliations

  • Ig-Jae Kim (1)
  • Sang Chul Ahn (1)
  • Hyoung-Gon Kim (1)
  1. Imaging Media Research Center, KIST, 39-1, Hawolgok-dong, Seongbuk-gu, Seoul, Korea
