A Novel Video Retrieval Method to Support a User's Recollection of Past Events, Aiming at Wearable Information Playing
Our system supports a user's location-based recollection of past events by using always-"gazing" video data as direct input: the user can recall an associated event simply by looking at a viewpoint, and the system provides stable online, real-time video retrieval. We propose three functional methods: image retrieval using motion information, video scene segmentation, and real-time video retrieval. Our experimental results show that these functions are effective enough to support wearable information playing.
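The paper does not specify its segmentation algorithm in this abstract, so as a purely illustrative sketch, the "video scene segmentation" step can be approximated by a common baseline: declaring a scene boundary whenever the intensity histograms of consecutive frames differ by more than a threshold. The function names and the `threshold` parameter below are hypothetical, not taken from the paper.

```python
# Illustrative baseline (not the authors' method): scene segmentation by
# thresholding the L1 distance between consecutive frame histograms.

def histogram(frame, bins=16):
    """Normalized grayscale intensity histogram of a flat pixel list."""
    counts = [0] * bins
    for px in frame:
        counts[min(px * bins // 256, bins - 1)] += 1
    n = len(frame)
    return [c / n for c in counts]

def scene_boundaries(frames, threshold=0.5):
    """Return frame indices at which a new scene is assumed to start.

    `frames` is a list of flat grayscale pixel lists (values 0-255);
    `threshold` on the L1 histogram distance is a tuning parameter.
    """
    boundaries = [0]
    prev = histogram(frames[0])
    for i in range(1, len(frames)):
        cur = histogram(frames[i])
        dist = sum(abs(a - b) for a, b in zip(prev, cur))
        if dist > threshold:
            boundaries.append(i)  # large histogram change: cut detected
        prev = cur
    return boundaries

# Two dark frames followed by two bright frames: one cut at index 2.
frames = [[10] * 100, [12] * 100, [200] * 100, [205] * 100]
print(scene_boundaries(frames))  # → [0, 2]
```

A real wearable system would operate on camera frames (e.g. decoded video) rather than synthetic pixel lists, and would tune the bin count and threshold to the sensor and lighting conditions.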
Keywords: Video Data · Motion Information · Video Retrieval · Scene Change · Video Scene