Querying Multiple Video Streams and Hypermedia Objects of a Video-Based Virtual Space System

  • Yusuke Yokota
  • Shumian He
  • Yahiko Kambayashi
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3081)


By using several digital video cameras with omni-directional sensors, complete views of the activities at a given location can be recorded as omni-directional videos. We exploit this technique in the Retrax System, which provides video-based virtual space environments as part of the Digital City infrastructure. The system aims to realize applications beyond the reach of conventional video database systems, such as supporting human activities and communication on computer networks based on recorded video data. It archives the activities at a location in the real world and, from the archived data, provides a virtual space for collaboration among users, who can walk through the space, annotate it, and communicate with one another. The user interface of the system thus bridges real and virtual spaces, and past and current human activities. This paper describes the purpose and usage of the system, and then introduces its architecture, data models, and query language. Because both video data and hypermedia data are essential to the system, a query mechanism that handles both types of data is required. The proposed query language is designed as an extension of SQL and enables flexible management of both types of data.
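The abstract does not give the concrete syntax of the proposed SQL extension, but the core idea of querying video streams and hypermedia annotations together can be sketched with plain SQL over a simplified, hypothetical schema. The sketch below (the table and column names `video_segment`, `annotation`, `camera_id`, etc. are illustrative assumptions, not the paper's actual data model) joins the two data types on camera identity and temporal containment:

```python
import sqlite3

# Illustrative sketch only: the paper's query language extends SQL with
# video/hypermedia constructs; here we approximate one such query
# (find annotations anchored inside segments of each camera's stream)
# using plain SQL over a hypothetical schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE video_segment (
    camera_id TEXT,   -- omni-directional camera that recorded the segment
    start_sec REAL,   -- segment start, seconds from archive epoch
    end_sec   REAL    -- segment end
);
CREATE TABLE annotation (
    camera_id TEXT,   -- camera whose stream the annotation refers to
    at_sec    REAL,   -- timestamp the annotation is anchored to
    body      TEXT    -- hypermedia payload (simplified to text)
);
""")
conn.executemany("INSERT INTO video_segment VALUES (?,?,?)",
                 [("cam1", 0.0, 60.0), ("cam1", 60.0, 120.0),
                  ("cam2", 0.0, 90.0)])
conn.executemany("INSERT INTO annotation VALUES (?,?,?)",
                 [("cam1", 45.0, "door opens"),
                  ("cam1", 75.0, "meeting starts"),
                  ("cam2", 30.0, "hallway empty")])

# Join video data and hypermedia data on camera and temporal containment.
rows = conn.execute("""
    SELECT s.camera_id, s.start_sec, s.end_sec, a.body
    FROM video_segment s JOIN annotation a
      ON a.camera_id = s.camera_id
     AND a.at_sec >= s.start_sec AND a.at_sec < s.end_sec
    ORDER BY s.camera_id, a.at_sec
""").fetchall()
for row in rows:
    print(row)
```

The temporal-containment predicate in the `ON` clause is the kind of condition a spatio-temporal SQL extension would typically express with dedicated operators rather than raw comparisons.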


Keywords: Video Data · Query Language · Database Server · Virtual Space · Control Server





Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Yusuke Yokota (1)
  • Shumian He (1)
  • Yahiko Kambayashi (1)

  1. Graduate School of Informatics, Kyoto University, Kyoto, Japan
