VR Based Movie Watching Method by Reproduction of Spatial Sensation

  • Kunihiro Nishimura
  • Aoi Ito
  • Tomohiro Tanikawa
  • Michitaka Hirose
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5622)

Abstract

The conventional way of watching a movie is to view it on a large, fixed screen, as in a theater. Presenting images at a fixed position has the drawback that audiences easily lose the spatial sensation contained in the movie. In this paper, we propose a novel movie watching method that uses virtual reality technology to improve the sense of presence in existing media content. We hypothesized that presence increases when each frame is presented at its original shooting angle relative to the audience's viewing position. To recover the camera shooting angle, we used an optical flow method. We then propose a movie watching method that presents frames at the reconstructed camera shooting angle using either a moving projector or a surrounding wall screen. We believe our method makes it possible to reconstruct the spatial sensation that is lost in conventional movie presentation.
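
The core technical step, recovering the camera shooting angle from optical flow, can be illustrated with a short sketch. The following Python/OpenCV code is not the authors' implementation; it is a minimal stand-in that estimates a cumulative horizontal pan angle from Lucas-Kanade feature tracking, under the simplifying assumptions that camera motion is dominated by panning and that the horizontal field of view (FOV_X_DEG) is known. The function name pan_angles and all parameter values are illustrative choices.

import cv2
import numpy as np

FOV_X_DEG = 60.0  # assumed horizontal field of view of the source camera

def pan_angles(video_path):
    """Estimate a cumulative horizontal pan angle (degrees) per frame."""
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        return []
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    deg_per_px = FOV_X_DEG / prev_gray.shape[1]  # small-angle approximation
    angle, angles = 0.0, [0.0]
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Pick corner features in the previous frame and track them forward.
        pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                      qualityLevel=0.01, minDistance=8)
        if pts is not None:
            nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
            good = status.ravel() == 1
            if good.any():
                # Median horizontal flow approximates the pan component;
                # the scene appears to move opposite to the camera rotation.
                dx = float(np.median(nxt[good, 0, 0] - pts[good, 0, 0]))
                angle -= dx * deg_per_px
        angles.append(angle)
        prev_gray = gray
    cap.release()
    return angles

In the setup the paper describes, an angle sequence of this kind would drive where each frame is shown, for example by steering a moving projector or selecting a position on a surrounding wall screen so that the image appears at the reconstructed shooting direction.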

Keywords

Presence · Camera Work · Roaming Images · Moving Projector · Spatial Sensation

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Kunihiro Nishimura ¹
  • Aoi Ito ²
  • Tomohiro Tanikawa ¹
  • Michitaka Hirose ¹
  1. Graduate School of Information Science and Technology, The University of Tokyo, Japan
  2. Graduate School of Interdisciplinary Information Studies, The University of Tokyo, Tokyo, Japan
