
Interactive Augmented Omnidirectional Video with Realistic Lighting

  • Nick Michiels
  • Lode Jorissen
  • Jeroen Put
  • Philippe Bekaert
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8853)

Abstract

This paper presents the augmentation of immersive omnidirectional video with realistically lit objects. Recent years have seen a proliferation of methods for capturing and rendering omnidirectional video in real time. Together with these technologies, display devices such as the Oculus Rift have increased the immersive experience of users. We apply structure from motion to omnidirectional video to reconstruct the trajectory of the camera; the position of the car is then linked to an appropriate \(360^{\circ }\) environment map. State-of-the-art augmented reality applications have often lacked realistic appearance and lighting. Our system evaluates the rendering equation in real time, using the captured omnidirectional video as the lighting environment. We demonstrate an application in which a computer-generated vehicle can be driven through an urban environment.
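
For context, a minimal sketch of the quantity such a system evaluates (a standard environment-lighting formulation, not reproduced from the paper): treating the current \(360^{\circ }\) video frame as a distant light source, the outgoing radiance at a surface point reduces to a product integral over incident directions,

\[
L_o(x, \omega_o) \approx \int_{\Omega} L_{\mathrm{env}}(\omega_i)\, V(x, \omega_i)\, f_r(x, \omega_i, \omega_o)\, (\omega_i \cdot n)\, \mathrm{d}\omega_i,
\]

where \(L_{\mathrm{env}}\) is the radiance sampled from the omnidirectional frame, \(V\) the visibility term, and \(f_r\) the BRDF. Projecting each factor onto a suitable basis (for example wavelets or spherical radial basis functions) turns the integral into a sum of products of basis coefficients, which is what makes per-frame, real-time evaluation feasible.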

Keywords

Omnidirectional video · Realistic lighting · Product integral rendering · Structure from motion

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Nick Michiels (1), corresponding author
  • Lode Jorissen (1)
  • Jeroen Put (1)
  • Philippe Bekaert (1)

  1. Expertise Centre for Digital Media, Hasselt University - tUL - iMinds, Diepenbeek, Belgium