
U-DiVE - design and evaluation of a distributed photorealistic virtual reality environment


Abstract

Mobile devices such as smartphones are increasingly being used for immersive content consumption, mostly involving 360° video and 3D audio delivery. However, these smartphones, especially low-cost ones, still cannot provide the processing and battery power needed to render, visualize, and interact with photorealistic virtual reality scenes in real time. In this context, this paper proposes and evaluates U-DiVE (Unity-based Distributed Virtual Reality Environment), a framework that decouples processing and rendering from the delivery, visualization, and interaction with realistic VR models. The U-DiVE framework produces a photorealistic scene using a general ray-tracing algorithm and a virtual reality camera configured with barrel shaders that correct lens distortion, allowing visualization through inexpensive smartphone-based head-mounted displays. The framework also includes a method to obtain the smartphone’s spatial orientation and use it to control the user’s field of view; the resulting rendered view is delivered via real-time WebRTC streaming. The analysis shows that U-DiVE allows real-time visualization and manipulation of realistic, immersive scenes on low-cost, smartphone-based head-mounted displays with low end-to-end latency, despite the continuous data processing and delivery required.
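
As a concrete illustration of the decoupled design described in the abstract, the sketch below shows how a smartphone client might consume the remotely rendered view over WebRTC while returning its orientation to the renderer. The paper does not publish source code, so this is only a minimal TypeScript sketch assuming the standard browser WebRTC and DeviceOrientation APIs; the data-channel name "orientation", the element id "vr-view", and the STUN server are hypothetical, and signaling (offer/answer exchange) is assumed to happen elsewhere.

```typescript
// Hypothetical smartphone-side client for a U-DiVE-style setup:
// the remote renderer streams the ray-traced, barrel-corrected view as a
// WebRTC video track, and the phone returns its orientation over a data channel.
// Signaling is omitted and assumed to be handled by separate code.

const pc = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }], // assumed STUN server
});

// Channel used to push head-orientation updates back to the renderer (name is assumed).
const orientationChannel = pc.createDataChannel("orientation");

// Attach the incoming rendered stream to a full-screen <video> element.
pc.ontrack = (event: RTCTrackEvent) => {
  const video = document.getElementById("vr-view") as HTMLVideoElement;
  video.srcObject = event.streams[0];
  void video.play();
};

// Forward the device orientation (Euler angles, in degrees) whenever it changes.
window.addEventListener("deviceorientation", (e: DeviceOrientationEvent) => {
  if (orientationChannel.readyState !== "open") return;
  orientationChannel.send(JSON.stringify({
    alpha: e.alpha ?? 0, // rotation around the z-axis
    beta: e.beta ?? 0,   // rotation around the x-axis
    gamma: e.gamma ?? 0, // rotation around the y-axis
  }));
});
```

On the server side, the Unity-based renderer would consume these orientation messages to steer the virtual camera before encoding and streaming the next frames.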



Data Availability

Data sharing not applicable to this article as no datasets were generated or analyzed during the current study.

Notes

  1. https://cutt.ly/BjhXlvS; online, accessed on 02/01/2021.

  2. Quaternions are defined by four components (x, y, z, w) and are used to represent rotations; a conversion sketch from device-orientation angles to these components follows these notes.
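
Note 2 refers to the quaternion components x, y, z, w used to represent the phone’s rotation. As a minimal sketch, not taken from the paper, the following TypeScript function converts the Euler angles reported by the browser’s deviceorientation event (alpha, beta, gamma, in degrees, Z-X'-Y'' intrinsic order) into such a quaternion; the function name is hypothetical.

```typescript
// Minimal sketch (not from the paper): convert deviceorientation Euler angles
// (alpha, beta, gamma, in degrees) into the quaternion components x, y, z, w
// mentioned in note 2, following the Z-X'-Y'' intrinsic order of the browser event.
const DEG2RAD = Math.PI / 180;

function orientationToQuaternion(alpha: number, beta: number, gamma: number) {
  const halfZ = (alpha * DEG2RAD) / 2; // alpha rotates around z
  const halfX = (beta * DEG2RAD) / 2;  // beta rotates around x
  const halfY = (gamma * DEG2RAD) / 2; // gamma rotates around y

  const cX = Math.cos(halfX), sX = Math.sin(halfX);
  const cY = Math.cos(halfY), sY = Math.sin(halfY);
  const cZ = Math.cos(halfZ), sZ = Math.sin(halfZ);

  return {
    x: sX * cY * cZ - cX * sY * sZ,
    y: cX * sY * cZ + sX * cY * sZ,
    z: cX * cY * sZ + sX * sY * cZ,
    w: cX * cY * cZ - sX * sY * sZ,
  };
}
```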



Funding

This research received no external funding.

Author information


Contributions

All authors contributed equally to this work. All authors have read and agreed to the published version of the manuscript.

Corresponding author

Correspondence to Rodrigo L. S. Silva.

Ethics declarations

Conflict of Interests

The authors declare no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Roberti Jr., W.C., Pereira, L.T., Silva, R.L.S. et al. U-DiVE - design and evaluation of a distributed photorealistic virtual reality environment. Multimed Tools Appl 82, 34129–34145 (2023). https://doi.org/10.1007/s11042-023-15064-y

