
Director’s Cut - Analysis of Aspects of Interactive Storytelling for VR Films

  • Colm O. Fearghail (corresponding author)
  • Cagri Ozcinar
  • Sebastian Knorr
  • Aljosa Smolic
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11318)

Abstract

To explore the methods that professional virtual reality (VR) filmmakers currently use to tell their stories and guide viewers, we analyze how end-users watch 360° video in the presence of directional cues and evaluate whether they are able to follow the intended story of narrative 360° films. In this context, we first collected data from five professional VR filmmakers. The data comprises eight 360° videos, the director's cut (the director's intended viewing direction for each video), plot points, and the directional cues used for viewer guidance. We then performed a subjective experiment with 20 test subjects who watched the videos while their head orientation was recorded. Finally, we present and discuss the experimental results and show, among other findings, that visual discomfort and disorientation on the part of the viewer not only lessen the immersive quality of the films but also prevent the viewer from gaining a full understanding of the narrative that the director intended them to see.
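The evaluation described above compares each viewer's recorded head orientation against the director's cut, i.e. the intended viewing direction over time. As a minimal sketch of such a comparison (assuming both signals are available as per-frame yaw angles in degrees; the function names and the 30° tolerance are illustrative assumptions, not taken from the paper), one could measure how much of the time a viewer stays on the intended direction:

```python
import numpy as np

def angular_difference(viewer_yaw_deg, director_yaw_deg):
    """Smallest signed angle (degrees) between viewer yaw and intended yaw, per frame."""
    diff = (np.asarray(viewer_yaw_deg) - np.asarray(director_yaw_deg) + 180.0) % 360.0 - 180.0
    return diff

def fraction_on_target(viewer_yaw_deg, director_yaw_deg, tolerance_deg=30.0):
    """Fraction of frames in which the viewer stays within +/- tolerance of the director's cut."""
    diff = angular_difference(viewer_yaw_deg, director_yaw_deg)
    return float(np.mean(np.abs(diff) <= tolerance_deg))

# Toy example: a viewer who drifts away from the intended direction halfway through a shot.
viewer = [0, 5, 10, 60, 90, 120]    # recorded head yaw per frame (degrees)
director = [0, 0, 0, 0, 0, 0]       # director's cut: intended yaw per frame (degrees)
print(fraction_on_target(viewer, director))  # -> 0.5
```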

Keywords

360° film · Storytelling · Director’s cut · Virtual reality


Acknowledgment

This publication has emanated from research conducted with the financial support of Science Foundation Ireland (SFI) under the Grant Number 15/RP/2776.


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Colm O. Fearghail (1) (corresponding author)
  • Cagri Ozcinar (1)
  • Sebastian Knorr (1, 2)
  • Aljosa Smolic (1)
  1. V-SENSE, School of Computer Science and Statistics, Trinity College Dublin, The University of Dublin, Dublin, Ireland
  2. Communication Systems Group, Technical University of Berlin, Berlin, Germany
