Spaceline: A Concept for Interaction in Cinematic Virtual Reality

  • Sylvia Rothe
  • Heinrich Hussmann
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11869)


Watching omnidirectional movies through a head-mounted display places the viewer inside the scene, turning the movie into an immersive experience. However, because the viewing direction can be chosen freely, viewers may miss details that are important for the story. On the other hand, the additional spatial component gives filmmakers new opportunities to construct non-linear interactive stories. To support this, we introduce the concept of a spaceline, which connects movie sequences via interactive regions. This work explains the terms of the spaceline concept and introduces methods that make it easier for viewers to follow the story at their own pace and with their own focus. We present a design space that supports filmmakers in designing interactive CVR experiences.
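The spaceline described above can be read as a directed graph: movie sequences are nodes, and interactive regions inside a sequence are the edges leading to follow-up sequences. The sketch below illustrates this reading; all names (`Region`, `Sequence`, `Spaceline`, the yaw-interval representation of a region) are illustrative assumptions, not the paper's implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Region:
    """An interactive region, modeled here as a yaw interval in degrees."""
    yaw_min: float
    yaw_max: float
    target: str  # id of the sequence this region links to

    def contains(self, yaw: float) -> bool:
        # Normalize the gaze direction to [0, 360) before testing.
        return self.yaw_min <= yaw % 360 <= self.yaw_max

@dataclass
class Sequence:
    """One omnidirectional movie sequence with its outgoing links."""
    seq_id: str
    regions: list = field(default_factory=list)

class Spaceline:
    """Directed graph of sequences connected via interactive regions."""
    def __init__(self):
        self.sequences = {}

    def add(self, seq: Sequence):
        self.sequences[seq.seq_id] = seq

    def next_sequence(self, current: str, gaze_yaw: float):
        """Return the follow-up sequence chosen by the viewer's gaze,
        or None if no interactive region is hit."""
        for region in self.sequences[current].regions:
            if region.contains(gaze_yaw):
                return region.target
        return None

# Example: from an intro sequence, gazing left (around 90°) leads to
# branch A, gazing right (around 270°) leads to branch B.
sl = Spaceline()
sl.add(Sequence("intro", [Region(60, 120, "branchA"),
                          Region(240, 300, "branchB")]))
sl.add(Sequence("branchA"))
sl.add(Sequence("branchB"))
print(sl.next_sequence("intro", 90))   # → branchA
```

Because `next_sequence` returns `None` when no region is hit, a player built on such a structure could simply continue the current sequence linearly, letting viewers follow the story at their own pace, as the abstract describes.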


Cinematic Virtual Reality · 360° movie · Omnidirectional movies · Timeline · Interactivity · Story structure · Nonlinear storytelling



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. LMU Munich, Munich, Germany
