
VR-Based Interface Enabling Ad-Hoc Individualization of Information Layer Presentation

  • Conference paper
  • First Online:
HCI International 2021 - Late Breaking Posters (HCII 2021)

Abstract

Graphical user interfaces created for scientific prototypes are often designed to support only a specific and well-defined use case. They typically use two-dimensional overlay buttons and panels in the operator's view to provide the required functionality. For more complex and potentially unpredictable tasks, such interfaces often fail to scale with the larger amount of information the user must process. Simply transferring this approach to more complex use cases is likely to introduce visual clutter and to complicate interface navigation unnecessarily, reducing accessibility and potentially overwhelming users. In this paper, we present a possible solution to this problem. In our proposed concept, information layers can be accessed and displayed by placing an augmentation glass in front of the virtual camera. Depending on the placement of the glass, the viewing area can cover only parts of the view or the entire scene. This also makes it possible to use multiple glasses side by side. Furthermore, augmentation glasses can be placed into the virtual environment for collaborative work. As a result, our approach is flexible and can be adapted very quickly to changing demands.
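The paper itself contains no code; purely as an illustration of the layer-selection idea described above, the following minimal Python sketch models each augmentation glass as a rectangle in normalized screen coordinates that reveals a set of information layers wherever it covers the view. All names and the rectangular screen-space model are our assumptions for illustration, not the authors' implementation.

```python
from dataclasses import dataclass, field
from typing import List, Set


@dataclass
class Glass:
    """A rectangular 'augmentation glass' in normalized screen coordinates.

    (x0, y0) is the lower-left corner, (x1, y1) the upper-right corner;
    `layers` names the information layers this glass reveals.
    """
    x0: float
    y0: float
    x1: float
    y1: float
    layers: Set[str] = field(default_factory=set)

    def covers(self, x: float, y: float) -> bool:
        """True if the screen point (x, y) lies inside this glass."""
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1


def visible_layers(x: float, y: float, glasses: List[Glass]) -> Set[str]:
    """Union of the layers of every glass covering the point (x, y).

    Overlapping glasses combine their layers, which models using
    several glasses side by side or stacked over the same region.
    """
    layers: Set[str] = set()
    for g in glasses:
        if g.covers(x, y):
            layers |= g.layers
    return layers


# Two glasses side by side, plus one covering the whole view.
left = Glass(0.0, 0.0, 0.5, 1.0, {"telemetry"})
right = Glass(0.5, 0.0, 1.0, 1.0, {"map"})
full = Glass(0.0, 0.0, 1.0, 1.0, {"labels"})

print(visible_layers(0.25, 0.5, [left, right]))        # → {'telemetry'}
print(visible_layers(0.25, 0.5, [left, right, full]))  # → telemetry and labels
```

In a real VR scene the per-point test would be replaced by per-object or per-pixel masking (e.g. stencil rendering), but the selection logic stays the same: glass placement, not menu navigation, determines which layers are shown where.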



Acknowledgements

The presented work is part of the projects TransFIT and KiMMI-SF, which are funded by the German Aerospace Center (DLR) with federal funds from the Federal Ministry of Economics and Technology, in accordance with a resolution of the German Parliament, under grant nos. 50 RA 1701 (TransFIT) and 50 RA 2021 (KiMMI-SF).

Author information

Corresponding author

Correspondence to Michael Maurus.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Jacke, L., Maurus, M., Kirchner, E.A. (2021). VR-Based Interface Enabling Ad-Hoc Individualization of Information Layer Presentation. In: Stephanidis, C., Antona, M., Ntoa, S. (eds) HCI International 2021 - Late Breaking Posters. HCII 2021. Communications in Computer and Information Science, vol 1498. Springer, Cham. https://doi.org/10.1007/978-3-030-90176-9_42


  • DOI: https://doi.org/10.1007/978-3-030-90176-9_42

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-90175-2

  • Online ISBN: 978-3-030-90176-9

  • eBook Packages: Computer Science (R0)
