Capturing Industrial Machinery into Virtual Reality

  • Jeroen Put
  • Nick Michiels
  • Fabian Di Fiore
  • Frank Van Reeth
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10945)


In this paper we set out to find a new technical and commercial solution for easily acquiring a virtual model of existing machinery for visualisation in a VR environment. To this end we introduce an image-based scanning approach, with an initial focus on a monocular handheld capturing device such as a portable camera. Camera poses are estimated with a Simultaneous Localisation and Mapping (SLAM) technique. Depending on the required quality, offline calibration is incorporated by means of ArUco markers placed within the captured scene. Once the images are captured, they are compressed in a format that allows rapid, low-latency streaming and decoding on the GPU. Finally, when the model is viewed in a VR environment, an optical flow method is used to interpolate between the triangulation of the captured viewpoints, delivering a smooth VR experience. We believe our tool will facilitate the capturing of machinery into VR, providing a wide range of benefits such as marketing, offsite help and remote maintenance.
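The final step above, interpolating between captured viewpoints with optical flow, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the single-channel simplification, and the nearest-neighbour sampling are assumptions made for brevity. Given a precomputed flow field from view A to view B, each image is backward-warped towards an intermediate viewpoint and the two warps are blended:

```python
import numpy as np

def interpolate_views(img_a, img_b, flow_ab, t):
    """Blend two captured viewpoints at fraction t in [0, 1].

    img_a, img_b : (H, W) float arrays (one colour channel for brevity)
    flow_ab      : (H, W, 2) optical flow from view A to view B (x, y)
    """
    h, w = img_a.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    # Sample A slightly forward along the flow, B slightly backward,
    # so both warps land on the same intermediate viewpoint.
    xa = np.clip(xs + t * flow_ab[..., 0], 0, w - 1)
    ya = np.clip(ys + t * flow_ab[..., 1], 0, h - 1)
    xb = np.clip(xs - (1 - t) * flow_ab[..., 0], 0, w - 1)
    yb = np.clip(ys - (1 - t) * flow_ab[..., 1], 0, h - 1)
    warped_a = img_a[ya.round().astype(int), xa.round().astype(int)]
    warped_b = img_b[yb.round().astype(int), xb.round().astype(int)]
    # Cross-fade the two warped images.
    return (1 - t) * warped_a + t * warped_b
```

In practice the flow field would come from a dense optical flow estimator, and the warping and blending would run in a GPU shader rather than on the CPU.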


Keywords: Digitising and scanning · View interpolation · Virtual reality



This research was partially supported by Flanders Make, the strategic research centre for the manufacturing industry, in view of the Flanders Make FLEXAS_VR project.

We also gratefully acknowledge the European Fund for Regional Development (ERDF) and the Flemish Government, which kindly fund part of the research at the Expertise Centre for Digital Media.



Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Jeroen Put (1)
  • Nick Michiels (1)
  • Fabian Di Fiore (1), Email author
  • Frank Van Reeth (1)
  1. Expertise Centre for Digital Media, Hasselt University - tUL - Flanders Make, Diepenbeek, Belgium
