Towards Stereoscopic On-vehicle AR-HUD

  • Original article
  • Published in: The Visual Computer, 2021

Abstract

On-vehicle AR-HUD (augmented reality head-up display) is a driving assistance system that lets drivers see navigation and warning information directly through the windshield in an easy-to-perceive augmented reality form. A traditional AR-HUD can only render content at a single fixed distance, which leads to uncomfortable experiences such as frequent refocusing and an unnatural floating of virtual objects. This paper proposes an innovative AR-HUD system that presents stereoscopic scenes to the driver. The system is composed of two conventional HUD display units and supports parallax through additive light field factorization. The optical paths and illumination of the two displays are precisely calibrated for both views so that the combined light reaching each eye matches the intended image. The factorization algorithm is optimized for this configuration and, with GPU acceleration, runs in real time. The system is inexpensive and simple to retrofit from a traditional AR-HUD, offering a feasible path toward a fusion-enhanced augmented reality driving assistant.
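
The abstract does not spell out the additive light-field factorization itself, so the following is only a minimal, hypothetical NumPy sketch of the underlying idea; the paper's own solver is GPU-accelerated and tailored to the two-display optics. It assumes a sampled target light field L[view, pixel] and precomputed index maps idx_a, idx_b giving which pixel of each HUD layer a given ray crosses. All names (factorize_additive, idx_a, idx_b, n_pixels) are illustrative, not from the paper.

import numpy as np

def factorize_additive(L, idx_a, idx_b, n_pixels, iters=200, lr=1.0):
    # Alternating projected block updates for
    #   min_{0 <= A, B <= 1}  sum_rays (A[idx_a] + B[idx_b] - L)^2,
    # i.e. the two layer images should add up to the target light field
    # along every sampled ray.
    A = np.full(n_pixels, 0.5)
    B = np.full(n_pixels, 0.5)
    # Number of rays touching each layer pixel (used to average gradients;
    # with lr = 1.0 each half-step is an exact per-pixel least-squares update).
    cnt_a = np.maximum(np.bincount(idx_a.ravel(), minlength=n_pixels), 1)
    cnt_b = np.maximum(np.bincount(idx_b.ravel(), minlength=n_pixels), 1)
    for _ in range(iters):
        # Residual of the additive reconstruction, then a step on layer A.
        err = A[idx_a] + B[idx_b] - L
        grad = np.zeros(n_pixels)
        np.add.at(grad, idx_a, err)
        A = np.clip(A - lr * grad / cnt_a, 0.0, 1.0)
        # Recompute the residual with the updated A, then a step on layer B.
        err = A[idx_a] + B[idx_b] - L
        grad = np.zeros(n_pixels)
        np.add.at(grad, idx_b, err)
        B = np.clip(B - lr * grad / cnt_b, 0.0, 1.0)
    return A, B

In this sketch, the returned images A and B are what the two HUD units would display, so that each calibrated view (row of L) perceives their sum with the intended parallax.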

Author information

Corresponding author

Correspondence to Nianchen Deng.

Ethics declarations

Conflict of interest

The authors have no potential conflicts of interest to disclose.

Funding

This research was supported by the SAIC Foundation (Grant No. 2018310031004252).

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Deng, N., Ye, J., Chen, N. et al. Towards Stereoscopic On-vehicle AR-HUD. Vis Comput 37, 2527–2538 (2021). https://doi.org/10.1007/s00371-021-02209-z

Keywords

Navigation