Avatar: A Telepresence System for the Participation in Remote Events

  • Conference paper
  • First Online:
Reliability and Statistics in Transportation and Communication (RelStat 2022)

Abstract

A unique telepresence helium blimp drone for participating in remote events is presented. The blimp is safe for humans, can be controlled remotely over the internet, and provides a high-quality 4K video stream to multiple users simultaneously. We show that a careful selection of up-to-date hardware enables the blimp to run computationally expensive algorithms such as SLAM and deep neural networks on board, without relying on a local ground station for complex tasks.

Additionally, we compare the performance of the ORB-SLAM2 algorithm in RGB-D mode, fed with depth images estimated by recent neural networks, to its performance with conventional passive stereo images.

Our experiments show that the ORB-SLAM2 algorithm, used with estimated depth images, can reliably detect turns of the vehicle. However, in this case the algorithm also underestimates the distance covered during straight movements in one direction, which results in very inaccurate computed positions.
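As a rough sketch of the glue step such a pipeline needs: ORB-SLAM2's RGB-D mode consumes 16-bit depth images scaled by a `DepthMapFactor` (5000 counts per metre in its example TUM configurations), while monocular depth networks output floating-point metric depth. The helper below is illustrative, not taken from the paper; the factor and function name are assumptions.

```python
import numpy as np

# TUM-style depth scaling used by ORB-SLAM2's example RGB-D configs
# (DepthMapFactor in the .yaml settings files).
DEPTH_MAP_FACTOR = 5000.0                    # counts per metre
MAX_DEPTH_M = 65535 / DEPTH_MAP_FACTOR       # ~13.1 m fits into uint16

def depth_to_uint16(depth_m: np.ndarray) -> np.ndarray:
    """Quantise a metric depth map (float, metres) into the 16-bit
    format expected by ORB-SLAM2's RGB-D input.

    Invalid (NaN/inf), non-positive, or out-of-range pixels are set
    to 0, which the SLAM system treats as 'no depth available'.
    """
    depth = np.nan_to_num(depth_m, nan=0.0, posinf=0.0, neginf=0.0)
    depth = np.where((depth <= 0.0) | (depth > MAX_DEPTH_M), 0.0, depth)
    return np.round(depth * DEPTH_MAP_FACTOR).astype(np.uint16)

# Tiny example: one valid pixel, one valid pixel, one NaN, one too far.
pred = np.array([[1.0, 2.5],
                 [np.nan, 20.0]], dtype=np.float32)
quantised = depth_to_uint16(pred)
# 1.0 m -> 5000, 2.5 m -> 12500, NaN -> 0, 20.0 m (out of range) -> 0
```

In practice the resulting array would be written out as a 16-bit PNG (or passed in memory) alongside the RGB frame, so that the network's depth prediction can stand in for a sensor depth map.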


References

  1. Mur-Artal, R., Tardós, J.D.: ORB-SLAM2: an open-source SLAM system for monocular, stereo, and RGB-D cameras. IEEE Trans. Rob. 33(5), 1255–1262 (2017)

  2. Kristoffersson, A., Coradeschi, S., Loutfi, A.: A review of mobile robotic telepresence. Adv. Hum.-Comput. Interact. 2013, Article ID 902316 (2013). https://doi.org/10.1155/2013/902316

  3. Zhang, X., Braley, S., Rubens, C., Merritt, T., Vertegaal, R.: LightBee: A self-levitating light field display for hologrammatic telepresence. In: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, pp. 1–10 (2019)

  4. Tobita, H., Maruyama, S., Kuzi, T.: Floating avatar: telepresence system using blimps for communication and entertainment. In: CHI 2011 Extended Abstracts on Human Factors in Computing Systems, pp. 541–550 (2011)

  5. Wojciechowska, A., Frey, J., Sass, S., Shafir, R., Cauchard, J.R.: Collocated human-drone interaction: methodology and approach strategy. In: 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 172–181. IEEE, Daegu, Republic of Korea (2019)

  6. Van Asares, A., Ko, P.S., Minlay, J.S., Sarmiento, B.R., Chua, A.: Design of an unmanned aerial vehicle blimp for indoor applications. Int. J. Mech. Eng. Rob. Res. 8(1), 157 (2019)

  7. Yao, N.-S., et al.: Autonomous flying blimp interaction with human in an indoor space. Front. Inf. Technol. Electron. Eng. 20(1), 45–59 (2019). https://doi.org/10.1631/FITEE.1800587

  8. Yousif, K., Bab-Hadiashar, A., Hoseinnezhad, R.: An overview to visual odometry and visual SLAM: applications to mobile robotics. Intell. Ind. Syst. 1(4), 289–311 (2015)

  9. KITTI SLAM Evaluation (2012). http://www.cvlibs.net/datasets/kitti/eval_odometry.php. Accessed 08 July 2022

  10. Luxonis OAK-D. https://shop.luxonis.com/products/oak-d. Accessed 10 July 2022

  11. Raspberry Zero 2 W. https://www.raspberrypi.com/products/raspberry-pi-zero-2-w/. Accessed 10 July 2022

  12. Wofk, D., Ma, F., Yang, T.J., Karaman, S., Sze, V.: FastDepth: fast monocular depth estimation on embedded systems. In: 2019 International Conference on Robotics and Automation (ICRA), pp. 6101–6108. IEEE, Montreal (2019)

  13. Li, Z., Snavely, N.: MegaDepth: learning single-view depth prediction from internet photos. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 2041–2050. IEEE, Salt Lake City (2018)

  14. Farooq Bhat, S., Alhashim, I., Wonka, P.: AdaBins: depth estimation using adaptive bins. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pp. 4009–4018. IEEE, Nashville (2021)

  15. Geiger, A., Lenz, P., Stiller, C., Urtasun, R.: Vision meets robotics: the KITTI dataset. Int. J. Rob. Res. 32(11), 1231–1237 (2013)

  16. Official AdaBins implementation. https://github.com/shariqfarooq123/AdaBins. Accessed 10 Aug 2022

Author information

Corresponding author

Correspondence to Dietrich Trepnau.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Trepnau, D., Richter, K. (2023). Avatar: A Telepresence System for the Participation in Remote Events. In: Kabashkin, I., Yatskiv, I., Prentkovskis, O. (eds) Reliability and Statistics in Transportation and Communication. RelStat 2022. Lecture Notes in Networks and Systems, vol 640. Springer, Cham. https://doi.org/10.1007/978-3-031-26655-3_23
