
Enabling Situational Awareness via Augmented Reality of Autonomous Robot-Based Environmental Change Detection

Part of the Lecture Notes in Computer Science book series (LNISA, volume 12190)

Abstract

Accurately detecting changes in one’s environment is an important ability in many application domains, but it can be challenging for humans. Autonomous robots can readily be made to detect metric changes in the environment; unlike humans, however, they find it difficult to understand context. We present a novel system in which an autonomous robot performs point cloud-based change detection to support information-gathering tasks and provide enhanced situational awareness. The robot communicates detected changes to a human teammate via augmented reality for evaluation. We present results from a fielded system using two differently equipped robots to examine implementation questions around point cloud density and its effect on the visualization of changes. Our results show trade-offs between the implementations that we believe will be instructive for similar systems in the future.
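The change detection described above is point cloud-based, but the paper itself contains no code. As a purely illustrative sketch of the general technique (not the authors' actual pipeline), the C++ snippet below uses the Point Cloud Library's octree change detector to flag points of a new scan that fall in voxels unoccupied by a reference cloud; the 0.1 m resolution, toy cloud contents, and variable names are assumptions made here for the example.

```cpp
// Minimal sketch of octree-based change detection between two point clouds,
// in the spirit of the paper's point cloud-based change detection. The voxel
// resolution and toy points are illustrative assumptions, not paper values.
#include <iostream>
#include <vector>

#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/octree/octree_pointcloud_changedetector.h>

int main()
{
  const float resolution = 0.1f;  // octree voxel size in meters (assumed)
  pcl::octree::OctreePointCloudChangeDetector<pcl::PointXYZ> octree(resolution);

  pcl::PointCloud<pcl::PointXYZ>::Ptr reference(new pcl::PointCloud<pcl::PointXYZ>);
  pcl::PointCloud<pcl::PointXYZ>::Ptr current(new pcl::PointCloud<pcl::PointXYZ>);

  // Toy data standing in for the robot's prior map and a newly acquired scan.
  reference->push_back(pcl::PointXYZ(0.0f, 0.0f, 0.0f));
  current->push_back(pcl::PointXYZ(0.0f, 0.0f, 0.0f));
  current->push_back(pcl::PointXYZ(5.0f, 0.0f, 1.0f));  // appears only in the new scan

  // Build the octree from the reference cloud, then swap buffers so that the
  // occupied-voxel structure is retained for comparison with the new scan.
  octree.setInputCloud(reference);
  octree.addPointsFromInputCloud();
  octree.switchBuffers();

  octree.setInputCloud(current);
  octree.addPointsFromInputCloud();

  // Indices of points in the current cloud that occupy voxels the reference
  // cloud did not: candidate "changes" to surface to the human teammate.
  std::vector<int> new_point_indices;
  octree.getPointIndicesFromNewVoxels(new_point_indices);
  std::cout << new_point_indices.size() << " changed point(s) detected\n";
  return 0;
}
```

In a system like the one described, the flagged points (or clusters of them) are what would be handed to the augmented reality interface for the human teammate to evaluate; the choice of octree resolution interacts directly with the point cloud density question the paper examines.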

Keywords

  • Human-robot teaming
  • Augmented reality
  • Simultaneous Localization and Mapping (SLAM)
  • Change detection
  • Field robotics


Notes

  1. https://www.microsoft.com/en-us/hololens.

  2. http://wiki.ros.org/rosbridge_suite.

  3. https://github.com/siemens/ros-sharp.
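
The footnotes above name the components of the AR communication path used in this work: a HoloLens headset, rosbridge_suite for exposing ROS topics over a websocket, and ros-sharp for consuming them in Unity. As a hedged illustration only (the paper does not publish its interface code), the roscpp sketch below shows one plausible way detected-change points could be published as a marker topic that such a bridge could relay to the headset; the topic name, frame, marker styling, and coordinates are placeholder assumptions.

```cpp
// Hypothetical relay node: publishes detected-change points as a sphere-list
// marker on a ROS topic that rosbridge_suite can expose over a websocket and
// a ros-sharp client on the AR headset could subscribe to. Topic, frame, and
// example coordinates are placeholders, not values from the paper.
#include <ros/ros.h>
#include <geometry_msgs/Point.h>
#include <visualization_msgs/Marker.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "change_marker_publisher");
  ros::NodeHandle nh;
  // Latched so a client connecting later still receives the last set of changes.
  ros::Publisher pub =
      nh.advertise<visualization_msgs::Marker>("/change_detection/markers", 1, true);

  visualization_msgs::Marker marker;
  marker.header.frame_id = "map";           // shared world frame (assumed)
  marker.header.stamp = ros::Time::now();
  marker.ns = "detected_changes";
  marker.id = 0;
  marker.type = visualization_msgs::Marker::SPHERE_LIST;
  marker.action = visualization_msgs::Marker::ADD;
  marker.scale.x = marker.scale.y = marker.scale.z = 0.2;  // sphere diameter (m)
  marker.color.r = 1.0;                     // opaque red
  marker.color.a = 1.0;

  // Example detections; in practice these would come from the change detector.
  geometry_msgs::Point p;
  p.x = 1.0; p.y = 2.0; p.z = 0.5;
  marker.points.push_back(p);
  p.x = 3.2; p.y = -1.1; p.z = 0.0;
  marker.points.push_back(p);

  pub.publish(marker);
  ros::spin();
  return 0;
}
```

With the rosbridge websocket server running, a ros-sharp subscriber in the headset application could receive messages on this topic and render the spheres anchored in the shared world frame.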


Author information


Corresponding author

Correspondence to Christopher Reardon.


Copyright information

© 2020 This is a U.S. government work and not under copyright protection in the U.S.; foreign copyright protection may apply

About this paper


Cite this paper

Reardon, C., Gregory, J., Nieto-Granda, C., Rogers, J.G. (2020). Enabling Situational Awareness via Augmented Reality of Autonomous Robot-Based Environmental Change Detection. In: Chen, J.Y.C., Fragomeni, G. (eds.) Virtual, Augmented and Mixed Reality. Design and Interaction. HCII 2020. Lecture Notes in Computer Science, vol. 12190. Springer, Cham. https://doi.org/10.1007/978-3-030-49695-1_41


  • DOI: https://doi.org/10.1007/978-3-030-49695-1_41

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-49694-4

  • Online ISBN: 978-3-030-49695-1
