Vision-Based Cooperative Localization for Small Networked Robot Teams

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8069)

Abstract

We describe the development of a vision-based cooperative localization and tracking framework for a team of small autonomous ground robots operating in an indoor environment. The objective is to enable a team of small mobile ground robots with limited sensing to explore, monitor, and search for objects of interest in regions of the workspace that may be inaccessible to larger mobile ground robots. We assume each robot is equipped with an LED-based identifier/marker, a color camera, wheel encoders, and wireless communication capabilities. Cooperative localization and tracking are achieved using the on-board cameras and local inter-agent communication. We describe our approach and present experimental and simulation results in which a team of small ground vehicles cooperatively tracks other ground robots as they move around the workspace.
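Although the paper text here gives only a high-level description, the sensing setup it outlines (wheel-odometry propagation corrected by camera bearings to teammates' LED markers, with teammate pose estimates shared over the wireless link) can be sketched as a small extended Kalman filter. The sketch below is a minimal illustration under those assumptions, not the authors' implementation; the class name, unicycle motion model, and noise parameters are hypothetical.

```python
import numpy as np


def wrap_angle(a):
    """Wrap an angle to [-pi, pi)."""
    return (a + np.pi) % (2 * np.pi) - np.pi


class BearingOnlyEKF:
    """Minimal EKF for one robot's planar pose (x, y, theta).

    Prediction uses wheel-odometry velocities (v, w); correction uses a
    camera bearing to a teammate's LED marker, with the teammate's own
    position estimate received over the wireless link.
    """

    def __init__(self, x0, P0, Q, r_bearing):
        self.x = np.asarray(x0, float)      # state [x, y, theta]
        self.P = np.asarray(P0, float)      # 3x3 state covariance
        self.Q = np.asarray(Q, float)       # 3x3 process noise
        self.R = np.array([[r_bearing]])    # bearing measurement variance

    def predict(self, v, w, dt):
        """Propagate the pose with a unicycle model driven by encoders."""
        x, y, th = self.x
        self.x = np.array([x + v * dt * np.cos(th),
                           y + v * dt * np.sin(th),
                           wrap_angle(th + w * dt)])
        # Jacobian of the motion model with respect to the state.
        F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                      [0.0, 1.0,  v * dt * np.cos(th)],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update_bearing(self, z, teammate_xy):
        """Correct with a bearing z (rad, in this robot's body frame)
        to a teammate at the communicated position teammate_xy."""
        x, y, th = self.x
        dx, dy = teammate_xy[0] - x, teammate_xy[1] - y
        q = dx * dx + dy * dy
        z_hat = wrap_angle(np.arctan2(dy, dx) - th)
        # Jacobian of the bearing measurement with respect to the state.
        H = np.array([[dy / q, -dx / q, -1.0]])
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        innov = wrap_angle(z - z_hat)
        self.x = self.x + (K * innov).ravel()
        self.x[2] = wrap_angle(self.x[2])
        self.P = (np.eye(3) - K @ H) @ self.P
```

In a cooperative scheme of the kind described above, each robot would run such a filter locally, use its camera to measure bearings to the LED markers of visible teammates, and broadcast its updated pose estimate to the team over the wireless link.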

Keywords

Networked robots · Localization · Applications development


Copyright information

© Springer-Verlag Berlin Heidelberg 2014

Authors and Affiliations

  1. Scalable Autonomous Systems Lab, Drexel University, Philadelphia, USA
  2. VeRLab, Computer Science Department, Federal University of Minas Gerais, Belo Horizonte, Brazil