Journal of Intelligent & Robotic Systems, Volume 76, Issue 3–4, pp 539–562

A Practical Multirobot Localization System

  • Tomáš Krajník
  • Matías Nitsche
  • Jan Faigl
  • Petr Vaněk
  • Martin Saska
  • Libor Přeučil
  • Tom Duckett
  • Marta Mejail

Abstract

We present a fast and precise vision-based software system intended for multiple robot localization. The core component of the software is a novel and efficient algorithm for black-and-white pattern detection. The method is robust to variable lighting conditions, achieves sub-pixel precision, and has a computational complexity that is independent of the processed image size. With off-the-shelf computational equipment and low-cost cameras, the core algorithm is able to process hundreds of images per second while tracking hundreds of objects with millimeter precision. In addition, we present the method's mathematical model, which allows one to estimate the expected localization precision, area of coverage, and processing speed from the camera's intrinsic parameters and the hardware's processing capacity. The correctness of the presented model and the performance of the algorithm in real-world conditions are verified in several experiments. Apart from the method description, we also make its source code public at http://purl.org/robotics/whycon, so that it can be used as an enabling technology for various mobile robotic problems.
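The abstract states that expected localization precision and coverage can be estimated from the camera's intrinsic parameters alone. A minimal sketch of that idea, using the standard pinhole model: one image pixel subtends roughly distance/focal_length metres at the target plane, and sub-pixel centre estimation scales the error below one pixel. The function names, the 0.33-pixel sub-pixel factor, and the downward-facing camera geometry are illustrative assumptions, not the paper's actual model.

```python
def expected_precision(focal_px: float, distance_m: float,
                       subpix: float = 0.33) -> float:
    """Rough planar localization error (m) of a marker at distance_m.

    One pixel covers distance_m / focal_px metres at the target plane;
    the assumed sub-pixel refinement factor scales this down.
    """
    return subpix * distance_m / focal_px


def coverage_area(focal_px: float, width_px: int, height_px: int,
                  distance_m: float) -> float:
    """Approximate ground-plane area (m^2) seen by a pinhole camera
    pointing straight down from height distance_m."""
    w = width_px * distance_m / focal_px
    h = height_px * distance_m / focal_px
    return w * h


# Example: a 752x480 camera with a ~750 px focal length, 3 m above the arena
print(expected_precision(750.0, 3.0))          # ~1.3 mm expected error
print(coverage_area(750.0, 752, 480, 3.0))     # ~5.8 m^2 covered
```

This back-of-the-envelope form also makes the trade-off explicit: moving the camera higher grows the covered area quadratically with distance but degrades per-detection precision linearly.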

Keywords

Localization · Mobile robotics · Computer vision · Swarm robotics



Copyright information

© Springer Science+Business Media Dordrecht 2014

Authors and Affiliations

  • Tomáš Krajník (1, 2)
  • Matías Nitsche (3)
  • Jan Faigl (2)
  • Petr Vaněk (2)
  • Martin Saska (2)
  • Libor Přeučil (2)
  • Tom Duckett (1)
  • Marta Mejail (3)

  1. Lincoln Centre for Autonomous Systems, School of Computer Science, University of Lincoln, Lincoln, UK
  2. Faculty of Electrical Engineering, Czech Technical University in Prague, Prague, Czech Republic
  3. Laboratory of Robotics and Embedded Systems, Faculty of Exact and Natural Sciences, University of Buenos Aires, Buenos Aires, Argentina
