TAPAS: A Robotic Platform for Autonomous Navigation in Outdoor Environments

  • Adam Bondyra
  • Michał Nowicki
  • Jan Wietrzykowski
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 351)

Abstract

Robotics research is increasingly concerned with autonomous and robust operation outdoors, which enables a variety of practical applications. Therefore, we present TAPAS, a robotic platform designed for autonomous navigation in man-made outdoor environments, such as parks, and capable of transporting a 5 kg payload. The article presents the hardware design and the sensory system that allowed us to create a fully autonomous vehicle that stands out for its low cost, light weight, and long battery life. The presented solution was thoroughly evaluated at the international robotic competition Robotour 2014, where TAPAS tied for 4th place out of 13 robots. Taking part in the competition provided feedback that is discussed in the article and will guide further development.

Keywords

mobile robot, system design, sensors, autonomous navigation, outdoors

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Adam Bondyra (1)
  • Michał Nowicki (1)
  • Jan Wietrzykowski (1)

  1. Institute of Control and Information Engineering, Poznań University of Technology, Poznań, Poland