The AgriRover: A Reinvented Mechatronic Platform from Space Robotics for Precision Farming

  • Xiu-Tian Yan (corresponding author)
  • Alessandro Bianco
  • Cong Niu
  • Roberto Palazzetti
  • Gwenole Henry
  • Youhua Li
  • Wayne Tubby
  • Aron Kisdi
  • Rain Irshad
  • Stephen Sanders
  • Robin Scott

Abstract

This paper introduces a novel multi-functional mobile platform for agricultural applications, developed in the AgriRover project by reinventing a mechatronic design: space robotic technologies are spun off into terrestrial use. The AgriRover prototype is the first of its kind to exploit and apply space robotic technologies in precision farming. To optimize the energy consumption of the mobile platform, a new dynamic total cost of transport algorithm is proposed and validated. An autonomous navigation system has been developed to enable the AgriRover to operate safely in unstructured farming environments. An object recognition algorithm specific to agriculture has been investigated and implemented. A novel soil sample collection mechanism has been designed and prototyped for on-board, in situ soil quality measurement. The design of the whole system has benefited from a mechatronic design process known as the Tiv model, through which a planetary exploration rover is reinvented as the AgriRover for agricultural applications. The AgriRover system has undergone three sets of field trials in the UK, and some of the results are reported.
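For readers unfamiliar with the metric: the energy efficiency of mobile robots is commonly assessed with the dimensionless cost of transport, the energy spent per unit weight per unit distance travelled. The short Python sketch below illustrates only this conventional definition (the function names and example figures are hypothetical); the dynamic total cost of transport algorithm proposed for the AgriRover builds on this idea and is described in the chapter itself, not reproduced here.

    # Conventional (dimensionless) cost of transport: CoT = E / (m * g * d),
    # or, instantaneously, P / (m * g * v). Illustrative sketch only; the
    # AgriRover's dynamic total cost of transport algorithm is not shown here.
    G = 9.81  # gravitational acceleration, m/s^2

    def cost_of_transport(energy_joules: float, mass_kg: float, distance_m: float) -> float:
        """Energy used per unit weight per unit distance travelled."""
        return energy_joules / (mass_kg * G * distance_m)

    def instantaneous_cot(power_watts: float, mass_kg: float, speed_mps: float) -> float:
        """Instantaneous form using electrical power draw and ground speed."""
        return power_watts / (mass_kg * G * speed_mps)

    # Hypothetical example: a 40 kg rover drawing 120 W while moving at 0.5 m/s
    print(instantaneous_cot(120.0, 40.0, 0.5))  # ~0.61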

Acknowledgements

The AgriRover project is funded by the UK Space Agency under its International Partnerships in Space Programme, and the authors would like to thank the Agency for its financial support. The authors would also like to thank the owner of Rushyhill Farm for allowing its use for the AgriRover field trials. Part of this work has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 821996 for MOSAR.


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Xiu-Tian Yan (1) (corresponding author)
  • Alessandro Bianco (1)
  • Cong Niu (1)
  • Roberto Palazzetti (1)
  • Gwenole Henry (1)
  • Youhua Li (1)
  • Wayne Tubby (2)
  • Aron Kisdi (2)
  • Rain Irshad (2)
  • Stephen Sanders (3)
  • Robin Scott (3)

  1. University of Strathclyde, Glasgow, UK
  2. RAL Space, Didcot, UK
  3. Veolia Nuclear Solutions, Abingdon, UK