Precision Agriculture

Volume 18, Issue 3, pp 350–365

Platform for evaluating sensors and human detection in autonomous mowing operations

  • P. Christiansen
  • M. Kragh
  • K. A. Steen
  • H. Karstoft
  • R. N. Jørgensen


The concept of autonomous farming concerns agricultural machines operating safely and efficiently without human intervention. To ensure safe autonomous operation, risks must be detected and avoided in real time. This paper presents a flexible vehicle-mounted sensor system for recording positional and imaging data with a total of six sensors, together with a full procedure for calibrating and registering all sensors. Authentic data were recorded for a case study on grass harvesting and human safety. The paper incorporates the parts of ISO 18497 (an emerging standard for safety of highly automated machinery in agriculture) related to human detection and safety. The case study investigates four different sensing technologies and provides a dataset for validating human safety or a human detection system in grass harvesting. The study presents common algorithms that are able to detect humans but struggle to handle lying or occluded humans in high grass.
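As a rough illustration of the simplest class of detection approach mentioned in the abstract and in the cited thermal-camera work (e.g., Steen et al. 2012), a warm object can be segmented from a cooler background by thresholding a thermal image and grouping connected pixels. This is a hedged sketch, not the authors' actual pipeline, which fuses multiple sensor modalities; the temperatures and image are synthetic.

```python
import numpy as np

def detect_warm_regions(thermal, threshold):
    """Return bounding boxes (row0, col0, row1, col1) of connected
    regions warmer than `threshold` in a thermal image (degrees C).
    Illustrative only: a real safety system would fuse several sensors."""
    mask = thermal > threshold
    visited = np.zeros_like(mask, dtype=bool)
    boxes = []
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not visited[r, c]:
                # Flood fill (4-connectivity) to collect one warm region
                stack = [(r, c)]
                visited[r, c] = True
                rs, cs = [], []
                while stack:
                    y, x = stack.pop()
                    rs.append(y)
                    cs.append(x)
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                boxes.append((min(rs), min(cs), max(rs), max(cs)))
    return boxes

# Synthetic 20 C background with one 35 C warm patch
img = np.full((60, 80), 20.0)
img[10:30, 40:50] = 35.0
print(detect_warm_regions(img, 30.0))  # -> [(10, 40, 29, 49)]
```

A fixed threshold works only when the background is reliably cooler than a human; the case study's difficulty with occluded or lying humans in high grass is exactly where such single-sensor heuristics break down.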


Safe farming · Sensor platform · Object detection · Computer vision · ISO 18497 · Autonomous farming



This research is sponsored by Innovation Fund Denmark as part of the projects "SAFE - Safer Autonomous Farming Equipment" (Project No. 16-2014-0) and "Multi-sensor system for ensuring ethical and efficient crop production" (Project No. 155-2013-6).


  1. Bahnsen, C. (2013). Thermal-visible-depth image registration. Unpublished Master's thesis, Aalborg University, Aalborg, Denmark.
  2. Christiansen, P., Kragh, M., Steen, K. A., Karstoft, H., & Jørgensen, R. N. (2015). Advanced sensor platform for human detection and protection in autonomous farming. Precision Agriculture, 15, 291–298.
  3. Christiansen, P., Steen, K. A., Jørgensen, R. N., & Karstoft, H. (2014). Automated detection and recognition of wildlife using thermal cameras. Sensors, 14(8), 13778–13793.
  4. CLAAS Steering Systems. (2011). Tracking control optimisation. Retrieved 26 September 2016 from
  5. Dollar, P., Belongie, S., & Perona, P. (2010). The fastest pedestrian detector in the west. In F. Labrosse, R. Zwiggelaar, Y. Liu & B. Tiddeman (Eds.), Proceedings of the British machine vision conference 2010 (pp. 68.1–68.11). Durham, UK: BMVA Press.
  6. Fischler, M. A., & Bolles, R. C. (1981). Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM, 24(6), 381–395.
  7. Freitas, G., Hamner, B., Bergerman, M., & Singh, S. (2012). A practical obstacle detection system for autonomous orchard vehicles. In 2012 IEEE/RSJ international conference on intelligent robots and systems (pp. 3391–3398).
  8. ISO/DIS 18497:2015. Agricultural and forestry tractors and self-propelled machinery: Safety of highly automated machinery. Retrieved 26 September 2016 from
  9. Johnson, M. J., & Bajcsy, P. (2008). Integration of thermal and visible imagery for robust foreground detection in tele-immersive spaces. In P. Solbrig (Ed.), Proceedings of the 11th international conference on information fusion (pp. 1265–1272). Piscataway, USA: IEEE.
  10. Krotosky, S. J., & Trivedi, M. M. (2007). Mutual information based registration of multimodal stereo videos for person tracking. Computer Vision and Image Understanding, 106(2–3), 270–287.
  11. McLachlan, G. J., & Basford, K. E. (1988). Mixture models: Inference and applications to clustering. In Statistics: textbooks and monographs. New York, USA: Dekker.
  12. Paden, B., Cáp, M., Yong, Z. S., Yershov, D., & Frazzoli, E. (2016). A survey of motion planning and control techniques for self-driving urban vehicles. IEEE Transactions on Intelligent Vehicles, 1(1), 33–55. arXiv:cs.CV/1604.07446v1.
  13. Pilarski, T., Happold, M., Pangels, H., Ollis, M., Fitzpatrick, K., & Stentz, A. (2002). The Demeter System for automated harvesting. Autonomous Robots, 13, 9–20.
  14. Rasshofer, R. H., & Gresser, K. (2005). Automotive radar and lidar systems for next generation driver assistance functions. Advances in Radio Science, 3, 205–209.
  15. Reina, G., & Milella, A. (2012). Towards autonomous agriculture: Automatic ground detection using trinocular stereovision. Sensors, 12(12), 12405–12423.
  16. Rouveure, R., Nielsen, M., & Petersen, A. (2012). The QUAD-AV Project: Multi-sensory approach for obstacle detection in agricultural autonomous robotics. In International conference of agricultural engineering. Valencia, Spain: EurAgEng.
  17. Serrano-Cuerda, J., Fernández-Caballero, A., & López, M. (2014). Selection of a visible-light vs. thermal infrared sensor in dynamic environments based on confidence measures. Applied Sciences, 4(3), 331–350.
  18. Steen, K. A., Villa-Henriksen, A., Therkildsen, O. R., & Green, O. (2012). Automatic detection of animals in mowing operations using thermal cameras. Sensors, 12(6), 7587–7597.
  19. The MathWorks, Inc. (2015). MATLAB and computer vision system toolbox. Natick, MA, USA: The MathWorks, Inc.
  20. Wei, J., Rovira-Mas, F., Reid, J. F., & Han, S. (2005). Obstacle detection using stereo vision to enhance safety of autonomous machines. Transactions of the ASAE, 48(6), 2389–2397. doi:10.13031/2013.20078.
  21. Yang, L., & Noguchi, N. (2012). Human detection for a robot tractor using omni-directional stereo vision. Computers and Electronics in Agriculture, 89, 116–125.
  22. Zhang, Z. (1994). Iterative point matching for registration of free-form curves and surfaces. International Journal of Computer Vision, 13(2), 119–152.
  23. Zhang, Z. (2000). A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(11), 1330–1334.
  24. Zhao, J., & Cheung, S. S. (2014). Human segmentation by geometrically fusing visible-light and thermal imageries. Multimedia Tools and Applications, 76(1), 7361–7389.

Copyright information

© Springer Science+Business Media New York 2017

Authors and Affiliations

  1. Department of Engineering - Signal Processing, Faculty of Science and Technology, Aarhus University, Aarhus N, Denmark