Precision Agriculture, Volume 19, Issue 4, pp 630–647

Mobile low-cost 3D camera maize crop height measurements under field conditions

  • Martin Hämmerle
  • Bernhard Höfle


Abstract

To tackle global challenges such as food supply and renewable energy provision, improving efficiency and productivity in agriculture is of high importance. Site-specific information about crop height plays an important role in reaching these goals. Crop height can be derived with a variety of approaches, including the analysis of three-dimensional (3D) geodata. In this study, crop height values derived from 3D geodata of maize (1.88 m and 2.35 m average height) captured with a low-cost 3D camera were examined. Data were collected with a unique measurement setup in which the 3D camera was mounted on a mobile platform, and acquisition took place under field conditions including wind and sunlight. Furthermore, the data were located in a global coordinate system with a straightforward approach, which can strongly reduce computational effort and can subsequently support near real-time data processing in the field. Comparing crop height values derived from 3D geodata captured with the low-cost approach against high-end terrestrial laser scanning reference data yielded minimum RMS and standard deviation values of 0.13 m (6.91% of average crop height) and maximum R² values of 0.79. It can be concluded that crop height measurements derived from data captured with the introduced setup can provide valuable input for tasks such as biomass estimation. Overall, the setup is considered a valuable extension for agricultural machines, providing complementary crop height measurements for various agricultural applications.
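The comparison described above rests on two steps: deriving a crop height per location from 3D point data, and scoring the low-cost estimates against terrestrial laser scanning (TLS) reference values with RMSE and R². The following sketch illustrates those two steps in a minimal form; it is not the paper's exact pipeline, and the grid-cell percentile approach, function names and parameters are assumptions for illustration only.

```python
# Illustrative sketch (NOT the authors' exact pipeline): derive a crop
# height per grid cell as an upper percentile of point heights above the
# local ground level, then compare two sets of cell heights (e.g. low-cost
# 3D camera vs. TLS reference) using RMSE and R^2.
import numpy as np

def cell_crop_height(z_values, ground_z, upper_percentile=95.0):
    """Crop height of one grid cell: an upper percentile of the point
    heights minus the local ground elevation (percentile chosen here
    only as an assumed, robust alternative to the single maximum)."""
    return float(np.percentile(z_values, upper_percentile) - ground_z)

def rmse(reference, estimate):
    """Root mean square error between two height series."""
    reference = np.asarray(reference, dtype=float)
    estimate = np.asarray(estimate, dtype=float)
    return float(np.sqrt(np.mean((reference - estimate) ** 2)))

def r_squared(reference, estimate):
    """Coefficient of determination of the estimates w.r.t. the reference."""
    reference = np.asarray(reference, dtype=float)
    estimate = np.asarray(estimate, dtype=float)
    ss_res = np.sum((reference - estimate) ** 2)
    ss_tot = np.sum((reference - reference.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    # Simulated per-cell heights around a 1.88 m canopy (flat ground assumed):
    tls = 1.88 + rng.normal(0.0, 0.10, size=50)        # TLS reference heights
    cam = tls + rng.normal(0.0, 0.13, size=50)         # low-cost camera heights
    print(f"RMSE: {rmse(tls, cam):.3f} m")
    print(f"R^2:  {r_squared(tls, cam):.3f}")
```

A real pipeline would additionally need ground filtering and the georeferencing step mentioned in the abstract; here the ground elevation is simply passed in as a known value.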


Keywords

Maize crop height · Mobile field measurements · Low-cost 3D camera · Terrestrial laser scanning



Acknowledgements

We want to thank Sabrina Marx and Katharina Anders for their support in the field campaigns. Furthermore, many thanks to Markus Wolf and Steffen Linnenbach for granting access to the maize field.


Funding

This study was performed within the research project ‘4D Near Real-Time Environmental Monitoring (4DEMON)’ funded by the Federal Ministry of Science, Research and Arts (MWK), Baden-Wuerttemberg, Germany.

Authors’ contributions

MH and BH designed the experiments. MH performed the experiments and drafted the manuscript with help and contributions from BH. The analyses, figures and tables were mainly produced by MH. Both authors read and approved the final manuscript.

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflict of interest.



Copyright information

© Springer Science+Business Media, LLC 2017

Authors and Affiliations

  1. GIScience Research Group, Institute of Geography, Heidelberg University, Heidelberg, Germany
  2. Heidelberg Center for the Environment (HCE), Heidelberg University, Heidelberg, Germany
