Abstract
Hybrid rice row detection at the pollination stage is critical for automating in-field pollination vehicles. The parental lines of hybrid rice are planted in alternating rows in seed production fields with narrow inter-row spacing. As the pollination vehicle advances, it needs not only the centerline of each crop row but also the crop region boundaries, both to guide the vehicle and to prevent it from running over the crop. For complete crop row detection, a novel machine vision-based method was presented that identifies the individual region of each crop row, rather than only its centerline, by line-shaped mask scanning combined with the vanishing point of the crop rows. The approach consists of grayscale transformation, vanishing point detection, crop region identification, boundary position fine-tuning and crop region segmentation. Its region detection performance outperformed convolutional neural network-based (CNN-based) methods, with an intersection over union (IoU) of 0.832, an accuracy of 90.48%, a recall of 86.36%, a precision of 98.96% and an F1-score of 92.23%. Its centerline extraction ability was compared with Hough Transform-based and SegNet-based methods in terms of the average lateral distance (ALD) between the ground-truth line and the detected line. The proposed method achieved an ALD of 1.943 pixels in a 640 × 360 image, outperforming both the Hough Transform-based (5.704 pixels) and the SegNet-based (3.555 pixels) methods.
References
Badrinarayanan, V., Kendall, A., & Cipolla, R. (2017). SegNet: A deep convolutional encoder-decoder architecture for image segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 39(12), 2481–2495. https://doi.org/10.1109/tpami.2016.2644615
Bonadies, S., & Gadsden, S. A. (2018). An overview of autonomous crop row navigation strategies for unmanned ground vehicles. Engineering in Agriculture, Environment and Food, 12, 24–31. https://doi.org/10.1016/j.eaef.2018.09.001
Chen, J., Qiang, H., Wu, J., Xu, G., & Wang, Z. (2021a). Navigation path extraction for greenhouse cucumber-picking robots using the prediction-point Hough transform. Computers and Electronics in Agriculture, 180, 105911. https://doi.org/10.1016/j.compag.2020.105911
Chen, J., Zhang, D., Zeb, A., & Nanehkaran, Y. A. (2021b). Identification of rice plant diseases using lightweight attention networks. Expert Systems with Applications, 169, 114514. https://doi.org/10.1016/j.eswa.2020.114514
Chen, L. C., Zhu, Y., Papandreou, G., Schroff, F., & Adam, H. (2018). Encoder-decoder with atrous separable convolution for semantic image segmentation. In Proceedings of the European Conference on Computer Vision (ECCV) (pp. 801–818). https://doi.org/10.1007/978-3-030-01234-2_49
Fischler, M. A., & Bolles, R. C. (1981). Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM, 24(6), 381–395. https://doi.org/10.1145/358669.358692
Gerrish, J. B., Fehr, B. W., Van Ee, G. R., & Welch, D. P. (1997). Self-steering tractor guided by computer vision. Applied Engineering in Agriculture, 13(5), 559–563. https://doi.org/10.13031/2013.21641
Han, S., Zhang, Q., Ni, B., & Reid, J. F. (2004). A guidance directrix approach to vision-based vehicle guidance systems. Computers and Electronics in Agriculture, 43(3), 179–195. https://doi.org/10.1016/j.compag.2004.01.007
Howard, A. G., Zhu, M., Chen, B., Kalenichenko, D., Wang, W., Weyand, T. et al. (2017). Mobilenets: Efficient convolutional neural networks for mobile vision applications. arXiv preprint arXiv:1704.04861.
Jiang, G., Wang, X., Wang, Z., & Liu, H. (2016). Wheat rows detection at the early growth stage based on Hough transform and vanishing point. Computers and Electronics in Agriculture, 123, 211–223. https://doi.org/10.1016/j.compag.2016.02.002
Jiang, Q., Wang, Y., Chen, J., Wang, J., Wei, Z., & He, Z. (2021). Optimizing the working performance of a pollination machine for hybrid rice. Computers and Electronics in Agriculture, 187, 106282. https://doi.org/10.1016/j.compag.2021.106282
Kakani, V., Nguyen, V. H., Kumar, B. P., Kim, H., & Pasupuleti, V. R. (2020). A critical review on computer vision and artificial intelligence in food industry. Journal of Agriculture and Food Research, 2, 100033. https://doi.org/10.1016/j.jafr.2020.100033
Kim, W.-S., Lee, D.-H., Kim, Y.-J., Kim, T., Hwang, R.-Y., & Lee, H.-J. (2020). Path detection for autonomous traveling in orchards using patch-based CNN. Computers and Electronics in Agriculture, 175, 105620. https://doi.org/10.1016/j.compag.2020.105620
Li, J., Lan, Y., Wang, J., Chen, S., Huang, C., Liu, Q., & Liang, Q. (2017). Distribution law of rice pollen in the wind field of small UAV. International Journal of Agricultural and Biological Engineering, 10(4), 32–40. https://doi.org/10.25165/j.ijabe.20171004.3103
Luo, J. W., Ying, K., & Bai, J. (2005). Savitzky-Golay smoothing and differentiation filter for even number data. Signal Processing, 85(7), 1429–1434. https://doi.org/10.1016/j.sigpro.2005.02.002
Ma, Z., Tao, Z., Du, X., Yu, Y., & Wu, C. (2021). Automatic detection of crop root rows in paddy fields based on straight-line clustering algorithm and supervised learning method. Biosystems Engineering, 211, 63–76. https://doi.org/10.1016/j.biosystemseng.2021.08.030
Meyer, G. E., Neto, J. C., Jones, D. D., & Hindman, T. W. (2004). Intensified fuzzy clusters for classifying plant, soil, and residue regions of interest from color images. Computers and Electronics in Agriculture, 42(3), 161–180. https://doi.org/10.1016/j.compag.2003.08.002
Mousazadeh, H. (2013). A technical review on navigation systems of agricultural autonomous off-road vehicles. Journal of Terramechanics, 50(3), 211–232. https://doi.org/10.1016/j.jterra.2013.03.004
Otsu, N. (1979). A threshold selection method from gray-level histograms. IEEE Transactions on Systems, Man, and Cybernetics, 9(1), 62–66. https://doi.org/10.1109/tsmc.1979.4310076
Pla, F., Sanchiz, J. M., Marchant, J. A., & Brivot, R. (1997). Building perspective models to guide a row crop navigation vehicle. Image and Vision Computing, 15(6), 465–473. https://doi.org/10.1016/s0262-8856(96)01147-x
Ponnambalam, V. R., Bakken, M., Moore, R. J. D., Glenn Omholt Gjevestad, J., & Johan From, P. (2020). Autonomous crop row guidance using adaptive multi-ROI in strawberry fields. Sensors, 20(18), 5249. https://doi.org/10.3390/s20185249
Ronneberger, O., Fischer, P., & Brox, T. (2015). U-net: Convolutional networks for biomedical image segmentation. In International Conference on Medical Image Computing and Computer-assisted Intervention (pp. 234–241). https://doi.org/10.1007/978-3-319-24574-4_28
Vidović, I., Cupec, R., & Hocenski, Z. (2016). Crop row detection by global energy minimization. Pattern Recognition, 55, 68–86. https://doi.org/10.1016/j.patcog.2016.01.013
Wang, A. C., Xu, Y. F., Wei, X. H., & Cui, B. B. (2020). Semantic segmentation of crop and weed using an encoder-decoder network and image enhancement method under uncontrolled outdoor illumination. IEEE Access, 8, 81724–81734. https://doi.org/10.1109/access.2020.2991354
Yu, Y., Bao, Y., Wang, J., Chu, H., Zhao, N., He, Y., et al. (2021). Crop row segmentation and detection in paddy fields based on treble-classification Otsu and double-dimensional clustering method. Remote Sensing, 13(5), 901. https://doi.org/10.3390/rs13050901
Zhang, Q., Chen, M. E. S., & Li, B. (2017). A visual navigation algorithm for paddy field weeding robot based on image understanding. Computers and Electronics in Agriculture, 143, 66–78. https://doi.org/10.1016/j.compag.2017.09.008
Zhang, X., Li, X., Zhang, B., Zhou, J., Tian, G., Xiong, Y., et al. (2018). Automated robust crop-row detection in maize fields based on position clustering algorithm and shortest path method. Computers and Electronics in Agriculture, 154, 165–175. https://doi.org/10.1016/j.compag.2018.09.014
Zhao, H., Shi, J., Qi, X., Wang, X., & Jia, J. (2017). Pyramid scene parsing network. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 2881–2890). https://doi.org/10.1109/cvpr.2017.660
Funding
The authors gratefully acknowledge project funding provided by the Zhejiang Key Research and Development Project of China (Grant No. 2022C02005).
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
About this article
Cite this article
Li, D., Dong, C., Li, B. et al. Hybrid rice row detection at the pollination stage based on vanishing point and line-scanning method. Precision Agric 24, 921–947 (2023). https://doi.org/10.1007/s11119-022-09980-6