
Multi-species weed density assessment based on semantic segmentation neural network

Published in Precision Agriculture

Abstract

The precise use of specific types of herbicides according to the weed species and density in a field can effectively reduce chemical contamination. A weed species and density evaluation method based on an image semantic segmentation neural network was proposed in this paper. The network was trained with a combination of pre-training and fine-tuning. The pre-training data were images containing only one weed species per image; these weeds were labeled automatically by an image segmentation method based on the Excess Green (ExG) index and the minimum error threshold. The fine-tuning dataset consisted of real field images containing multiple weeds and crops, which were labeled manually. Because of limited computational resources, large images were difficult to segment in a single pass, so a method of cutting images into sub-images was proposed and the relationship between sub-image size and segmentation accuracy was studied. The results showed that the training method reduced the workload of labeling training data while effectively avoiding overfitting. Segmentation accuracy decreased as the sub-image size decreased; considering the computational constraints, a sub-image size of \(256 \times 256\) pixels was selected. The semantic segmentation network achieved 97% overall accuracy. The coefficient of determination (\(R^2\)) between the weed density calculated by the algorithm and that assessed manually was 0.90, and the root mean square error (\(\sigma \)) was 0.05. The method can effectively assess the density of each weed species in complex environments and provides a reference for precise spraying of herbicides based on weed density and species.
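The abstract outlines two reusable steps of the pipeline: automatic labeling of single-species images via the Excess Green (ExG) index with a minimum-error threshold, and tiling of large field images into \(256 \times 256\) sub-images. The Python/NumPy sketch below illustrates one plausible form of these steps; the function names, the Kittler–Illingworth formulation of the minimum-error threshold, and the non-overlapping tiling are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): ExG-based vegetation masking with a
# Kittler-Illingworth minimum-error threshold, plus 256x256 sub-image tiling.
import numpy as np

def excess_green(rgb: np.ndarray) -> np.ndarray:
    """ExG = 2g - r - b on channel-normalized RGB (rgb: HxWx3, uint8 or float)."""
    rgb = rgb.astype(np.float64)
    s = rgb.sum(axis=2) + 1e-8                      # avoid division by zero on dark pixels
    r, g, b = rgb[..., 0] / s, rgb[..., 1] / s, rgb[..., 2] / s
    return 2.0 * g - r - b

def min_error_threshold(values: np.ndarray, bins: int = 256) -> float:
    """Kittler-Illingworth minimum-error threshold on a 1-D histogram of ExG values."""
    hist, edges = np.histogram(values.ravel(), bins=bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    best_t, best_j = centers[bins // 2], np.inf
    for t in range(1, bins - 1):
        p1, p2 = p[:t].sum(), p[t:].sum()
        if p1 < 1e-6 or p2 < 1e-6:
            continue
        m1 = (p[:t] * centers[:t]).sum() / p1
        m2 = (p[t:] * centers[t:]).sum() / p2
        v1 = (p[:t] * (centers[:t] - m1) ** 2).sum() / p1 + 1e-12
        v2 = (p[t:] * (centers[t:] - m2) ** 2).sum() / p2 + 1e-12
        # Criterion J(t); the threshold minimizing J separates background and vegetation.
        j = 1 + 2 * (p1 * np.log(np.sqrt(v1)) + p2 * np.log(np.sqrt(v2))) \
              - 2 * (p1 * np.log(p1) + p2 * np.log(p2))
        if j < best_j:
            best_j, best_t = j, centers[t]
    return best_t

def tile(image: np.ndarray, size: int = 256):
    """Yield non-overlapping size x size sub-images (incomplete border tiles are dropped)."""
    h, w = image.shape[:2]
    for y in range(0, h - size + 1, size):
        for x in range(0, w - size + 1, size):
            yield image[y:y + size, x:x + size]

# Usage (hypothetical): exg = excess_green(img); mask = exg > min_error_threshold(exg)
# for sub in tile(img, 256): ...  # feed sub-images to the segmentation network
```

Such a vegetation mask would only separate plants from soil; distinguishing weed species from crops is the task of the segmentation network itself, as described in the abstract.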



Acknowledgements

This research was funded by the National Key Research and Development Project (2019YFB1312303).

Author information


Corresponding author

Correspondence to Chunlong Zhang.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Zou, K., Wang, H., Yuan, T. et al. Multi-species weed density assessment based on semantic segmentation neural network. Precision Agric 24, 458–481 (2023). https://doi.org/10.1007/s11119-022-09953-9

