
Unmanned aerial vehicle images in the machine learning for agave detection

  • Ecosystems for Future Generations
  • Published in Environmental Science and Pollution Research

Abstract

In this study, six supervised classification algorithms, based on cluster analysis, distance metrics, deep learning, and object-based image analysis, were compared. Our objective was to determine which of these algorithms has the highest overall accuracy in detecting agave and automatically estimating agave cover in a given area, to help growers manage their plantations. An orthomosaic with a spatial resolution of 2.5 cm was derived from 300 images obtained with a DJI Inspire 1 unmanned aerial system. Two training classes were defined: (1) “presence,” sites where agaves were identified, and (2) “absence,” sites with other plants but no agaves. The object-oriented algorithm had the highest overall accuracy (0.963), followed by the support-vector machine (0.928) and the neural network (0.914). The algorithms based on statistical classification criteria were the least accurate: Mahalanobis distance (0.752) and minimum distance (0.421). We recommend the object-oriented algorithm because, in addition to having the highest overall accuracy in the image segmentation process, it yields parameters useful for estimating cover area, size, and shape, which can aid in better selection of individual agaves for harvest.
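
To make the comparison above concrete, here is a minimal sketch, in Python, of how two of the supervised classifiers named in the abstract (a support-vector machine and a neural network) could be trained on labeled pixel samples and scored by overall accuracy. The arrays, the 70/30 split, and the hyperparameters are illustrative assumptions, not the authors' pipeline.

# Minimal sketch (assumptions, not the authors' code): training two of the
# supervised classifiers named in the abstract on labeled pixel samples and
# reporting overall accuracy. The features and labels below are synthetic
# placeholders standing in for samples drawn from the orthomosaic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
features = rng.random((1000, 3))              # placeholder RGB pixel values
labels = (features[:, 1] > 0.5).astype(int)   # placeholder presence/absence

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.3, random_state=0)

for name, clf in [("Support-vector machine", SVC(kernel="rbf")),
                  ("Neural network", MLPClassifier(max_iter=500, random_state=0))]:
    clf.fit(X_train, y_train)
    acc = accuracy_score(y_test, clf.predict(X_test))   # overall accuracy
    print(f"{name}: overall accuracy = {acc:.3f}")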

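The abstract's recommendation of the object-oriented algorithm rests on the per-object parameters it yields (cover area, size, shape). The short sketch below, assuming a binary agave/no-agave mask at the stated 2.5 cm resolution, shows how such parameters could be derived; the synthetic mask and the use of scikit-image are illustrative assumptions, not the authors' workflow.

# Minimal sketch (assumed workflow, not the authors' code): deriving per-object
# cover area and shape descriptors from a binary agave classification mask.
import numpy as np
from skimage import measure

PIXEL_SIZE_M = 0.025                        # 2.5 cm orthomosaic resolution
mask = np.zeros((200, 200), dtype=bool)     # placeholder agave/no-agave mask
mask[50:90, 60:110] = True                  # one synthetic "agave" object

labeled = measure.label(mask)               # connected components = plant objects
for region in measure.regionprops(labeled):
    area_m2 = region.area * PIXEL_SIZE_M ** 2             # cover area per object
    major_m = region.major_axis_length * PIXEL_SIZE_M     # size along major axis
    print(f"object {region.label}: area = {area_m2:.2f} m2, "
          f"major axis = {major_m:.2f} m, eccentricity = {region.eccentricity:.2f}")
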
Data availability

Not applicable.


Acknowledgements

We would like to acknowledge Miss Gelvin Isela Gámiz Romero and Mr. Jose Cruz for allowing us to work on their property.

Funding

Funding was provided by Instituto Politecnico Nacional, CIIDIR Unidad Durango, Project Number 20210008.

Author information

Contributions

JE and SS made substantial contributions to the conception and design of this study. JE was the major contributor to the literature search, and JE and SS analyzed and interpreted the data. SS was the major contributor in drafting the manuscript. EG critically revised the article for important intellectual content. All authors read and approved the final version and agreed to its submission for publication.

Corresponding author

Correspondence to Sarahi Sandoval.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Responsible Editor: Philippe Garrigues

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Escobar-Flores, J.G., Sandoval, S. & Gámiz-Romero, E. Unmanned aerial vehicle images in the machine learning for agave detection. Environ Sci Pollut Res 29, 61662–61673 (2022). https://doi.org/10.1007/s11356-022-18985-7
