Abstract
Accurate chemical thinning of apple trees requires estimating their blooming intensity and determining the blooming peak date. Today, performing this task requires human experts to be present in the orchards for the entire blossom period, or to extrapolate from a single observation. Since experts are rare and in high demand, there is a need to automate this process. The system presented in this paper estimates the blooming intensity and the blooming peak date from a sequence of tree images with close-to-human accuracy. For this purpose, a two-year dataset was collected in 2014–2015, partially tagged for flower locations and fully annotated for blooming intensity. Using this dataset, an algorithm was developed and trained in three stages: a visual flower detector based on a deep convolutional neural network, followed by a blooming-level estimator and a peak-blooming-day finding algorithm. Despite the challenging conditions, the trained detector detected flowers on trees with an Average Precision (AP) score of 0.68, on a par with contemporary results for other objects in detection benchmarks. The blooming estimator was based on a linear regression component that used the number of detected flowers and related statistics to estimate the blooming intensity. The Pearson correlation between the algorithm's blooming estimates and the judgments of several human experts indicated high agreement (0.78–0.93), similar to the correlations measured among the human experts themselves. Moreover, the developed estimator was relatively stable across multiple years. The developed peak-date finding algorithm correctly identified the orchard's blooming peak date, which is used to determine the thinning date in current practice (the entire orchard is thinned on the same day).
Experiments testing the algorithm's ability to find a blooming peak date for each tree independently showed encouraging results, which, upon refinement, may lead to a more precise, tree-specific thinning practice.
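The last two stages described above can be sketched in a few lines: a linear regression mapping per-image flower-detection statistics to a blooming-intensity score, followed by a peak-day search over the resulting daily estimates. This is a minimal, hypothetical illustration — the feature choices, data values, and smoothing window are assumptions for clarity, not details taken from the paper.

```python
# Hypothetical sketch of stages two and three of the pipeline described
# in the abstract. Features, data, and the smoothing window are
# illustrative assumptions, not the paper's actual configuration.
import numpy as np

def fit_bloom_regressor(features, intensities):
    """Least-squares fit of intensity ~ w . features + b."""
    X = np.hstack([features, np.ones((features.shape[0], 1))])  # bias column
    coef, *_ = np.linalg.lstsq(X, intensities, rcond=None)
    return coef

def predict_bloom(coef, features):
    """Apply the fitted linear model to new detection statistics."""
    X = np.hstack([features, np.ones((features.shape[0], 1))])
    return X @ coef

def peak_bloom_day(daily_intensity, window=3):
    """Index of the blooming peak after centred moving-average smoothing."""
    kernel = np.ones(window) / window
    smoothed = np.convolve(daily_intensity, kernel, mode="same")
    return int(np.argmax(smoothed))

# Toy example: features = [detected flower count, mean detector confidence]
feats = np.array([[10, 0.60], [40, 0.70], [90, 0.80], [120, 0.85]])
labels = np.array([1.0, 2.5, 4.0, 5.0])   # expert intensity scores
coef = fit_bloom_regressor(feats, labels)
preds = predict_bloom(coef, feats)

# A season's worth of daily intensity estimates for one tree (made up)
daily = [0.5, 1.2, 2.8, 4.6, 4.9, 4.1, 2.0, 0.9]
peak = peak_bloom_day(daily)
```

The Pearson correlations reported in the abstract would then be computed between `preds` and the experts' scores over the full dataset; the per-tree variant of the peak search simply applies `peak_bloom_day` to each tree's own intensity series.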
Farjon, G., Krikeb, O., Hillel, A.B. et al. Detection and counting of flowers on apple trees for better chemical thinning decisions. Precision Agric 21, 503–521 (2020). https://doi.org/10.1007/s11119-019-09679-1