Abstract
Plant pests have a negative effect on crop yields. If the various insect pests are not identified and controlled properly, they can spread quickly and cause a significant decline in agricultural production. To overcome these challenges, convolutional neural network (CNN)-based methods have shown excellent performance, as they perform automatic feature extraction for image identification and classification. In this study, to enhance the learning capability for pest images with cluttered backgrounds, MobileNet-V2 pre-trained on ImageNet was chosen as the backbone network, and an attention mechanism together with class activation mapping (CAM) was incorporated into our architecture to learn the salient pest information in input images. Moreover, an optimized loss function and two-stage transfer learning were adopted in model training. This progressive learning first lets the model discover large-scale structures and then shifts its attention to fine details step by step, improving the identification accuracy for plant pest images. The proposed procedure achieves an average accuracy of 99.14% on a publicly available dataset, and even under heterogeneous background conditions the average accuracy reaches 92.79%. Experimental results demonstrate the efficacy of the proposed procedure, which outperforms other state-of-the-art methods.
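The abstract mentions an "optimized loss function" but does not specify its form. As an illustrative assumption only (the function name and parameters below are hypothetical, not taken from the paper), a focal-loss-style reweighting — a common way to emphasize hard, easily confused pest classes — can be sketched in plain Python:

```python
import math

def focal_loss(probs, labels, gamma=2.0, alpha=0.25):
    """Illustrative focal-loss-style reweighting for binary (one-vs-rest) targets.

    probs  -- predicted probabilities for the positive class
    labels -- ground-truth labels (0 or 1)
    gamma  -- focusing parameter; larger values down-weight
              well-classified (easy) examples
    alpha  -- class-balance weight for the positive class
    """
    total = 0.0
    for p, y in zip(probs, labels):
        # pt is the model's probability for the *true* class
        pt = p if y == 1 else 1.0 - p
        w = alpha if y == 1 else 1.0 - alpha
        # (1 - pt)^gamma shrinks the loss of confident, correct predictions
        total += -w * (1.0 - pt) ** gamma * math.log(max(pt, 1e-12))
    return total / len(probs)
```

With gamma = 0 and alpha = 1 this reduces to ordinary cross-entropy; increasing gamma concentrates training on misclassified examples, which matches the abstract's theme of progressively shifting attention from easy, large-scale structure to delicate details.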
Acknowledgements
This work was supported by the National Natural Science Foundation of China under grant No. 61672439 and the Fundamental Research Funds for the Central Universities under grant No. 20720181004. The authors would also like to thank the editors and the anonymous reviewers for their constructive advice.
Ethics declarations
Conflict of interest
The authors declare no conflicts of interest.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Chen, J., Chen, W., Zeb, A. et al. Crop pest recognition using attention-embedded lightweight network under field conditions. Appl Entomol Zool 56, 427–442 (2021). https://doi.org/10.1007/s13355-021-00732-y