Recyclable solid waste detection based on image fusion and convolutional neural network

  • ORIGINAL ARTICLE

Journal of Material Cycles and Waste Management

Abstract

Most solid waste image datasets contain only a single object on a plain background, which differs markedly from real environments. In addition, labeling waste images is time-consuming and labor-intensive. To address these problems, we propose an effective method to extend the dataset based on image fusion. We use image fusion to automatically construct a recyclable solid waste dataset, Trash-Fusion, whose images contain objects of different categories against complex backgrounds; all classification and location labels are collected during the fusion process. Moreover, an actual-scene dataset, Trash-Collect, is constructed from images downloaded from the Internet or collected by ourselves. A mixed dataset of Trash-Fusion and Trash-Collect is used to train several convolutional neural networks, and YOLO v5 achieves the highest detection precision at 60 FPS.
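The core idea of the fusion step is that pasting a masked foreground object onto a background image yields the bounding-box label for free, since the paste position is known. A minimal sketch of such a compositing step (the function name, binary-mask assumption, and label format are our own illustration, not code from the paper):

```python
import numpy as np

def fuse(background, foreground, mask, x, y):
    """Paste a masked foreground object onto a background at (x, y).

    background: HxWx3 uint8 image; foreground: hxwx3 uint8 object crop;
    mask: hxw binary array, 1 where the object is, 0 elsewhere.
    Returns the fused image and the object's (x1, y1, x2, y2) box label.
    """
    fused = background.copy()
    h, w = foreground.shape[:2]
    region = fused[y:y + h, x:x + w]
    # Alpha-blend with a hard binary mask: object pixels replace the
    # background, non-object pixels keep the background.
    m = mask[..., None].astype(fused.dtype)
    fused[y:y + h, x:x + w] = foreground * m + region * (1 - m)
    bbox = (x, y, x + w, y + h)  # location label collected automatically
    return fused, bbox
```

Because the class of each pasted crop is known in advance, the classification label is recorded alongside the box, which is how a detection dataset can be assembled without manual annotation.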


Abbreviations

MPP::

Morphological post-processing

ROI::

Region of interest

GT::

Ground truth

IoU::

Intersection over Union

mIoU::

Mean Intersection over Union

MR::

Mix ratio

NF::

The number of images from Trash-Fusion dataset

NC::

The number of images from Trash-Collect dataset

NT::

The total number of images in the mixed training set

FPS::

Frames Per Second

AP::

Average Precision

mAP::

Mean Average Precision

mAP50::

The mAP when the IoU is 0.5

mAP50:95::

The mean of mAPs from the IoU threshold from 0.5 to 0.95
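The detection metrics above (mAP50, mAP50:95) all build on the same overlap measure, IoU. A minimal sketch of IoU for axis-aligned boxes in (x1, y1, x2, y2) form (our own illustration, not code from the paper):

```python
def iou(box_a, box_b):
    """Intersection over Union of two boxes given as (x1, y1, x2, y2)."""
    # Corners of the intersection rectangle.
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap.
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0
```

mAP50 counts a detection as correct when its IoU with a ground-truth box is at least 0.5; mAP50:95 averages the mAP over IoU thresholds from 0.5 to 0.95 in steps of 0.05.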


Acknowledgements

This study was supported by the National Natural Science Foundation of China [No. 61801400], JSPS KAKENHI [No. JP18F18392], and the Inner Mongolia Autonomous Region Science and Technology Plan Project [No. 2020GG0185].

Author information


Contributions

Yao Xiao: Conceptualization, software, validation, formal analysis, investigation, data curation, writing—original draft preparation, writing—review and editing, visualization. Bin Chen: Conceptualization, methodology, formal analysis, resources, data curation, writing—review and editing, supervision, project administration, funding acquisition. Changhao Feng: Conceptualization, methodology, resources, supervision, project administration, funding acquisition. Jiongming Qin: Writing—review and editing. Cong Wang: Writing—review and editing.

Corresponding author

Correspondence to Bin Chen.

Ethics declarations

Conflict of interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Xiao, Y., Chen, B., Feng, C. et al. Recyclable solid waste detection based on image fusion and convolutional neural network. J Mater Cycles Waste Manag (2024). https://doi.org/10.1007/s10163-024-01949-z

