
A novel object detection method to facilitate the recycling of waste small electrical and electronic equipment

  • ORIGINAL ARTICLE
Journal of Material Cycles and Waste Management

Abstract

Whether Waste Electrical and Electronic Equipment (WEEE) can be effectively sorted and recycled affects the sustainable development of human society. However, Waste Small Electrical and Electronic Equipment (WSEEE) is often neglected, leaving a gap in the full scope of WEEE governance. In this study, a YOLO-wseee model based on the YOLOv5 (You Only Look Once) framework, combining a deepened efficient layer aggregation network (ELAN) structure (D-ELAN) with the efficient intersection over union (EIOU) loss, was designed to identify WSEEE. To make the model better suited to detecting WSEEE in its real state, a dataset called WASTE-SEEE was created, containing images of WSEEE captured in real time under a variety of conditions. The experimental results show that the Precision and mAP@0.5 of the YOLO-wseee model reach 98.24% and 99.32%, respectively, while the model's FLOPs are only 23.9, and its overall performance is significantly better than that of the classic YOLOv5 models. This method makes it easier to detect WSEEE under real conditions, helping to address the challenges of WSEEE recycling and reuse, improve efficiency, save resources, and protect the environment.
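
The D-ELAN structure and training details are described only in the full article; the EIOU bounding-box regression loss mentioned in the abstract, however, follows a published formulation and can be sketched independently. The PyTorch function below is a minimal illustration of that loss, not the authors' code; the function name, the (N, 4) tensor layout with boxes in (x1, y1, x2, y2) format, and the epsilon value are assumptions made for the sketch.

```python
import torch


def eiou_loss(pred, target, eps=1e-7):
    """Minimal sketch of the EIOU bounding-box regression loss.

    Assumes `pred` and `target` are (N, 4) tensors of boxes in
    (x1, y1, x2, y2) format; names and shapes are illustrative only.
    """
    # Intersection area
    ix1 = torch.max(pred[:, 0], target[:, 0])
    iy1 = torch.max(pred[:, 1], target[:, 1])
    ix2 = torch.min(pred[:, 2], target[:, 2])
    iy2 = torch.min(pred[:, 3], target[:, 3])
    inter = (ix2 - ix1).clamp(min=0) * (iy2 - iy1).clamp(min=0)

    # IoU of predicted and ground-truth boxes
    w_p, h_p = pred[:, 2] - pred[:, 0], pred[:, 3] - pred[:, 1]
    w_t, h_t = target[:, 2] - target[:, 0], target[:, 3] - target[:, 1]
    union = w_p * h_p + w_t * h_t - inter + eps
    iou = inter / union

    # Smallest enclosing box and its squared diagonal
    cw = torch.max(pred[:, 2], target[:, 2]) - torch.min(pred[:, 0], target[:, 0])
    ch = torch.max(pred[:, 3], target[:, 3]) - torch.min(pred[:, 1], target[:, 1])
    c2 = cw ** 2 + ch ** 2 + eps

    # Squared distance between box centres
    rho2 = ((pred[:, 0] + pred[:, 2] - target[:, 0] - target[:, 2]) ** 2
            + (pred[:, 1] + pred[:, 3] - target[:, 1] - target[:, 3]) ** 2) / 4

    # L_EIOU = (1 - IoU) + centre-distance term + width term + height term
    loss = (1 - iou
            + rho2 / c2
            + (w_p - w_t) ** 2 / (cw ** 2 + eps)
            + (h_p - h_t) ** 2 / (ch ** 2 + eps))
    return loss.mean()
```

Compared with the CIoU loss used by default in YOLOv5, EIOU penalises width and height differences directly rather than through an aspect-ratio term, which is the usual motivation for swapping it in.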



Acknowledgements

Thanks to Jiangsu Beier Machinery Co. for providing equipment and sample support. This work was sponsored by the Qing Lan Project of the Higher Education Institutions of Jiangsu Province and the 2022 Jiangsu Province Science and Technology Program Special Funds (International Science and Technology Cooperation) (BZ2022029).

Author information


Contributions

QW: validation, investigation, data curation. NW: validation, formal analysis, visualization, writing—original draft. HF: data curation, supervision. DH: acquisition of data, resources.

Corresponding author

Correspondence to Ning Wang.

Ethics declarations

Conflict of interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below are the links to the electronic supplementary material.

Supplementary file 1 (DOCX 2420 KB)

Supplementary file 2 (DOCX 22 KB)

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Wu, Q., Wang, N., Fang, H. et al. A novel object detection method to facilitate the recycling of waste small electrical and electronic equipment. J Mater Cycles Waste Manag 25, 2861–2869 (2023). https://doi.org/10.1007/s10163-023-01718-4

