
Design and fabrication of a high-efficiency defect inspection prototype for dairy plastic cutlery based on machine vision with an improved deep learning algorithm

Published in: Multimedia Tools and Applications

Abstract

Plastic cutlery is widely used in the dairy industry, with a huge production volume. Defects inevitably arise from factors such as raw material quality, production processes, and environmental conditions. Traditionally, these defects are inspected manually, leading to high costs, high labor intensity, low accuracy, and reduced productivity. To address these issues, a high-efficiency defect detection prototype was designed and manufactured to perform the automatic sorting of plastic cutlery. An improved deep learning model, YOLO-spoon, is proposed and integrated into the prototype. To handle the small size of product defects and packaging glare, a Squeeze-and-Excitation attention module is employed to enhance the network's focus on defect features, thereby improving the accuracy of defect detection. The C3 module of the backbone network is replaced with the attention module to reduce the network's complexity and enhance real-time performance. Experimental results demonstrate that, compared to the original algorithm, the average precision (mAP@0.5) of the proposed algorithm increased by 2.8%, with a detection time of only 7.3 ms. The proposed algorithm also outperforms the other mainstream algorithms by more than 5.6%. Compared with manual inspection, the proposed prototype reduces the three-year total cost by 58.9%, increases detection accuracy by 15.5%, and increases annual production capacity by 92.3%. The proposed prototype fulfills the requirement of high-efficiency automatic defect inspection for dairy plastic cutlery.
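The core algorithmic change described above is the insertion of Squeeze-and-Excitation (SE) channel attention (Hu et al., 2018) into the YOLO backbone in place of a C3 module. The paper's exact layer configuration is not reproduced here, so the following PyTorch snippet is only a minimal sketch of a standard SE block of the kind referenced; the class name, the reduction ratio of 16, and the toy feature-map shape are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-Excitation channel attention (Hu et al., 2018).

    Squeezes global spatial information into a per-channel descriptor via
    average pooling, then re-weights each channel with a learned gate.
    """

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.avg_pool = nn.AdaptiveAvgPool2d(1)   # squeeze: B x C x H x W -> B x C x 1 x 1
        self.fc = nn.Sequential(                  # excitation: bottleneck of two FC layers
            nn.Linear(channels, channels // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.avg_pool(x).view(b, c)           # per-channel statistics
        w = self.fc(w).view(b, c, 1, 1)           # channel gates in (0, 1)
        return x * w                              # re-weight the feature map


if __name__ == "__main__":
    # Toy check on a feature map shaped like a typical backbone stage output
    # (the shape is assumed for illustration only).
    feats = torch.randn(1, 256, 40, 40)
    print(SEBlock(256)(feats).shape)              # torch.Size([1, 256, 40, 40])
```

In a YOLOv5-style model definition, a block like this could be registered as a custom module and substituted for a C3 stage in the backbone, replacing the C3 bottleneck convolutions with a lighter channel-attention path. That substitution pattern is consistent with the abstract's stated goals of reducing network complexity while sharpening the focus on small defect features, but the precise placement and hyperparameters used in YOLO-spoon are given only in the full article.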



Data availability

All data generated or analyzed during this study are included in this published article.

Code availability

The raw/processed data required to reproduce these findings cannot be shared at this time as the data also forms part of an ongoing study.


Funding

This work was supported by the Jiangsu Key R&D Program (Grant No. BE2020082-1).

Author information


Contributions

Jian Yang: conceptualization, writing. Yu Qin: methodology, investigation, software, writing. Zhida Zhu: investigation. Xiaobin Xu: validation. Dong Guan: reviewing and editing.

Corresponding authors

Correspondence to Jian Yang or Dong Guan.

Ethics declarations

Ethics approval

Not applicable.

Consent to participate

Not applicable.

Consent for publication

The authors give their consent for publication.

Competing interests

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file 1 (MP4, 3355 KB)

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Yang, J., Qin, Y., Zhu, Z. et al. Design and fabrication of a high-efficiency defect inspection prototype for dairy plastic cutlery based on machine vision with an improved deep learning algorithm. Multimed Tools Appl (2024). https://doi.org/10.1007/s11042-024-19395-2



  • DOI: https://doi.org/10.1007/s11042-024-19395-2
