
Edge computing-driven scene-aware intelligent augmented reality assembly

ORIGINAL ARTICLE

The International Journal of Advanced Manufacturing Technology

Abstract

Wearable augmented reality (AR) superimposes computer-generated virtual models and annotations on the real scene and can help workers perform intuitive manual assembly operations. Nevertheless, most existing AR-aided assembly processes rely on manually triggered step-by-step visual instructions and lack scene-aware generation of the corresponding assembly assistance. To facilitate intelligent AR-aided assembly, this paper proposes an edge computing (EC)-driven Scene-aware Intelligent AR Assembly (EC-SIARA) system, which understands the assembly status on the fly with lightweight wearable AR glasses, so that worker-centered assistance can provide intuitive visual guidance accordingly. First, a connection between the wearable AR glasses and the EC system is established, which alleviates the computing burden on the resource-constrained AR glasses and enables a high-efficiency scene-awareness module for the manual assembly process. Then, based on contextual understanding of the current assembly status, intuitive instructions are triggered automatically, relieving the operator of the cognitive load of launching each subsequent AR instruction manually. Finally, quantitative and qualitative experiments are carried out to evaluate the proposed EC-SIARA system, and the results demonstrate that it realizes a worker-centered AR assembly process, improves assembly efficiency, and effectively reduces the occurrence of assembly errors.
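The control loop the abstract describes (offload scene understanding to the edge, then auto-advance the visual instruction once the expected assembly state is observed) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the `AssemblyGuide` class, the part labels, and the `edge_detect` stub (standing in for an edge-hosted detector such as a YOLO-style model) are all hypothetical.

```python
# Hypothetical sketch of an EC-SIARA-style loop: the AR client offloads
# scene understanding to an edge server and triggers the next instruction
# automatically, instead of waiting for a manual step-by-step trigger.

from dataclasses import dataclass

@dataclass
class AssemblyGuide:
    steps: list        # ordered part labels the worker must attach
    current: int = 0   # index of the step awaiting completion

    def expected(self):
        # Part whose presence would complete the current step, or None if done.
        return self.steps[self.current] if self.current < len(self.steps) else None

    def on_scene_report(self, detected_parts):
        # Auto-trigger: advance while the edge detector reports the part
        # for the current step as already in place, then return the
        # instruction the AR glasses should render next.
        while self.expected() is not None and self.expected() in detected_parts:
            self.current += 1
        nxt = self.expected()
        return f"Attach {nxt}" if nxt else "Assembly complete"

def edge_detect(frame):
    # Stand-in for the edge-hosted detection model; here a "frame" is
    # simply the collection of parts visible in the scene.
    return set(frame)

guide = AssemblyGuide(steps=["base", "gear", "cover"])
print(guide.on_scene_report(edge_detect([])))                # Attach base
print(guide.on_scene_report(edge_detect(["base"])))          # Attach gear
print(guide.on_scene_report(edge_detect(["base", "gear", "cover"])))
```

Keeping only the lightweight state machine on the glasses while the detector runs on the edge mirrors the paper's motivation: the resource-constrained headset never executes the heavy vision workload.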



Acknowledgements

We thank the reviewers for their thoughtful suggestions on earlier versions of this manuscript.

Funding

This work was partly supported by the National Natural Science Foundation of China (52105505), the Beijing Natural Science Foundation (3204050), and the Open Project Program of the State Key Laboratory of Virtual Reality Technology and Systems (VRLAB2020B05).

Author information


Contributions

Each author has substantially contributed to conducting the underlying research and drafting of this manuscript.

Corresponding author

Correspondence to Wei Fang.

Ethics declarations

Ethics approval

We confirm that this manuscript is original and has not been published, nor is it currently under consideration for publication elsewhere.

Consent to participate

Not applicable.

Consent for publication

Not applicable.

Conflict of interest

The authors declare no competing interests.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Fu, M., Fang, W., Gao, S. et al. Edge computing-driven scene-aware intelligent augmented reality assembly. Int J Adv Manuf Technol 119, 7369–7381 (2022). https://doi.org/10.1007/s00170-022-08758-4

