Abstract
Wearable augmented reality (AR) superimposes computer-generated virtual models and annotations on the real scene, and can help workers perform manual assembly operations intuitively. Nevertheless, most existing AR-aided assembly processes require the worker to trigger step-by-step visual instructions manually, lacking scene-aware generation of the corresponding assembly assistance. To enable intelligent AR-aided assembly, this paper proposes an edge computing (EC)-driven Scene-aware Intelligent AR Assembly (EC-SIARA) system that understands the assembly status on the fly through lightweight wearable AR glasses, so that worker-centered assistance can provide intuitive visual guidance accordingly. First, a connection between the wearable AR glasses and the EC system is established, which alleviates the computing burden on the resource-constrained glasses and enables a high-efficiency scene-awareness module for the manual assembly process. Then, based on contextual understanding of the current assembly status, intuitive instructions are triggered automatically, relieving the operator of the cognitive load of launching each subsequent AR instruction manually. Finally, quantitative and qualitative experiments are carried out to evaluate the proposed EC-SIARA system, and the results demonstrate that it realizes a worker-centered AR assembly process, improves assembly efficiency, and effectively reduces the occurrence of assembly errors.
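The automatic triggering described above can be illustrated with a minimal sketch: the edge server's detector reports which parts it currently sees assembled in the scene, and the AR client advances to the next instruction once the detected state matches the completion condition of the current step. The plan contents, part labels, and class names here are hypothetical illustrations, not the paper's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical assembly plan: step i is complete when the edge-side
# detector reports (at least) the listed parts assembled in the scene.
ASSEMBLY_PLAN = [
    {"base_plate"},
    {"base_plate", "gearbox"},
    {"base_plate", "gearbox", "cover"},
]


@dataclass
class InstructionTrigger:
    """Advances AR work instructions automatically from detected scene state."""
    step: int = 0  # index of the instruction currently shown to the worker

    def update(self, detected_parts: set) -> int:
        # Advance while the detector output covers the current step's
        # completion set; never regress on a missed detection.
        while (self.step < len(ASSEMBLY_PLAN)
               and ASSEMBLY_PLAN[self.step] <= detected_parts):
            self.step += 1
        return self.step


trigger = InstructionTrigger()
print(trigger.update({"base_plate"}))                       # step 0 done -> show step 1
print(trigger.update({"base_plate", "gearbox", "cover"}))   # steps 1-2 done -> 3
```

In a real deployment the glasses would stream frames to the edge server, and `detected_parts` would come from an object detector (e.g., a YOLO-family model) running there; the monotonic state machine keeps the guidance stable against transient detection dropouts.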
Acknowledgements
We thank the reviewers for their thoughtful suggestions on earlier versions of this manuscript.
Funding
This work is partly supported by the National Natural Science Foundation of China (52105505), the Beijing Natural Science Foundation (3204050), and the Open Project Program of the State Key Laboratory of Virtual Reality Technology and Systems (VRLAB2020B05).
Author information
Contributions
Each author has substantially contributed to conducting the underlying research and drafting of this manuscript.
Corresponding author
Ethics declarations
Ethics approval
We confirm that this manuscript is original and has not been published, nor is it currently under consideration for publication elsewhere.
Consent to participate
Not applicable.
Consent for publication
Not applicable.
Conflict of interest
The authors declare no competing interests.
Additional information
Publisher's note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
About this article
Cite this article
Fu, M., Fang, W., Gao, S. et al. Edge computing-driven scene-aware intelligent augmented reality assembly. Int J Adv Manuf Technol 119, 7369–7381 (2022). https://doi.org/10.1007/s00170-022-08758-4