
Projected augmented reality assembly assistance system supporting multi-modal interaction

Application
Published in The International Journal of Advanced Manufacturing Technology

Abstract

As an application paradigm of augmented reality in industry, the projected augmented reality (PAR) system aids assembly by improving human–computer cooperation. Previous research has focused mainly on visualization, whereas the question of how users can interact naturally with such a system has received little attention. In this paper, we developed a PAR system supporting multi-modal interaction (MMI), with which users can assemble parts and interact naturally. First, we tested the performance of MMI (speech, gesture, and finger click) against a baseline condition of single-mode interaction. We also compared gesture and finger click within bare-hand interaction and found that finger click achieves a higher success rate, faster response time, fewer false triggers, a lower instruction-memorization load, and a better user experience than gesture. In particular, when assembly must continue while the hands are occupied, MMI can exploit whichever interaction modality remains available and thus reduce the interaction burden. The results show that MMI offers better interactive naturalness and performance, making it well suited for adoption in PAR assembly assistance systems. In conclusion, the developed PAR system can be a valid solution for intelligent manufacturing applications.
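The core idea of the MMI design described above — letting speech, gesture, and finger-click input all trigger the same assembly-guidance commands, so the operator can fall back to speech when both hands are occupied — can be sketched as a simple event dispatcher. This is a minimal hypothetical illustration, not the authors' implementation; all names (`MultiModalDispatcher`, the tokens, `NEXT_STEP`) are invented for this sketch.

```python
from dataclasses import dataclass
from typing import Dict, Optional, Tuple


@dataclass(frozen=True)
class Event:
    """One recognized input event from any modality."""
    modality: str   # "speech" | "gesture" | "click"  (hypothetical labels)
    token: str      # recognized word, gesture label, or clicked button id


class MultiModalDispatcher:
    """Maps (modality, token) pairs onto shared assembly commands.

    Because every command can be bound in several modalities, the user
    chooses whichever input channel is currently free.
    """

    def __init__(self) -> None:
        self._bindings: Dict[Tuple[str, str], str] = {}

    def bind(self, modality: str, token: str, command: str) -> None:
        self._bindings[(modality, token)] = command

    def dispatch(self, event: Event) -> Optional[str]:
        # Unrecognized input is ignored (returns None) rather than raising,
        # mirroring a tolerant interaction loop on the shop floor.
        return self._bindings.get((event.modality, event.token))


dispatcher = MultiModalDispatcher()
# The same "advance to next assembly step" command is reachable from
# every modality; tokens here are placeholders.
for modality, token in [("speech", "next"),
                        ("gesture", "swipe_right"),
                        ("click", "btn_next")]:
    dispatcher.bind(modality, token, "NEXT_STEP")

print(dispatcher.dispatch(Event("speech", "next")))         # NEXT_STEP
print(dispatcher.dispatch(Event("gesture", "swipe_right")))  # NEXT_STEP
```

In a real system the events would come from recognizers (a speech engine, a hand tracker, a projected touch surface); the point of the sketch is only the shared command layer that makes the modalities interchangeable.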



Availability of data and material

All data were obtained through the experiments reported in this paper and are authentic.

Code availability

The code developed by the authors supports the normal operation of the system described here.



Funding

This work is partly supported by the National Key R&D Program of China (Grant No. 2019YFB1703800, 2021YFB1714900, 2021YFB1716200, 2020YFB1712503), the Programme of Introducing Talents of Discipline to Universities (111 Project), China (Grant No. B13044), and the Fundamental Research Funds for the Central Universities, NPU (Grant No. 3102020gxb003).

Author information

Authors and Affiliations

Authors

Contributions

All authors contributed to the study. The system conception was proposed by Shuxia Wang, Weiping He, and Jie Zhang. Material preparation and system development were performed by Jie Zhang, Jianghong Li, Zhiwei Cao, and Bingzhao Wei. Data collection and analysis were performed by Jie Zhang. The first draft of the manuscript was written by Jie Zhang and Shuxia Wang. All authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.

Corresponding authors

Correspondence to Shuxia Wang or Weiping He.

Ethics declarations

Ethics approval

Not applicable.

Consent to participate

All authors read and approved the final manuscript.

Consent for publication

All authors agree to publish in The International Journal of Advanced Manufacturing Technology.

Conflict of interest

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file1 (XLSX 27 KB)

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article


Cite this article

Zhang, J., Wang, S., He, W. et al. Projected augmented reality assembly assistance system supporting multi-modal interaction. Int J Adv Manuf Technol 123, 1353–1367 (2022). https://doi.org/10.1007/s00170-022-10113-6

