
Interactive Robot Trajectory Planning With Augmented Reality for Non-expert Users

Published in the International Journal of Control, Automation and Systems (2024).


This paper presents a novel method for path selection by non-expert users in robot trajectory planning using augmented reality (AR). Although AR has been applied to robot control tasks, current approaches often require manual waypoint specification, which limits their usefulness for non-expert users. In contrast, our study introduces an AR-based method delivered through a head-mounted display, designed to enhance human-robot interaction by making the selection of robot paths accessible to users without specialized expertise. The proposed method uses the RRT-Connect algorithm to automatically generate pathways from the initial to the goal position, offering a choice of 1, 3, or 5 pathways, as well as 3 and 5 pathways with AR text guidance. This guidance provides contextual instructions within the AR environment, ranking the pathways from the fewest to the most waypoints. Our findings demonstrate that choosing an appropriate number of AR pathways can reduce user stress and improve operational skill. Path1 achieved the fastest performance time but produced the most obstacle collisions, while the methods with AR text guidance increased performance time relative to Path1. Path3 and Path5 struck the best balance between performance time and collision avoidance. Qualitative analysis indicated that the AR text displays demanded more effort from users, and Path3 without AR text guidance was identified as the easiest method for operating the robot. Consequently, Path3 was deemed the most beneficial of the five methods. These results highlight the value of our method for the design of future human-robot interaction systems that improve efficiency, safety, and user experience for non-expert users through AR interfaces.
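To give a concrete sense of the planning step described in the abstract, the sketch below is a minimal 2D RRT-Connect planner together with a helper that orders candidate paths from fewest to most waypoints, the ordering the AR text guidance presents to users. This is an illustrative assumption, not the paper's implementation (which runs on a real manipulator in a full configuration space): the workspace bounds, step size, point-only collision check, and all function names are hypothetical simplifications.

```python
import math
import random

def rrt_connect(start, goal, is_free, bounds=(0.0, 10.0),
                step=0.5, max_iter=5000, seed=None):
    """Minimal 2D RRT-Connect sketch: two trees, rooted at start and goal,
    alternately extend toward random samples and swap roles each iteration
    until they meet. Collision checking is point-only (a simplification:
    a real planner also validates the edge between waypoints)."""
    rng = random.Random(seed)
    trees = [{start: None}, {goal: None}]           # node -> parent maps

    def nearest(tree, q):
        return min(tree, key=lambda n: math.dist(n, q))

    def steer(a, b):
        d = math.dist(a, b)
        if d <= step:
            return b
        t = step / d
        return (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))

    def extend(tree, q):
        near = nearest(tree, q)
        new = steer(near, q)
        if is_free(new):
            tree[new] = near
            return new
        return None

    def trace(tree, node):                          # walk node back to root
        path = []
        while node is not None:
            path.append(node)
            node = tree[node]
        return path

    for _ in range(max_iter):
        q_rand = (rng.uniform(*bounds), rng.uniform(*bounds))
        q_new = extend(trees[0], q_rand)
        # The second tree tries to reach q_new; if it lands exactly on it,
        # the trees are connected and a full path can be reconstructed.
        if q_new is not None and extend(trees[1], q_new) == q_new:
            full = trace(trees[0], q_new)[::-1] + trace(trees[1], q_new)[1:]
            if full[0] != start:                    # trees may have swapped
                full.reverse()
            return full
        trees.reverse()                             # swap tree roles
    return None                                     # no path found

def rank_paths(paths):
    """Order candidate paths from fewest to most waypoints, mirroring the
    ordering displayed by the AR text guidance."""
    return sorted(paths, key=len)
```

Because RRT-Connect is randomized, running the planner several times with different seeds yields distinct candidate pathways, which can then be ranked by waypoint count before being offered to the user as the 1-, 3-, or 5-path choices.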



Author information

Corresponding author

Correspondence to Wansoo Kim.

Ethics declarations

The authors declare that there is no competing financial interest or personal relationship that could have appeared to influence the work reported in this paper.

Additional information

Publisher’s Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This research was supported by a grant of the Korea Health Technology R&D Project through the Korea Health Industry Development Institute (KHIDI), funded by the Ministry of Health & Welfare, Korea (grant number: HI19C1081).

Joosun Lee received his M.S. degree in mechatronics engineering from Hanyang University, Korea, in 2022, where he is currently pursuing a Ph.D. degree in mechatronics. He is also working as a researcher at the HumAn-Robot COllaboration (HARCO) laboratory, Korea. His current research interests include robotics, motion control, and human-robot interaction using the HoloLens and mobile robots.

Taeyhang Lim is currently pursuing an M.S. degree in interdisciplinary robot engineering systems at Hanyang University, Korea. She received her B.S. degree from the University of Toronto, Canada, in 2020. Her current research interests include human-robot interaction using the HoloLens.

Wansoo Kim is an assistant professor at Hanyang University ERICA, Korea, where he leads the HumAn-Robot COllaboration (HARCO) laboratory. He received his B.S. degree in mechanical engineering from Hanyang University, Korea, in 2008 and his Ph.D. degree in mechanical engineering from Hanyang University in 2015 (integrated M.S./Ph.D. program). He was with the Human-Robot Interfaces and Physical Interaction (HRI) Lab., Italian Institute of Technology, Genoa, Italy, from 2016 to 2021. He won the Solution Award 2019 (Premio Innovazione Robotica at MECSPE 2019), the KUKA Innovation Award 2018, and the HYU best Ph.D. paper award 2015. His research interests include physical human-robot interaction (pHRI), human-robot collaboration, shared control, ergonomics, human modelling, mobile manipulators, and powered exoskeleton robots.

About this article

Cite this article

Lee, J., Lim, T. & Kim, W. Interactive Robot Trajectory Planning With Augmented Reality for Non-expert Users. Int. J. Control Autom. Syst. (2024).
