Off-line programming of an industrial robot in a virtual reality environment
Industrial robots need to be programmed quickly to be practically deployable in small-batch production. In programming by teaching or demonstration, gesture recognition or other multimodal interfaces may be exploited when most of the program's content involves handling tasks. However, when the main task concerns a manufacturing process, typically tracing an edge in seam welding, deburring or cutting, the tool must be positioned and oriented with considerable accuracy. This can only be achieved if suitable tracking sensors are used. The current work employs a 6 degree-of-freedom magnetic sensor, although any equivalent sensor could be used instead. The sensor is attached to a hand-held teaching tool constructed in accordance with the real end-effector tool, enabling continuous interactive tracking of its position and orientation. A virtual reality platform records this stream of data in real time, making it possible to exploit it primarily in off-line programming of the robot. In this mode both the robot and the manufacturing cell are virtual, with inverse kinematics allowing joint coordinates to be calculated from end-effector coordinates. Collision and clearance checks are also straightforward to implement. An edge-tracing application in 3D space was programmed following this paradigm. The resulting tool-tip curves in the virtual and the real environment agreed closely when compared by photogrammetry. If required, the VR environment also allows for remote on-line programming without major modifications.
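The programming pipeline summarized above, recording taught end-effector points and converting them to joint coordinates via inverse kinematics, can be illustrated with a minimal sketch. This is not the authors' code: the paper concerns a 6-DOF industrial robot, whereas the example below uses a planar 2-link arm (with assumed link lengths) because its inverse kinematics has a simple closed form. The round-trip check at the end mirrors the idea of verifying that the reproduced tool-tip curve matches the taught one.

```python
import math

# Assumed link lengths (metres) for a hypothetical planar 2-link arm.
L1, L2 = 0.4, 0.3

def inverse_kinematics(x, y):
    """Closed-form IK for a planar 2-link arm (elbow-down solution)."""
    d2 = x * x + y * y
    cos_q2 = (d2 - L1 * L1 - L2 * L2) / (2.0 * L1 * L2)
    if abs(cos_q2) > 1.0:
        raise ValueError("target outside workspace")
    q2 = math.acos(cos_q2)
    q1 = math.atan2(y, x) - math.atan2(L2 * math.sin(q2),
                                       L1 + L2 * math.cos(q2))
    return q1, q2

def forward_kinematics(q1, q2):
    """Tool-tip position from joint angles."""
    x = L1 * math.cos(q1) + L2 * math.cos(q1 + q2)
    y = L1 * math.sin(q1) + L2 * math.sin(q1 + q2)
    return x, y

# A "taught" trajectory: tool-tip points sampled along a straight edge,
# standing in for the pose stream recorded from the teaching tool.
taught_edge = [(0.5, 0.1 + 0.05 * i) for i in range(5)]
joint_trajectory = [inverse_kinematics(x, y) for (x, y) in taught_edge]

# Round-trip check: FK(IK(p)) should reproduce each taught point.
for (x, y), (q1, q2) in zip(taught_edge, joint_trajectory):
    fx, fy = forward_kinematics(q1, q2)
    assert abs(fx - x) < 1e-9 and abs(fy - y) < 1e-9
```

For a real 6-DOF manipulator the closed-form solution would be replaced by the robot-specific inverse kinematics, and each recorded sample would carry orientation as well as position.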
Keywords: Industrial robot · Virtual reality · Robot programming · Trajectory teaching
Mr Nikolaos Melissas, Chief Technician in the NTUA Manufacturing Technology Laboratory, is gratefully acknowledged for his contribution to the tool constructions. Miss Margeaux Beaubet of ENISE, France, is gratefully acknowledged for her contribution to the photogrammetry measurements.