
Automatic Control and Computer Sciences, Volume 52, Issue 5, pp 392–401

Integration of Computer Vision and Artificial Intelligence Subsystems with Robot Operating System Based Motion Planning for Industrial Robots

  • Janis Arents
  • Ricards Cacurs
  • Modris Greitans

Abstract

The paper proposes a flexible system, based on the Robot Operating System (ROS) framework, for integrating 3D computer vision and artificial intelligence algorithms with industrial robots to automate industrial tasks. The system decouples the 3D computer vision hardware from the industrial robot components, allowing different hardware to be tested with only minor software changes. An experimental system consisting of a Kinect V2 RGB-D camera and a Universal Robots UR5 robot was set up. In this setup, a pick-and-place task was implemented in which two types of randomly arranged objects (tubes and cans) were picked from a container and sorted into two separate containers. The average full cycle time for the task was measured to be 19.675 s.
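The task described above follows a detect-classify-pick-place cycle. The loop below is a hypothetical sketch of that task logic only: the `classify` stub, object names, and container routing are invented for illustration, and the commented-out pick/place calls stand in for the motion-planning steps the paper performs through ROS and its planning stack.

```python
def classify(obj):
    # Hypothetical classifier stub: in the described system this role is
    # played by the 3D computer-vision / AI subsystem on RGB-D data.
    return "tube" if "tube" in obj else "can"

def sort_objects(source_bin):
    # Route each randomly ordered object into one of two target containers,
    # mirroring the two-class (tubes vs. cans) sorting task in the paper.
    containers = {"tube": [], "can": []}
    for obj in source_bin:
        label = classify(obj)  # vision + AI step
        # pick(obj); place(containers[label])  # motion-planning steps via ROS
        containers[label].append(obj)
    return containers

bins = sort_objects(["tube_1", "can_1", "tube_2", "can_2", "can_3"])
print(bins["tube"])  # -> ['tube_1', 'tube_2']
print(bins["can"])   # -> ['can_1', 'can_2', 'can_3']
```

In the actual system, each iteration of such a loop corresponds to one full robot cycle, which is what the reported 19.675 s average measures.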

Keywords:

computer vision, 3D vision, industrial robots, automation, artificial intelligence, robot operating system


ACKNOWLEDGMENTS

The research leading to these results has received funding from the research project “Competency Centre of Latvian Electric and Optical Equipment Productive Industry” of EU Structural funds, contract no. 1.2.1.1/16/A/002 signed between LEO Competence Centre and Central Finance and Contracting Agency, Research no. 11 “The research on the development of computer vision techniques for the automation of industrial processes.”


Copyright information

© Allerton Press, Inc. 2018

Authors and Affiliations

  1. Institute of Electronics and Computer Science, Riga, Latvia
