
A real-time automated sorting of robotic vision system based on the interactive design approach

  • Wisam T. Abbood
  • Oday I. Abdullah
  • Enas A. Khalid
Original Paper

Abstract

This paper proposes a robotic vision system that identifies an object's color and its position coordinates, and then sorts the object (product) onto the correct branch conveyor belt according to its color in real time. The system was built on an HSV-model algorithm for sorting products by color. In addition, the system can recognize an object's shape and locate its position so that the object can be picked up and placed on the correct branch conveyor belt. Shape recognition was based on shape properties, a centroid algorithm, and border extraction. Both the object-detection and contour-coordinate-extraction methods are implemented using a series of image-processing techniques. The main goal is met by sorting an object by its color feature from a group of objects. The robot's movements (opening and closing the gripper, moving the arm up and down, and moving left and right) are controlled by a microcontroller, which directs each object to the correct branch conveyor belt. When the color or the shape of an object is detected, the microcontroller initiates the robot's actions. The accuracy of the approach developed in this paper was found to be 92% for sorting objects by shape and 97% for sorting objects by color.
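The two core steps the abstract describes, binning a pixel by its HSV hue and computing an object's centroid as the pick-up coordinate, could be sketched as follows. This is a minimal illustration only: the function names, hue thresholds, and saturation/value cutoffs are assumptions for the example, not the authors' calibrated values, and the real system operates on full camera frames rather than single pixels.

```python
import colorsys

def classify_color(r, g, b):
    """Bin an RGB pixel (0-255 per channel) into a coarse color class
    via its HSV hue. Thresholds here are illustrative only."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    if s < 0.2 or v < 0.2:
        return "unsorted"          # too gray or too dark to judge by hue
    deg = h * 360.0                # hue angle in degrees
    if deg < 30 or deg >= 330:
        return "red"
    if deg < 90:
        return "yellow"
    if deg < 180:
        return "green"
    return "blue"

def centroid(mask):
    """Centroid (x, y) of the True pixels in a 2D boolean mask,
    i.e. the position coordinate handed to the robot controller."""
    xs = ys = n = 0
    for y, row in enumerate(mask):
        for x, on in enumerate(row):
            if on:
                xs += x
                ys += y
                n += 1
    if n == 0:
        return None                # no object pixels found
    return (xs / n, ys / n)
```

In a system like the one described, the color class would select the target branch conveyor, and the centroid (after a camera-to-world calibration) would give the gripper its pick point.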

Keywords

Robotic vision system · Real-time sorting system · Vision system machine · Automated vision system


Copyright information

© Springer-Verlag France SAS, part of Springer Nature 2019

Authors and Affiliations

  1. Automated Manufacturing Engineering Department, University of Baghdad, Baghdad, Iraq
  2. Energy Engineering Department, University of Baghdad, Baghdad-Aljadria, Iraq
  3. Hamburg University of Technology, Hamburg, Germany
