Vision Based Simulation and Experiment for Performance Tests of Robot

Original Article

Abstract

A feature-based visual servoing approach has been used to control the robot through vision. A computational kinematic approach has been used to find the position of the end effector by vision and to carry out the robot performance tests. The software simulates the environment and the operation of an industrial robot. The issues of image capture, image processing, target recognition, and control of the robot by the vision system are addressed in the simulation tests. The vision-based program has been defined in such a way that it can be run on a real robot with minimal changes.
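As an illustration of the feature-based visual servoing idea (a generic sketch, not the paper's specific controller), the classical image-based law maps the error between the current and desired image features to an end-effector velocity command through the pseudo-inverse of the interaction (image Jacobian) matrix:

```python
import numpy as np

def ibvs_velocity(s, s_star, L, gain=0.5):
    """Classical image-based visual servoing law (illustrative sketch only).

    s      : current image feature vector (e.g. pixel coordinates of targets)
    s_star : desired image feature vector
    L      : interaction (image Jacobian) matrix relating feature velocities
             to the camera/end-effector twist
    gain   : proportional gain (lambda)
    Returns the commanded camera/end-effector velocity twist.
    """
    error = s - s_star
    # v = -lambda * pinv(L) * (s - s*), using the Moore-Penrose pseudo-inverse
    return -gain * np.linalg.pinv(L) @ error
```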

In the experiment, the vision system recognizes the target and controls the robot by acquiring images of the environment and processing them. In the first stage, the images are converted to grayscale; a threshold is then applied to separate objects from noise, the detected objects are stored in separate frames, and the main object is recognized. The robot is then driven to reach the target. Finally, robot performance tests based on the two standards ISO 9283 and ANSI/RIA R15.05-2 have been carried out in the simulator program, using the vision system on the 3P robot to evaluate its path-related characteristics. An experimental test is carried out to evaluate the performance of the proposed method.
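The grayscale-and-threshold pipeline described above can be sketched as follows. This is a minimal illustrative example using OpenCV; the fixed threshold value and the "largest blob is the target" rule are assumptions made for illustration, not details taken from the paper:

```python
import cv2

def find_target_centroid(frame_bgr, thresh_value=128):
    """Detect the main object in a camera frame (illustrative sketch).

    1. Convert the colour frame to grayscale.
    2. Apply a fixed threshold to separate objects from background/noise.
    3. Extract connected contours (candidate objects, one per "frame").
    4. Take the largest contour as the target and return its pixel centroid.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, thresh_value, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None  # no object found in this frame
    target = max(contours, key=cv2.contourArea)  # assume largest blob is the target
    m = cv2.moments(target)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # (x, y) centroid in pixels
```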

Keywords

Performance tests · Vision robot · Simulation · Experiment



Copyright information

© Springer-Verlag 2004

Authors and Affiliations

Robotic Research Laboratory, Mechanical Engineering Department, Iran University of Science & Technology, Tehran, Iran
