
Lucid Workspace for Stereo Vision

Published in the Journal of Intelligent & Robotic Systems.

Abstract

In this paper, a novel error analysis of the well-known stereo vision system used in robots is presented. Based on this analysis, a new concept called the "lucid workspace" is introduced. Defining such a region of the robot workspace is useful in vision system design: it reveals a priori the area that the robot can see in a crystal clear manner, so the designer can select the specifications and relative positions of the vision system cameras to cover the needs. First, the basic two-camera triangulation equations are used to calculate the 3D position of an object; error propagation equations are then derived by taking the appropriate partial derivatives with respect to all measurement errors. Next, under the applicable assumptions, two theorems are obtained, and a new algorithm for determining the lucid workspace is built on them. The performance of the new algorithm is verified using Monte Carlo simulations.
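The abstract's pipeline — triangulate a 3D object position from two cameras' azimuth/elevation measurements, then probe the resulting position error — can be sketched as follows. This is a minimal illustration, not the paper's formulation: the function names, the azimuth convention (measured from the +X axis in the XY plane), and the uniform measurement-error model are all assumptions made for the sketch.

```python
import math
import random

def triangulate(cam1, cam2, psi1, theta1, psi2, theta2):
    """Estimate the object position (x0, y0, z0) from azimuth (psi) and
    elevation (theta) angles measured by two cameras at known positions.
    Assumed convention: azimuth is measured from the +X axis in the XY plane."""
    x1, y1, z1 = cam1
    x2, y2, z2 = cam2
    m1, m2 = math.tan(psi1), math.tan(psi2)
    # Intersect the two azimuth rays in the XY plane.
    x0 = (y2 - y1 + m1 * x1 - m2 * x2) / (m1 - m2)
    y0 = y1 + m1 * (x0 - x1)
    # Recover height from camera 1's elevation angle and its ground range r_o1.
    r1 = math.hypot(x0 - x1, y0 - y1)
    z0 = z1 + r1 * math.tan(theta1)
    return (x0, y0, z0)

def monte_carlo_error(cam1, cam2, obj, d_psi, d_theta, n=10000):
    """Approximate the worst-case position estimation error by sampling
    angular measurement errors uniformly in [-d_psi, d_psi] and
    [-d_theta, d_theta] (an illustrative error model)."""
    def true_angles(cam):
        dx, dy, dz = (obj[i] - cam[i] for i in range(3))
        return math.atan2(dy, dx), math.atan2(dz, math.hypot(dx, dy))
    psi1, th1 = true_angles(cam1)
    psi2, th2 = true_angles(cam2)
    worst = 0.0
    for _ in range(n):
        est = triangulate(
            cam1, cam2,
            psi1 + random.uniform(-d_psi, d_psi),
            th1 + random.uniform(-d_theta, d_theta),
            psi2 + random.uniform(-d_psi, d_psi),
            th2 + random.uniform(-d_theta, d_theta),
        )
        worst = max(worst, math.dist(est, obj))
    return worst
```

With exact angles the triangulation recovers the object position; the Monte Carlo loop then gives a sampled lower bound on the worst-case error δ for given angular error bounds.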


Abbreviations

2d :

baseline

(x₀, y₀, z₀):

position of object

(x₁, y₁, z₁):

position of camera 1

(x₂, y₂, z₂):

position of camera 2

ψ_oi :

azimuth angle measured by each camera (i = 1, 2)

θ_oi :

elevation angle measured by each camera (i = 1, 2)

r_oi :

distance between the object and each camera in the XY plane (i = 1, 2)

δ :

maximum position estimation error of the object over all directions

δ_x :

maximum position error of each camera in the X direction

δ_y :

maximum position error of each camera in the Y direction

δ_z :

maximum position error of each camera in the Z direction

δ_ψ :

maximum measurement error of each camera in azimuth angle

δ_θ :

maximum measurement error of each camera in elevation angle

δ_min :

minimum of the position estimation error of the object over all directions

δx :

a small change in variable x

α_v :

vertical angle of view

α_h :

horizontal angle of view

α_d :

diagonal angle of view

H :

height of camera sensor

W :

width of camera sensor

N_H :

number of pixels in height of camera sensor

N_W :

number of pixels in width of camera sensor

h :

height of each pixel of the camera sensor

w :

width of each pixel of the camera sensor

F :

focal length of camera

θ_max :

maximum measurable elevation angle of the cameras

k :

absolute value of the tangent of θ_max
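The sensor and optics quantities above (sensor size W × H, pixel counts N_W and N_H, focal length F, angles of view α_h, α_v, α_d) are connected by the standard pinhole angle-of-view relations, e.g. α_h = 2 arctan(W / 2F). A minimal sketch, assuming these standard relations rather than the paper's exact equations:

```python
import math

def view_angles(W, H, F):
    """Standard pinhole angle-of-view relations for a sensor of width W,
    height H and focal length F (all in the same length unit)."""
    a_h = 2 * math.atan(W / (2 * F))                  # horizontal angle of view
    a_v = 2 * math.atan(H / (2 * F))                  # vertical angle of view
    a_d = 2 * math.atan(math.hypot(W, H) / (2 * F))   # diagonal angle of view
    return a_h, a_v, a_d

def angular_resolution(W, H, N_W, N_H, F):
    """Approximate per-pixel angular errors: one pixel of size w = W/N_W
    (or h = H/N_H) on the sensor subtends roughly w/F radians in azimuth
    (h/F in elevation) by the small-angle approximation."""
    w, h = W / N_W, H / N_H
    return w / F, h / F
```

Such relations let a designer translate a candidate camera's datasheet (sensor format, resolution, lens) into the angular error bounds δ_ψ and δ_θ that the workspace analysis takes as inputs.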

References

  1. Kriegman, D.J., Triendl, E., Binford, T.O.: Stereo Vision and Navigation in Buildings for Mobile Robots. IEEE Transaction on Robotics and Automation 5 (6), 792–793 (1989)

    Article  Google Scholar 

  2. Lee, J.S., Lim, J., Bien, Z.: Simple Stereo Algorithm for 3D Positioning of Object Points. Electron. Lett. 24 (18), 1139–1140 (1988)

    Article  Google Scholar 

  3. Weng, J., Cohen, P., Rebibo, N.: Motion and Structure Estimation from Stereo Image Sequences. IEEE Transaction on Robotics and Automation 8 (3), 362–382 (1992)

    Article  Google Scholar 

  4. Yang, C.C., Marefat, M.M., Ciarallo, F.W.: Error analysis and planning accuracy for dimensional measurement in active vision inspection. IEEE Transactions on Robotics and Automation 14 (3), 476–487 (1998)

    Article  Google Scholar 

  5. Pachidis, T.P., Lygouras, J.N.: Pseudostereo-vision system: a monocular stereo-vision system as a sensor for real-time robot applications. IEEE Transaction on Instrumentation and Measurement 56 (6), 2547–2560 (2007)

    Article  Google Scholar 

  6. Rovira-Ma’s, F., Wang, Q., Zhang, Q.: Design parameters for adjusting the visual field of binocular stereo cameras. Bio Systems Engineering 105, 59–70 (2010)

    Google Scholar 

  7. Bazeille, S., Barasuol, V., Focchi, M., Havoutis, I., Frigerio, M., Buchli, J., Caldwell, D.G., Semini, C.: Quadruped robot trotting over irregular terrain assisted by stereo-vision. Springer Special Issue on Intelligent Service Robotics. Special Issue (2014)

  8. García Carrillo, L.R., López, A.E.D., Lozano, R., Pégard, C.: Combining stereo vision and inertial navigation system for a quad-rotor UAV. Springer Journal of Intelligent and Robotic Systems 65 (1–4), 373–387 (2012)

    Article  Google Scholar 

  9. Asadi, E., Bottasso, C.L.: Tightly-coupled stereo vision-aided inertial navigation using feature-based motion sensors. Taylor & Francis in Advanced Robotics (2013)

  10. Iamrurksiri, A., Tsubouchi, T., Sarata, S.: Rock recognition using stereo vision for large rock breaking operation. Springer Tracts in Advanced Robotics 92, 383–397 (2014)

    Article  Google Scholar 

  11. Vahrenkamp, N., Boge, C., Welke, K., Asfour, T., Walter, J., Dillmann, R. : Visual servoing for dual arm motions on a humanoid robot. In: 9th IEEE-RAS International Conference on Humanoid Robots, pp. 208-214 (2009)

  12. Wikipedia: Triangulation. http://en.wikipedia.org/wiki/Triangulation (2013)

  13. Sanders-Reed, J.N.: Error propagation in two-sensor three-dimensional position estimation. Opt. Eng. 40 (4), 627–636 (2001)

    Article  Google Scholar 

  14. Ito, M., Tsujimichi, S., Kosuge, Y.: Tracking a three-dimensional moving target with two-dimensional angular measurements from multiple passive sensors. In: 38th Annual Conference Proceedings of the SICE, pp. 1117-1122 (1999)

  15. Panerai, F., Metta, G., Sandini, G.: Visuo-inertial stabilization in space-variant binocular systems. Robot. Auton. Syst. 30, 195–214 (2000)

    Article  Google Scholar 

  16. Gasteratos, A., Beltran, C., Metta, G., Sandini, G.: PRONTO: a system for mobile robot navigation via cad-model guidance. Microprocess. Microsyst. 26, 17–26 (2002)

    Article  Google Scholar 

  17. 4D systems. www.4dsystems.com.au (2012)

  18. Pachidis, T. P., Lygouras, J. N.: Pseudostereo-vision system: a monocular stereo-vision system as a sensor for real-time robot applications. IEEE Transactions on Instrumentation and Measurement 56 (6), 2547–2560 (2007)

    Article  Google Scholar 

  19. Sol‘a, J., Monin, A., Devy, M., Vidal-Calleja, T. : Fusing monocular information in multicamera SLAM. IEEE Trans. Robot. 24 (5), 958–968 (2008)

    Article  Google Scholar 

  20. De Silva, D.V.S.X., Ekmekcioglu, E., Fernando, W.A.C., Worrall, S.T.: Display dependent preprocessing of depth maps based on just noticeable depth difference modeling. IEEE Journal of Selected Topics in Signal Processing 5 (2), 335–351 (2011)

    Article  Google Scholar 

  21. Kim, W.-J., Kim, S.-D., Kim, J.: Analysis on the spectrum of a stereoscopic 3-D image and disparity-adaptive anti-aliasing filter. IEEE Transactions on Circuits and Systems for Video Technology 19 (10), 1561–1565 (2009)

    Article  Google Scholar 

  22. Ferre, M., Aracil, R., Sanchez-Uran, M.: Stereoscopic Human Interfaces. IEEE Robot. Autom. Mag., 50–57 (2008)

  23. Caraffi, C., Cattani, S., Grisleri, P.: Off-road path and obstacle detection using decision networks and stereo vision. IEEE Transactions on Intelligent Transportation Systems 8 (4), 607–618 (2007)

    Article  Google Scholar 

  24. Matthies, L., Gat, E., Harrison, R., Wilcox, B., Volpe, R., Litwin, T.: Mars microrover navigation: performance evaluation and enhancement. Auton. Robot 2, 291–311 (1995)

    Article  Google Scholar 

  25. Matthies, L., Maimone, M., Johnson, A., Cheng, Y., Willson, R., Villalpando, C., Goldberg, S., Huertas, A.: Computer vision on mars. Int. J. Comput. Vis. 75 (1), 67–92 (2007)

    Article  Google Scholar 

  26. Ayache, N., Lustman, F.: Trinocular stereo vision for robotics. IEEE Transactions on Pattern Analysis and Machine Intelligence 13 (1), 73–85 (1991)

    Article  Google Scholar 

  27. Jenniskens, P., Gural, P.S., Dynneson, L., Grigsby, B.J., Newmane, K.E., Borden, M., Koop, M., Holman, D.: CAMS: Cameras for Allsky Meteor Surveillance to establish minor meteor showers. Icarus 216 (1), 40–61 (2011)

    Article  Google Scholar 

  28. Wikipedia:image sensor format, http://en.wikipedia.org/wiki/Image_sensor_format#Table_of_sensor_formats_and_sizes (2013)

  29. Wikipedia: Angle of View, http://en.wikipedia.org/wiki/Angle_of_view (2013)

Download references

Author information

Correspondence to Morteza Rezaei.


About this article


Cite this article

Rezaei, M., Ozgoli, S. Lucid Workspace for Stereo Vision. J Intell Robot Syst 78, 223–237 (2015). https://doi.org/10.1007/s10846-014-0083-0


Keywords

Navigation