An Effective Active Vision System for Gaze Control

  • Yann Ducrocq
  • Shahram Bahrami
  • Luc Duvieubourg
  • François Cabestaing
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5359)

Abstract

This paper presents the performance of an active vision system that mimics human gaze control. A human can shift gaze either by quickly moving the fixation point or by keeping a moving target in the fovea, the region of highest resolution. These two visual behaviors are called saccadic and smooth-pursuit eye movements, respectively. To mimic this human behavior, we have developed a novel active vision system based on a particular stereo-vision setup composed of one camera, one prism, and a set of mirrors. To point the field of view of the sensor at a target, the prism is rotated about its axis by a motorized stage. The system is designed for fast and accurate dynamic adjustment of gaze. To study the mechanical performance of our active vision system, we used three classical input signals: a step signal that simulates a change of target (saccadic eye movement), and a velocity ramp and a sinusoidal signal that simulate a moving target (smooth pursuit). Whatever the input signal, the objective is to keep the target in the middle of the image. The experiments demonstrate the efficiency of our vision sensor in terms of dynamic properties and measurement accuracy.
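The evaluation methodology above can be illustrated with a minimal simulation. The sketch below is not the authors' controller; it assumes a simple first-order proportional servo for the prism angle and generates the three classical reference signals (step, velocity ramp, sinusoid), measuring how well the simulated gaze stays on target. The gain `k`, signal amplitudes, and step count are all illustrative values, not taken from the paper.

```python
import math

def simulate_tracking(reference, k=0.5, steps=200):
    """Hypothetical first-order servo: at each time step the prism angle
    moves a fraction k of the remaining angular error toward the target.
    Returns the absolute tracking error after each update."""
    angle = 0.0
    errors = []
    for t in range(steps):
        target = reference(t)
        angle += k * (target - angle)       # proportional correction
        errors.append(abs(target - angle))  # residual error this step
    return errors

# The three classical test signals described in the abstract:
step = lambda t: 10.0                      # saccade: sudden change of target
ramp = lambda t: 0.05 * t                  # smooth pursuit: constant velocity
sine = lambda t: 5.0 * math.sin(0.05 * t)  # smooth pursuit: oscillating target
```

With this toy model, the step error decays geometrically (the "saccade" converges), while the ramp and sinusoid settle to a small bounded lag, mirroring the qualitative behavior one would expect from the real motorized stage.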

Keywords

Smooth Pursuit · Active Vision · Virtual Camera · Saccadic Movement · Active Vision System

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Yann Ducrocq (1)
  • Shahram Bahrami (1)
  • Luc Duvieubourg (2)
  • François Cabestaing (2)
  1. Département Automatique, École d’Ingénieurs du Pas-de-Calais, Longuenesse Cedex, France
  2. Laboratoire LAGIS - UMR CNRS 8146, Université des Sciences et Technologies de Lille, Villeneuve d’Ascq, France