Gaze Direction Determination of Opponents and Teammates in Robot Soccer

  • Patricio Loncomilla
  • Javier Ruiz-del-Solar
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4020)


Gaze direction determination of opponents and teammates is a very important ability for any soccer player, human or robot. However, this ability has not yet been developed in any of the RoboCup soccer leagues. We aim to change this situation by proposing a gaze direction determination system for robot soccer; the system is designed primarily for the four-legged league, but it can be extended to other leagues. The system is based on a robot-head pose detection system consisting of two main processing stages: (i) computation of scale-invariant local descriptors of the observed scene, and (ii) matching of these descriptors against descriptors of robot-head prototypes stored in a model database. Once the robot-head pose is detected, the robot's gaze direction is determined using a head model of the observed robot and the current 3D position of the observing robot's camera. Experimental results of the proposed approach are presented.
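
The abstract describes the pipeline only at this level of detail. As a rough illustration, the sketch below shows how such a two-stage pipeline could look, assuming OpenCV's SIFT implementation, a brute-force matcher with a distance-ratio test, and an arbitrary head-model forward axis; all function names, thresholds, and coordinate conventions are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of the described pipeline (not the authors' code):
# (i) scale-invariant local descriptors of the observed scene,
# (ii) matching against descriptors of a stored robot-head prototype,
# then a gaze-direction estimate from the detected head pose.
import cv2
import numpy as np

sift = cv2.SIFT_create()  # assumes OpenCV >= 4.4

def describe(image_gray):
    """Stage (i): compute scale-invariant keypoints and descriptors."""
    keypoints, descriptors = sift.detectAndCompute(image_gray, None)
    return keypoints, descriptors

def match_prototype(scene_desc, prototype_desc, ratio=0.8):
    """Stage (ii): match scene descriptors against one robot-head prototype,
    keeping matches that pass the distance-ratio test (ratio is illustrative)."""
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = []
    for pair in matcher.knnMatch(scene_desc, prototype_desc, k=2):
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good.append(pair[0])
    return good

def gaze_direction(R_head_in_cam, R_cam_in_world):
    """Turn a detected head pose into a world-frame gaze direction.
    R_head_in_cam: 3x3 rotation of the observed head model in the observer's
    camera frame (from the head-pose detection stage).
    R_cam_in_world: 3x3 rotation of the observing robot's camera in the world
    frame (from the observer's own localization)."""
    forward_axis = np.array([0.0, 0.0, 1.0])   # assumed forward axis of the head model
    gaze_cam = R_head_in_cam @ forward_axis    # gaze expressed in the camera frame
    gaze_world = R_cam_in_world @ gaze_cam     # gaze expressed in the world frame
    return gaze_world / np.linalg.norm(gaze_world)
```

Note that the paper matches against a database of robot-head prototypes rather than a single prototype, and that recovering the full gaze line (not just its direction) would additionally use the observer camera's 3D position, as stated in the abstract.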


Keywords: Soccer Player · Interest Point · Affine Transformation · Scale Invariant Feature Transform · Orientation Histogram





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Patricio Loncomilla (1)
  • Javier Ruiz-del-Solar (1, 2)
  1. Department of Electrical Engineering, Universidad de Chile
  2. Center for Web Research, Department of Computer Science, Universidad de Chile
