
Parabolic Flight Reconstruction from Multiple Images from a Single Camera in General Position

  • Raúl Rojas
  • Mark Simon
  • Oliver Tenchio
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4434)

Abstract

This paper shows that it is possible to retrieve all parameters of the parabolic flight trajectory of an object from a time-stamped sequence of images captured by a single camera observing the scene. Surprisingly, two cameras (stereo vision) are not necessary to determine the coordinates of the moving object with respect to the floor. The technique described in this paper can thus be used to determine the three-dimensional trajectory of a ball kicked by a robot. In the limiting case, the whole calculation requires just three measurements of the ball position captured in three consecutive frames. The technique can therefore forecast the future motion of the ball a few milliseconds after the kick has taken place. The computation is fast and allows a robot goalie to move to the correct blocking position. Interestingly, this technique can also be used to self-calibrate stereo cameras.
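The following minimal sketch (not the authors' exact formulation) illustrates why three time-stamped observations suffice in the limiting case: under gravity the ball follows x(t) = x0 + v0·t + ½·g·t², leaving six unknowns (x0, v0), and each image observation contributes two linear constraints on the 3D position, so three observations yield six equations. The sketch assumes a calibrated camera with a known 3×4 projection matrix P; the function name and the NumPy least-squares solution are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

G = np.array([0.0, 0.0, -9.81])  # gravity in world coordinates (assumed z-up)

def reconstruct_parabola(P, observations):
    """Hypothetical sketch: fit x(t) = x0 + v0*t + 0.5*G*t^2 from a single camera.

    P            : 3x4 camera projection matrix (assumed known/calibrated).
    observations : iterable of (t, u, v) time-stamped pixel coordinates.
    Returns (x0, v0), the initial position and velocity of the parabola.
    """
    A, b = [], []
    for t, u, v in observations:
        grav = 0.5 * G * t * t  # known gravity contribution at time t
        # Projection: lambda*[u, v, 1]^T = P * [x(t), 1]^T.
        # Eliminating lambda gives (u*P[2] - P[0]) . [x(t), 1] = 0 and
        # (v*P[2] - P[1]) . [x(t), 1] = 0, i.e. two linear equations per frame.
        for pix, row in ((u, 0), (v, 1)):
            c = pix * P[2] - P[row]                      # 4-vector of coefficients
            # c[:3] . (x0 + v0*t) = -c[3] - c[:3] . grav
            A.append(np.concatenate([c[:3], c[:3] * t]))
            b.append(-c[:3] @ grav - c[3])
    A, b = np.asarray(A), np.asarray(b)
    params, *_ = np.linalg.lstsq(A, b, rcond=None)       # exact with 3 frames, LS with more
    return params[:3], params[3:]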

Keywords

Multiple Image · Ball Position · Single Camera · Goal Line · Ball Trajectory


Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Raúl Rojas (1)
  • Mark Simon (1)
  • Oliver Tenchio (1)
  1. Institut für Informatik, Freie Universität Berlin, Takustr. 9, 14195 Berlin, Germany
