Video as Input: Spiral Search with the Sparse Angular Sampling

  • Tatiana V. Evreinova
  • Grigori Evreinov
  • Roope Raisamo
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4263)


This paper presents an improved cross-correlation algorithm for template-based object tracking: a reduced spiral search with sparse angular sampling. The basic parameters of the algorithm for real-time face tracking were evaluated with regard to their impact on performance: the minimum number of pixels and the size of the template, the correlation threshold and drift, and the search parameters, namely radius, shift, direction, and rotation of the template. We demonstrate that the information provided by the grid-like template can be reduced to 16 pixels with a grid step of 15 pixels. A spiral search in 8 directions with a minimum shift of 1 pixel decreases the number of computations by a factor of 20. When activated sequentially, template rotation does not increase performance, but it makes the tracking adaptive and robust.
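To give a concrete feel for the approach described in the abstract, the following minimal Python sketch illustrates sparse grid sampling of a template and a spiral search over 8 directions with an early-exit correlation threshold. The function names, the 4×4 grid layout, and the NumPy-based grayscale frame representation are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sparse_template(frame, center, n_points=16, grid_step=15):
    """Sample a sparse grid-like template around `center`.
    Assumes a 4x4 grid (16 pixels) with a 15-pixel step, matching the
    parameter values reported in the paper; the exact layout is assumed."""
    cy, cx = center
    side = int(np.sqrt(n_points))                       # 4x4 grid for 16 points
    offsets = (np.arange(side) - (side - 1) / 2) * grid_step
    ys = np.clip((cy + offsets).astype(int), 0, frame.shape[0] - 1)
    xs = np.clip((cx + offsets).astype(int), 0, frame.shape[1] - 1)
    return np.array([frame[y, x] for y in ys for x in xs], dtype=float)

def match_score(a, b):
    """Normalized cross-correlation between two sampled templates."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def spiral_search(frame, template, start, threshold=0.9,
                  max_radius=30, shift=1):
    """Spiral outwards from `start` in 8 directions (N, NE, E, ...),
    returning the first candidate whose correlation with `template`
    exceeds `threshold`. A sketch of a reduced spiral search, not the
    authors' exact procedure."""
    directions = [(-1, 0), (-1, 1), (0, 1), (1, 1),
                  (1, 0), (1, -1), (0, -1), (-1, -1)]
    best_pos = start
    best_score = match_score(template, sparse_template(frame, start))
    for radius in range(shift, max_radius + 1, shift):
        for dy, dx in directions:
            cand = (start[0] + dy * radius, start[1] + dx * radius)
            score = match_score(template, sparse_template(frame, cand))
            if score > best_score:
                best_pos, best_score = cand, score
            if best_score >= threshold:
                return best_pos, best_score   # early exit limits computations
    return best_pos, best_score
```

In use, the template would be sampled once around a selected facial landmark in the initial frame, and spiral_search would then be called on each subsequent frame starting from the previous match position; the early exit once the correlation threshold is exceeded is what reduces the number of correlation evaluations compared with an exhaustive window scan.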


Keywords: Algorithm Performance · Search Area · Correlation Threshold · Facial Landmark · Sample Candidate





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Tatiana V. Evreinova¹
  • Grigori Evreinov¹
  • Roope Raisamo¹

  1. TAUCHI Computer-Human Interaction Unit, Department of Computer Sciences, University of Tampere, Finland
