Enhancing the Point Feature Tracker by Adaptive Modelling of the Feature Support
We consider the problem of tracking a given set of point features over long sequences of image frames. A classic procedure for monitoring tracking quality requires that each tracked feature warps consistently onto its reference appearance. The procedure recommends focusing on features projected from planar 3D patches (planar features), by enforcing a conservative threshold on the residual between the warped current feature and the reference. However, in some important contexts, many features satisfy the planarity assumption only partially, while truly planar features are scarce. This is especially true when the camera motion is mainly translational and parallel to the optical axis (such as when driving a car along straight sections of road), which induces a continual increase in apparent feature size. Tracking features that contain occluding boundaries then becomes an interesting goal, for which we propose a multi-scale monitoring solution striving to maximize feature lifetime while also detecting tracking failures. The devised technique infers the parts of the reference that are not projected from the same 3D surface as the patch which has been consistently tracked up to the current frame. Experiments on real sequences taken from cars driving through urban environments show that the technique is effective in increasing average feature lifetimes, especially in sequences with occlusions and large photometric variations.
Keywords: Median Absolute Deviation · Adaptive Modelling · Visual Servoing · Error Image · Feature Support
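The core idea described in the abstract — inferring which pixels of the reference belong to the consistently tracked surface by examining the error image — can be illustrated with a small sketch. The snippet below is an assumption-laden illustration, not the authors' implementation: it uses the median absolute deviation (MAD, one of the paper's keywords) as a robust scale estimate on the error image, flags pixels whose residual deviates strongly from the bulk as lying outside the feature support (e.g. on an occluding boundary), and evaluates the tracking residual only over the inlier support. The function name, the cutoff `k`, and the RMS residual are illustrative choices.

```python
import numpy as np

def estimate_feature_support(reference, warped_current, k=3.0):
    """Robust support estimation on the error image via MAD (illustrative sketch).

    Pixels whose residual deviates strongly from the bulk of the error
    distribution are flagged as outside the feature support, so that
    occluded or non-coplanar parts do not inflate the tracking residual.
    """
    error = warped_current.astype(float) - reference.astype(float)
    med = np.median(error)
    mad = np.median(np.abs(error - med))
    # 1.4826 * MAD is consistent with the standard deviation under Gaussian noise;
    # the small epsilon guards against a degenerate zero-MAD error image.
    scale = 1.4826 * mad + 1e-9
    support = np.abs(error - med) <= k * scale
    # Residual evaluated only over the inferred support.
    rms = np.sqrt(np.mean(error[support] ** 2)) if support.any() else np.inf
    return support, rms
```

In this sketch, a feature whose border pixels project from a different 3D surface would see those pixels excluded from `support`, so the monitored residual reflects only the consistently tracked patch, extending the feature's usable lifetime.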