Global optimal searching for textureless 3D object tracking
Tracking the position and orientation of a textureless 3D object is a considerably challenging problem, for which a 3D model is commonly used. Establishing 3D–2D correspondences between a known 3D object model and 2D scene edges in an image is the standard way to locate the 3D object, and it is one of the most important problems in model-based 3D object tracking. State-of-the-art methods search for these correspondences independently, which often fails in highly cluttered backgrounds owing to the presence of numerous local minima. To overcome this problem, we propose a new method that searches for the correspondences by global optimization. In our search mechanism, a graph model based on an energy function establishes the relationships among candidate correspondences; the optimal correspondences are then found efficiently with dynamic programming. Qualitative and quantitative experimental results demonstrate that the proposed method performs favorably against state-of-the-art methods in highly cluttered backgrounds.
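The abstract's core idea, selecting one 2D edge candidate per projected model point so that a chain-structured energy (a unary matching cost plus a pairwise smoothness term between neighboring points) is minimized, can be sketched with a Viterbi-style dynamic program. This is an illustrative reconstruction, not the paper's actual implementation: the function name `dp_correspondences` and the specific cost structure are assumptions.

```python
def dp_correspondences(unary, pairwise):
    """Minimize a chain-structured correspondence energy by dynamic programming.

    unary[i][k]      : cost of assigning edge candidate k to model point i
    pairwise(i, j, k): smoothness cost when point i-1 takes candidate j
                       and point i takes candidate k
    Returns (minimum total energy, list of chosen candidate indices).
    """
    n = len(unary)
    # cost[i][k]: minimum energy over points 0..i with point i at candidate k
    cost = [list(unary[0])]
    back = [[None] * len(unary[0])]
    for i in range(1, n):
        row, ptr = [], []
        for k in range(len(unary[i])):
            # best predecessor candidate for this choice
            best = min(range(len(unary[i - 1])),
                       key=lambda j: cost[i - 1][j] + pairwise(i, j, k))
            row.append(cost[i - 1][best] + pairwise(i, best, k) + unary[i][k])
            ptr.append(best)
        cost.append(row)
        back.append(ptr)
    # backtrack the globally optimal assignment
    k = min(range(len(cost[-1])), key=lambda kk: cost[-1][kk])
    path = [k]
    for i in range(n - 1, 0, -1):
        k = back[i][k]
        path.append(k)
    path.reverse()
    return min(cost[-1]), path
```

For a chain of n model points with at most m candidates each, this runs in O(n·m²), which is what makes the global search tractable compared with evaluating every joint assignment.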
Keywords: 3D tracking · 3D–2D correspondence · Global optimization · Dynamic programming
The authors gratefully acknowledge the anonymous reviewers for their comments and their enormous help in revising this paper. This work is supported by the 973 Program of China (No. 2015CB352500), the 863 Program of China (No. 2015AA016405), and the NSF of China (Nos. 61173070, 61202149).