A Survey on 3D Visual Tracking of Multicopters
Three-dimensional (3D) visual tracking of a multicopter (where the camera is fixed while the multicopter is moving) means continuously recovering the six-degree-of-freedom pose of the multicopter relative to the camera. It can be used in many applications, such as precision terminal guidance and control algorithm validation for multicopters. However, it is difficult for many researchers to build a 3D visual tracking system for multicopters (VTSM) using cheap, off-the-shelf cameras. This paper first gives an overview of the three key technologies of a 3D VTSM: multi-camera placement, multi-camera calibration and pose estimation for multicopters. Then, some representative 3D visual tracking systems for multicopters are introduced. Finally, the future development of 3D VTSMs is analyzed and summarized.
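One building block common to the multi-camera tracking systems surveyed here is recovering a marker's 3D position from two or more calibrated views. The sketch below is an illustrative, minimal linear (DLT) triangulation in NumPy, not the method of any particular system in the survey; the camera intrinsics, poses and the test point are hypothetical values chosen for the example.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2 : 3x4 camera projection matrices (known from calibration).
    x1, x2 : 2D pixel observations of the same marker in each view.
    Returns the 3D point in the common world frame.
    """
    # Each view contributes two linear constraints on the homogeneous point.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Hypothetical setup: two cameras with the same intrinsics K, the second
# translated 1 m along the world x-axis, both looking down the z-axis.
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Project a known marker position into both views, then recover it.
X_true = np.array([0.2, -0.1, 4.0])
h1 = P1 @ np.append(X_true, 1.0)
h2 = P2 @ np.append(X_true, 1.0)
x1, x2 = h1[:2] / h1[2], h2[:2] / h2[2]

X_est = triangulate(P1, P2, x1, x2)
print(np.allclose(X_est, X_true, atol=1e-6))
```

With noise-free synthetic observations the DLT recovers the point exactly; in a real VTSM the observations are noisy and more than two cameras are typically fused, with a nonlinear refinement (or a filter, as in several surveyed systems) on top of this linear initialization.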
Keywords: Multicopter, three-dimensional (3D) visual tracking, camera placement, camera calibration, pose estimation
This work was supported by the National Key Research and Development Program of China (No. 2017YFB1300102) and National Natural Science Foundation of China (No. 61803025).
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made.
To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0.