Abstract
Several visual following robots have been proposed in recent years. However, many require several expensive sensors, and the bulk of the image processing and other computations is often performed off-board. This paper proposes a simple, cost-effective, yet robust visual following robot capable of tracking a generic object with few restrictions on target characteristics. To detect the target, tracking-learning-detection (TLD) is used within a Bayesian framework that filters and fuses the measurements. A time-of-flight (ToF) depth camera refines the distance estimates at short ranges. The algorithms run in real time (approximately 30 fps) on a Jetson TK1 embedded computer. Experiments were conducted with different target objects to validate the system under occlusions and various illumination conditions, and to show how fusing TLD and ToF data improves the distance estimation.
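The abstract describes fusing a TLD-derived distance estimate with short-range ToF depth readings in a Bayesian framework. A minimal sketch of that idea is a scalar Kalman filter that corrects a coarse distance measurement with a more precise depth reading when one is available; all class names, noise variances, and the range cutoff below are illustrative assumptions, not the paper's implementation.

```python
# Minimal 1-D Kalman filter fusing two distance measurements:
# a coarse estimate derived from the TLD bounding-box scale and a
# ToF depth reading that is valid only at short range. Noise values
# and the 1.5 m cutoff are illustrative assumptions.

class DistanceFuser:
    def __init__(self, z0=2.0):
        self.z = z0      # estimated target distance (m)
        self.p = 1.0     # estimate variance
        self.q = 0.05    # process noise (target/robot motion per frame)

    def _update(self, meas, r):
        # Standard scalar Kalman predict/correct with measurement variance r.
        self.p += self.q                 # predict: variance grows
        k = self.p / (self.p + r)        # Kalman gain
        self.z += k * (meas - self.z)    # correct toward the measurement
        self.p *= (1.0 - k)
        return self.z

    def fuse(self, tld_dist, tof_dist=None):
        # TLD scale gives a noisy distance at any range.
        self._update(tld_dist, r=0.5)
        # The ToF camera refines the estimate only when the target is
        # close enough for a valid depth return.
        if tof_dist is not None and tof_dist < 1.5:
            self._update(tof_dist, r=0.05)
        return self.z
```

Because the ToF measurement variance is much smaller than that of the TLD-based estimate, the fused estimate is pulled strongly toward the depth reading at short range, which matches the abstract's claim that the ToF camera refines close-range distance estimates.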
Notes
- 1.
- 2. Although other ToF cameras such as the classic SR4000 from MESA Imaging, the PMD CamCube 3.0, or SoftKinetic's DS536A have ranges of up to 5 m, the low-cost and lightweight Senz3D was deemed sufficient for our purposes.
- 3. Note that the set points \(s_{p_u}\) and \(s_{p_z}\) correspond to the desired target position with respect to the robot, not to the actual robot position. The controllers use the set points to move the robot so that the difference between the estimated position and the set point is minimized.
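The set-point idea in the note above can be sketched as a pair of proportional controllers that drive the estimated target position toward the desired one; the gains, variable names, and default set points below are illustrative assumptions, not the controllers described in the paper.

```python
# Hedged sketch of the set-point scheme: proportional controllers move
# the robot so the estimated target position (u: horizontal image
# offset, z: distance) approaches the set points sp_u and sp_z.
# Gains and defaults are illustrative assumptions.

def control(u_est, z_est, sp_u=0.0, sp_z=1.5, k_turn=0.8, k_fwd=0.6):
    """Return (angular, linear) velocity commands."""
    # The error is between the *estimated target position* and the set
    # point; the set point is a desired target pose relative to the
    # robot, not a robot pose.
    angular = k_turn * (sp_u - u_est)   # turn to center the target
    linear = k_fwd * (z_est - sp_z)     # advance/retreat to hold range
    return angular, linear
```

When the target sits exactly at the set point, both commands vanish and the robot holds its position, which is the minimization behavior the note describes.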
Copyright information
© 2016 Springer International Publishing AG
About this paper
Cite this paper
Guevara, A.E., Hoak, A., Bernal, J.T., Medeiros, H. (2016). Vision-Based Self-contained Target Following Robot Using Bayesian Data Fusion. In: Bebis, G., et al. (eds.) Advances in Visual Computing. ISVC 2016. Lecture Notes in Computer Science, vol. 10072. Springer, Cham. https://doi.org/10.1007/978-3-319-50835-1_76
Print ISBN: 978-3-319-50834-4
Online ISBN: 978-3-319-50835-1
eBook Packages: Computer Science, Computer Science (R0)