Abstract
Visual motion perception from a moving observer is the case most often encountered in real-life situations. Although it is a complex and challenging problem, it can foster the development of new applications. This article presents an autonomous robotic system designed for active surveillance, together with a novel dense optical flow technique. Several optical flow techniques have been proposed for motion perception; however, most of them are too computationally demanding for autonomous mobile systems. The proposed HybridTree method identifies the intrinsic nature of the motion by performing two consecutive operations: expectation and sensing. During the expectation phase, descriptive properties of the image are retrieved using a tree-based scheme. In the sensing operation, the properties of image regions are used by a hybrid and hierarchical optical flow structure to estimate the flow field. The experiments show that the proposed method extracts reliable visual motion information in a short period of time and is well suited to applications that lack specialized computing devices. The HybridTree therefore differs from other techniques in that it introduces a new perspective on motion perception: high-level information about the image sequence is integrated into the estimation of the optical flow. In addition, it meets most robotic and surveillance demands, and the resulting flow field is less computationally demanding than that of other state-of-the-art methods.
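The expectation/sensing pipeline described above can be illustrated with a minimal sketch. This is not the authors' HybridTree implementation; it only mirrors the two-stage idea under simplifying assumptions: the "expectation" phase is approximated here by a quadtree split driven by intensity variance, and the "sensing" phase by a per-region Lucas-Kanade least-squares flow estimate. All function names (`quadtree_leaves`, `region_flow`) and thresholds are hypothetical choices for illustration.

```python
import numpy as np

def quadtree_leaves(img, x0, y0, size, min_size=8, var_thresh=25.0):
    """Expectation phase (illustrative): recursively split a square image
    into quadrants until each leaf region is roughly homogeneous."""
    block = img[y0:y0 + size, x0:x0 + size]
    if size <= min_size or block.var() <= var_thresh:
        return [(x0, y0, size)]
    h = size // 2
    leaves = []
    for dy in (0, h):
        for dx in (0, h):
            leaves += quadtree_leaves(img, x0 + dx, y0 + dy, h,
                                      min_size, var_thresh)
    return leaves

def region_flow(img1, img2, leaves):
    """Sensing phase (illustrative): solve the Lucas-Kanade normal
    equations independently inside each leaf region."""
    Iy, Ix = np.gradient(img1.astype(float))      # spatial gradients
    It = img2.astype(float) - img1.astype(float)  # temporal gradient
    flow = {}
    for (x0, y0, s) in leaves:
        ix = Ix[y0:y0 + s, x0:x0 + s].ravel()
        iy = Iy[y0:y0 + s, x0:x0 + s].ravel()
        it = It[y0:y0 + s, x0:x0 + s].ravel()
        A = np.stack([ix, iy], axis=1)
        AtA = A.T @ A
        # skip ill-conditioned (textureless) regions rather than guess
        if np.linalg.cond(AtA) < 1e6:
            u, v = np.linalg.solve(AtA, -A.T @ it)
            flow[(x0, y0, s)] = (u, v)
    return flow
```

On a synthetic pair of frames shifted by one pixel horizontally, the per-leaf estimates cluster around (1, 0); textureless leaves are simply skipped, which is one way a region-level decomposition can cut cost relative to a uniformly dense solver.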
Pinto, A.M., Moreira, A.P., Correia, M.V. et al. A Flow-based Motion Perception Technique for an Autonomous Robot System. J Intell Robot Syst 75, 475–492 (2014). https://doi.org/10.1007/s10846-013-9999-z