
A Flow-based Motion Perception Technique for an Autonomous Robot System

Published in: Journal of Intelligent & Robotic Systems

Abstract

Visual motion perception from a moving observer is the case most often encountered in real-life situations. It is a complex and challenging problem, yet it opens the way for new applications. This article presents an autonomous robotic system designed for active surveillance, together with a dense optical flow technique. Many optical flow techniques have been proposed for motion perception; however, most are too computationally demanding for autonomous mobile systems. The proposed HybridTree method identifies the intrinsic nature of the motion by performing two consecutive operations: expectation and sensing. During the expectation phase, descriptive properties of the image are retrieved using a tree-based scheme. In the sensing operation, the properties of image regions are used by a hybrid and hierarchical optical flow structure to estimate the flow field. The experiments show that the proposed method extracts reliable visual motion information in a short period of time and is well suited to applications without specialized computing hardware. The HybridTree therefore differs from other techniques in that it introduces a new perspective on motion perception: high-level information about the image sequence is integrated into the estimation of the optical flow. In addition, it meets most robotic and surveillance demands, and computing the resulting flow field is less demanding than with other state-of-the-art methods.
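The expectation/sensing split described above can be pictured with a small sketch. The following is an illustrative toy example, not the authors' implementation: a variance-driven quadtree recursively partitions an image into roughly homogeneous regions (the tree-based "expectation" step), and each leaf is labeled so that a subsequent "sensing" step could assign a different flow estimator per region (for instance, a local method for textured leaves and a global one for flat leaves). The threshold, labels, and region test are all hypothetical choices made for the sketch.

```python
# Hypothetical sketch of a HybridTree-style "expectation" phase:
# a quadtree splits the image wherever intensity variance is high,
# so the "sensing" phase can later pick a flow estimator per leaf.

def variance(img, x, y, size):
    """Intensity variance of the square region with top-left corner (x, y)."""
    vals = [img[y + j][x + i] for j in range(size) for i in range(size)]
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

def quadtree_leaves(img, x=0, y=0, size=None, thresh=100.0, min_size=2):
    """Return leaf regions (x, y, size, label) of a variance-driven quadtree."""
    if size is None:
        size = len(img)
    if size <= min_size or variance(img, x, y, size) <= thresh:
        # Homogeneous (or minimal) region: label it for the sensing phase.
        label = "textured" if variance(img, x, y, size) > thresh else "flat"
        return [(x, y, size, label)]
    half = size // 2
    leaves = []
    for dx, dy in ((0, 0), (half, 0), (0, half), (half, half)):
        leaves += quadtree_leaves(img, x + dx, y + dy, half, thresh, min_size)
    return leaves

# Toy 8x8 image: flat dark left half, bright checkerboard-textured right half.
img = [[10] * 4 + [200 + ((i + j) % 2) * 40 for i in range(4)] for j in range(8)]
leaves = quadtree_leaves(img, thresh=100.0)
print(len(leaves), {lab for _, _, _, lab in leaves})
```

On this toy image the flat left half is covered by two large leaves while the textured right half is split down to 2x2 leaves, which is the property the sensing phase would exploit: cheap estimators over large uniform regions, finer ones only where the image content demands it.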



Author information


Corresponding author

Correspondence to Andry Maykol Pinto.


About this article

Cite this article

Pinto, A.M., Moreira, A.P., Correia, M.V. et al. A Flow-based Motion Perception Technique for an Autonomous Robot System. J Intell Robot Syst 75, 475–492 (2014). https://doi.org/10.1007/s10846-013-9999-z

