Multimedia Tools and Applications

Volume 78, Issue 1, pp 877–896

Study of multiple moving targets’ detection in fisheye video based on the moving blob model

  • Jianhui Wu
  • Feng Huang
  • Wenjing Hu
  • Wei He
  • Bing Tu
  • Longyuan Guo
  • Xianfeng Ou
  • Guoyun Zhang


This paper discusses improved algorithms for detecting and tracking multiple moving targets in fisheye video sequences based on a moving blob model. The fisheye lens used in our system has a field of view of 183 degrees, which makes it especially effective for blind-spot-free surveillance. However, fisheye images suffer from heavy distortion, which makes intelligent processing difficult. In this paper we establish a moving blob model to detect and track multiple moving targets in fisheye video sequences, so as to bring automation and intelligence to blind-spot-free surveillance. The approach consists of four steps. First, the distortion model of the fisheye lens is established: we analyze the imaging principle of the fisheye lens and calculate the distortion coefficients used in the moving blob model. Second, the moving blob model is analyzed in detail on the basis of the fisheye distortion model. It comprises four main algorithms: the traditional background extraction algorithm; a background updating algorithm; background subtraction of the fisheye video sequence to obtain the moving blobs; and an algorithm for removing blob shadows in RGB space. Third, we verify that each extracted blob is a real moving target by comparing its pixel count against a threshold, which discards false targets. Finally, we design an algorithm that tracks the moving targets by computing the geometric center of each selected blob. Experiments indicate that each algorithm processes multiple moving targets in fisheye video sequences efficiently. Compared with the traditional algorithm, the improved algorithm detects moving targets in a circular fisheye image effectively and stably.
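The detection pipeline summarized above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the parameter names and values (ALPHA, DIFF_THRESH, MIN_AREA) are illustrative assumptions, the background update is a simple running average, and the RGB shadow-removal and fisheye-distortion steps are omitted for brevity.

```python
# Sketch of the moving-blob pipeline: background subtraction, running-average
# background update, connected-blob extraction with an area threshold, and
# centroid computation for tracking. All parameter values are assumptions.
import numpy as np
from collections import deque

ALPHA = 0.05        # background update rate (assumed)
DIFF_THRESH = 30    # intensity-difference threshold (assumed)
MIN_AREA = 4        # minimum pixel count for a real moving target (assumed)

def update_background(bg, frame, alpha=ALPHA):
    """Running-average background update: bg <- (1 - a) * bg + a * frame."""
    return (1.0 - alpha) * bg + alpha * frame

def extract_blobs(frame, bg):
    """Subtract the background, threshold the difference, label 4-connected
    blobs, discard blobs smaller than MIN_AREA, and return blob centroids."""
    mask = np.abs(frame.astype(float) - bg) > DIFF_THRESH
    labels = np.zeros(mask.shape, dtype=int)
    blobs = []
    next_label = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue                      # pixel already belongs to a blob
        next_label += 1
        pixels, queue = [], deque([seed])
        labels[seed] = next_label
        while queue:                      # BFS over the 4-neighbourhood
            y, x = queue.popleft()
            pixels.append((y, x))
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = next_label
                    queue.append((ny, nx))
        if len(pixels) >= MIN_AREA:       # discard spurious (false) targets
            ys, xs = zip(*pixels)
            blobs.append((float(sum(ys)) / len(ys),   # geometric center
                          float(sum(xs)) / len(xs)))
    return blobs

# Toy example: one bright 3x3 square against a dark background.
bg = np.zeros((10, 10))
frame = np.zeros((10, 10))
frame[2:5, 3:6] = 200
centers = extract_blobs(frame, bg)
print(centers)   # → [(3.0, 4.0)]
bg = update_background(bg, frame)
```

In a full system the centroids returned per frame would be matched across frames to track each target, and the distortion coefficients of the fisheye model would be applied before measuring blob geometry.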


Keywords: Multiple moving targets · Fisheye system · Moving blob model · Algorithm design



This work was supported in part by the Hunan Provincial Natural Science Foundation (2016JJ2064, 2017JJ3099) and the Open Fund of the Education Department of Hunan Province (15K051). The experiments were also supported in part by the Research Fund of the Science and Technology Program of Hunan Province (2016TP1021) and the Fund of the Education Department of Hunan Province (16C0723). The contents are solely the responsibility of the authors and do not necessarily represent the official views of the funders. We also thank the Research Center for Heterogeneous Computing and Applications, Hunan Institute of Science and Technology, for extensive experimental support.



Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  • Jianhui Wu 1, 2
  • Feng Huang 1, 2
  • Wenjing Hu 1, 2
  • Wei He 1, 2
  • Bing Tu 1, 2
  • Longyuan Guo 1, 2
  • Xianfeng Ou 1, 2 (corresponding author)
  • Guoyun Zhang 1, 2 (corresponding author)
  1. Key Laboratory of Optimization and Control for Complex Systems, College of Information & Communication Engineering, Shanghai, China
  2. Research Center for Heterogeneous Computing and Its Application, Hunan Institute of Science & Technology, Yueyang, China
