
Hierarchical architecture for motion and depth estimations based on color cues

Published in Journal of Real-Time Image Processing (Special Issue)

Abstract

This work presents an FPGA implementation of a highly parallel architecture for motion and disparity estimation from color images. Our system implements the well-known Lucas & Kanade algorithm with a multi-scale extension for the computation of large displacements using color cues. We empirically fulfill real-time requirements, computing up to 32 and 36 frames per second for optical flow and disparity, respectively, at 640 × 480 resolution. In this paper, we present our design technique based on fine-grain pipelines, our architecture, and benchmarks of the different color-based alternatives, analyzing the trade-off between accuracy and resource utilization. Finally, we include qualitative results and the resource utilization of our platform, concluding that the system achieves a good balance between the increase in resources and the improvement in the precision and density of our results compared with other approaches described in the literature.
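To illustrate the core idea behind the architecture, the following is a minimal single-scale sketch of Lucas & Kanade optical flow extended with color cues: the brightness-constancy constraints of every color channel in a local window are stacked into one over-determined linear system. This is only a hypothetical software sketch for intuition (function name, window size, and conditioning threshold are our own choices); the paper's actual contribution is a fine-grain pipelined FPGA design with a multi-scale extension for large displacements, which this snippet does not reproduce.

```python
import numpy as np

def lucas_kanade_color(frame0, frame1, win=7):
    """Single-scale Lucas & Kanade flow using all color channels.

    frame0, frame1: float arrays of shape (H, W, C).
    Returns per-pixel flow fields (u, v), each of shape (H, W).
    """
    H, W, C = frame0.shape
    # Spatial derivatives (central differences) and temporal derivative,
    # computed independently for each color channel.
    Ix = np.zeros_like(frame0)
    Iy = np.zeros_like(frame0)
    Ix[:, 1:-1] = (frame0[:, 2:] - frame0[:, :-2]) / 2.0
    Iy[1:-1, :] = (frame0[2:, :] - frame0[:-2, :]) / 2.0
    It = frame1 - frame0

    r = win // 2
    u = np.zeros((H, W))
    v = np.zeros((H, W))
    for y in range(r, H - r):
        for x in range(r, W - r):
            # Stack the constraint Ix*u + Iy*v = -It for every pixel in the
            # window and every color channel into one system A @ [u, v] = b.
            A = np.stack([Ix[y - r:y + r + 1, x - r:x + r + 1].ravel(),
                          Iy[y - r:y + r + 1, x - r:x + r + 1].ravel()],
                         axis=1)
            b = -It[y - r:y + r + 1, x - r:x + r + 1].ravel()
            AtA = A.T @ A
            # Solve only well-conditioned systems (textured regions);
            # untextured pixels keep a zero (non-confident) estimate.
            if np.linalg.det(AtA) > 1e-6:
                u[y, x], v[y, x] = np.linalg.solve(AtA, A.T @ b)
    return u, v
```

Using the three color channels triples the number of constraints per window, which is the accuracy/density benefit (at a resource cost) that the paper quantifies for its hardware alternatives.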




Acknowledgments

This work has been supported by Spanish Grant (AP2007-00275), EU Project TOMSY (FP7-270436), Spanish Project ARC-VISION (TEC2010-15396), and Regional Projects MULTIVISION (TIC-3873), and ITREBA (TIC-5060).

Author information


Correspondence to Francisco Barranco.


Cite this article

Barranco, F., Tomasi, M., Vanegas, M. et al. Hierarchical architecture for motion and depth estimations based on color cues. J Real-Time Image Proc 10, 435–452 (2015). https://doi.org/10.1007/s11554-012-0294-1

