Abstract
This work presents an FPGA implementation of a highly parallel architecture for motion and disparity estimation from color images. Our system implements the well-known Lucas & Kanade algorithm with a multi-scale extension for the computation of large displacements using color cues. The system meets real-time requirements, computing up to 32 and 36 frames per second for optical flow and disparity, respectively, at 640 × 480 resolution. In this paper, we present our design technique based on fine-grain pipelines, our architecture, and benchmarks of the different color-based alternatives, analyzing the trade-off between accuracy and resource utilization. Finally, we include qualitative results and the resource utilization of our platform, concluding that our system achieves a good balance between the increase in resources and the improvements in precision and result density compared with other approaches described in the literature.
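To make the underlying estimation concrete, the sketch below shows a single-scale color Lucas & Kanade solve in NumPy: the structure-tensor products are accumulated over all color channels before the per-pixel 2 × 2 system is inverted. This is only an illustrative software model under simplifying assumptions (one pyramid level, simple central-difference derivatives, box windowing), not the paper's multi-scale fixed-point FPGA pipeline; the function name and parameters are ours.

```python
import numpy as np

def lucas_kanade_color(I1, I2, win=7):
    """Single-scale Lucas-Kanade flow from two color frames.

    I1, I2: float arrays of shape (H, W, C); returns flow of shape (H, W, 2).
    The brightness-constancy normal equations are summed over every
    color channel, which is the key difference from grayscale LK.
    """
    H, W, C = I1.shape
    Sxx = np.zeros((H, W)); Sxy = np.zeros((H, W)); Syy = np.zeros((H, W))
    Sxt = np.zeros((H, W)); Syt = np.zeros((H, W))
    for c in range(C):
        Ix = np.gradient(I1[:, :, c], axis=1)   # horizontal derivative
        Iy = np.gradient(I1[:, :, c], axis=0)   # vertical derivative
        It = I2[:, :, c] - I1[:, :, c]          # temporal derivative
        Sxx += Ix * Ix; Sxy += Ix * Iy; Syy += Iy * Iy
        Sxt += Ix * It; Syt += Iy * It

    def boxsum(a, r):
        # sum over a (2r+1) x (2r+1) window via separable 1-D convolutions
        k = np.ones(2 * r + 1)
        a = np.apply_along_axis(lambda m: np.convolve(m, k, mode='same'), 0, a)
        return np.apply_along_axis(lambda m: np.convolve(m, k, mode='same'), 1, a)

    r = win // 2
    Sxx, Sxy, Syy = boxsum(Sxx, r), boxsum(Sxy, r), boxsum(Syy, r)
    Sxt, Syt = boxsum(Sxt, r), boxsum(Syt, r)

    # solve A [u, v]^T = -b per pixel, A = [[Sxx, Sxy], [Sxy, Syy]]
    det = Sxx * Syy - Sxy ** 2
    det[np.abs(det) < 1e-9] = np.inf  # ill-conditioned windows -> zero flow
    u = (-Syy * Sxt + Sxy * Syt) / det
    v = ( Sxy * Sxt - Sxx * Syt) / det
    return np.stack([u, v], axis=-1)
```

In the multi-scale (coarse-to-fine) extension, this solve runs at each pyramid level and the upsampled coarse flow warps the second frame before the next refinement, which is how large displacements are recovered.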
Acknowledgments
This work has been supported by the Spanish Grant AP2007-00275, the EU Project TOMSY (FP7-270436), the Spanish Project ARC-VISION (TEC2010-15396), and the Regional Projects MULTIVISION (TIC-3873) and ITREBA (TIC-5060).
Cite this article
Barranco, F., Tomasi, M., Vanegas, M. et al. Hierarchical architecture for motion and depth estimations based on color cues. J Real-Time Image Proc 10, 435–452 (2015). https://doi.org/10.1007/s11554-012-0294-1