Machine Vision and Applications, Volume 23, Issue 3, pp 403–415

An on-board vision sensor system for small unmanned vehicle applications

  • Beau J. Tippetts
  • Dah-Jye Lee
  • James K. Archibald
Original Paper

Abstract

This paper describes an on-board vision sensor system developed specifically for small unmanned vehicle applications. For small vehicles, vision sensors have many advantages in size, weight, and power consumption over other sensors such as radar, sonar, and laser range finders. A vision sensor is also uniquely suited for tasks such as target tracking and recognition that require visual information processing. However, it is difficult to meet the computational demands of real-time vision processing on a small robot. In this paper, we present the development of a field programmable gate array (FPGA)-based vision sensor and use a small ground vehicle to demonstrate that this sensor can detect and track features on a user-selected target from frame to frame and steer the small autonomous vehicle towards it. The sensor system uses hardware implementations of the rank transform for filtering, a Harris corner detector for feature detection, and a correlation algorithm for feature matching and tracking. With additional capabilities supported in software, the operational system communicates wirelessly with a base station, receiving commands, providing visual feedback, and allowing user input such as specifying targets to track. Because this vision sensor system uses reconfigurable hardware, other vision algorithms such as stereo vision and motion analysis can be implemented to reconfigure the system for other real-time vision applications.
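
The abstract names a three-stage pipeline: rank-transform filtering, Harris corner detection, and correlation-based feature matching between frames. The following is a minimal software sketch of that pipeline, assuming conventional formulations of each stage; the paper's system implements these stages in FPGA hardware, and the window sizes, the Harris constant k, the top-N feature selection, and the sum-of-absolute-differences matching criterion below are illustrative assumptions rather than the authors' parameters.

```python
# Illustrative software sketch of a rank transform -> Harris corners ->
# correlation-matching pipeline. Not the paper's FPGA implementation.
import numpy as np
from scipy.ndimage import uniform_filter

def rank_transform(img, radius=2):
    """Rank transform: count neighbours darker than the centre pixel.
    (np.roll wraps at the image border; adequate for a sketch.)"""
    out = np.zeros(img.shape, dtype=np.uint8)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dy == 0 and dx == 0:
                continue
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            out += (shifted < img).astype(np.uint8)
    return out

def harris_corners(img, k=0.04, n_best=50, win=5):
    """Return (row, col) locations of the strongest Harris responses."""
    f = img.astype(np.float64)
    gy, gx = np.gradient(f)
    # Structure-tensor entries smoothed over a local window.
    sxx = uniform_filter(gx * gx, win)
    syy = uniform_filter(gy * gy, win)
    sxy = uniform_filter(gx * gy, win)
    r = (sxx * syy - sxy * sxy) - k * (sxx + syy) ** 2
    idx = np.argsort(r, axis=None)[-n_best:]
    return np.column_stack(np.unravel_index(idx, r.shape))

def match_features(prev, curr, points, patch=7, search=15):
    """Track each feature by correlating its patch within a search window
    of the next frame (sum of absolute differences, lower is better)."""
    half, matches = patch // 2, []
    for (y, x) in points:
        tpl = prev[y - half:y + half + 1, x - half:x + half + 1]
        if tpl.shape != (patch, patch):
            continue  # skip features too close to the image border
        best, best_pos = np.inf, (y, x)
        for cy in range(y - search, y + search + 1):
            for cx in range(x - search, x + search + 1):
                cand = curr[cy - half:cy + half + 1, cx - half:cx + half + 1]
                if cand.shape != (patch, patch):
                    continue
                score = np.abs(cand.astype(int) - tpl.astype(int)).sum()
                if score < best:
                    best, best_pos = score, (cy, cx)
        matches.append(((y, x), best_pos))
    return matches
```

The brute-force correlation search above only mirrors the computation; a hardware implementation would typically evaluate many candidate positions in parallel to reach frame-rate throughput.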

Keywords

FPGA · Real-time vision · Small autonomous vehicles · Target tracking · Motion analysis · Pattern matching

Copyright information

© Springer-Verlag 2012

Authors and Affiliations

  • Beau J. Tippetts (1)
  • Dah-Jye Lee (1)
  • James K. Archibald (1)

  1. Electrical and Computer Engineering Department, Brigham Young University, Provo, USA
