Perception Quality Evaluation with Visual and Infrared Cameras in Challenging Environmental Conditions

Part of the Springer Tracts in Advanced Robotics book series (STAR, volume 79)

Abstract

This work aims to contribute to the reliability and integrity of perceptual systems of unmanned ground vehicles (UGVs). A method is proposed to evaluate the quality of sensor data prior to its use in a perception system, by applying a quality metric to heterogeneous sensor data such as visual and infrared camera images. The concept is illustrated with sensor data that is evaluated before being used in a standard SIFT feature extraction and matching technique. The method is then evaluated on experimental data sets collected from a UGV in challenging environmental conditions, represented by the presence of airborne dust and smoke. In the first series of experiments, a motionless vehicle observes a 'reference' scene; the method is then extended to the case of a moving vehicle by compensating for its motion. This paper shows that it is possible to anticipate the degradation of a perception algorithm by evaluating the input data prior to any actual execution of the algorithm.
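The idea of gating a feature-extraction stage on an input-quality score can be sketched as follows. This is a minimal illustration, not the authors' metric: `image_entropy`, `should_extract_features`, and the threshold value are assumptions chosen for the example, using grey-level histogram entropy as a crude proxy for the kind of contrast loss that airborne dust or smoke causes.

```python
import numpy as np

def image_entropy(gray: np.ndarray, bins: int = 256) -> float:
    """Shannon entropy (bits) of the grey-level histogram.

    Dust or smoke tends to wash out contrast, concentrating the
    histogram and lowering entropy, so a low score flags a frame
    likely to yield poor SIFT matches.
    """
    hist, _ = np.histogram(gray, bins=bins, range=(0, 255))
    p = hist / hist.sum()
    p = p[p > 0]  # 0 * log(0) is taken as 0
    return float(-(p * np.log2(p)).sum())

def should_extract_features(gray: np.ndarray, threshold: float = 4.0) -> bool:
    """Gate a (hypothetical) SIFT stage: skip frames below the quality threshold."""
    return image_entropy(gray) >= threshold
```

A frame failing the gate would simply be withheld from the matcher, which is the behaviour the abstract describes: degradation is anticipated from the input data, without running the perception algorithm itself.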

Keywords

Sensor Data, Scale Invariant Feature Transform, Infrared Camera, Visual Odometry, Unmanned Ground Vehicle



Copyright information

© Springer-Verlag GmbH Berlin Heidelberg 2014

Authors and Affiliations

ARC Centre of Excellence for Autonomous Systems, Australian Centre for Field Robotics (ACFR), The University of Sydney, Sydney, Australia
