
ALRe: Outlier Detection for Guided Refinement

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 12352)

Abstract

Guided refinement is a popular procedure in various image post-processing applications. It produces an output image based on an input image and a guide image. Input images are usually flawed estimates containing various kinds of noise and outliers, which undermine the edge consistency between the input and guide images. To improve them, they are refined into output images that preserve the intensities of the input images while adopting the consistent edges of the guide images. However, outliers are usually untraceable and are simply treated as zero-mean noise, which limits the quality of such refinement. In this paper, we propose a general outlier detection method for guided refinement. We assume a local linear relationship between the output and guide images to express the expected edge consistency, and based on this assumption we measure the outlier likelihood of each input pixel. The metric is termed ALRe (anchored linear residual) because it is essentially the residual of a local linear regression with an equality constraint exerted on the measured pixel. Valuable features of ALRe are discussed, and its effectiveness is demonstrated through applications and experiments.
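To make the anchored residual concrete, below is a minimal, unoptimized sketch assuming single-channel input and guide images and uniform window weights; the function name and parameters are illustrative and not taken from the paper. For each pixel it fits a local linear model between the guide and the input that is constrained to pass exactly through the measured pixel, and reports the root-mean-square residual of that constrained fit as the pixel's outlier likelihood.

```python
import numpy as np

def alre_map(P, G, radius=7, eps=1e-6):
    """Hypothetical sketch of an anchored-linear-residual map.

    P : 2-D array, the (possibly flawed) input image.
    G : 2-D array, the guide image (single channel in this sketch).
    For each pixel i, a linear model q = a*G + b is fitted over the local
    window, constrained to pass through (G_i, P_i). The RMS residual of
    that constrained fit is returned as the outlier likelihood of pixel i.
    """
    P = np.asarray(P, dtype=np.float64)
    G = np.asarray(G, dtype=np.float64)
    H, W = P.shape
    R = np.zeros((H, W), dtype=np.float64)
    for y in range(H):
        for x in range(W):
            y0, y1 = max(0, y - radius), min(H, y + radius + 1)
            x0, x1 = max(0, x - radius), min(W, x + radius + 1)
            dG = G[y0:y1, x0:x1] - G[y, x]   # guide offsets from the anchor
            dP = P[y0:y1, x0:x1] - P[y, x]   # input offsets from the anchor
            # With the constraint a*G_i + b = P_i, the intercept b is
            # eliminated and the fit reduces to: minimize sum (a*dG - dP)^2,
            # whose closed-form slope is a = <dG, dP> / <dG, dG>.
            a = (dG * dP).sum() / ((dG * dG).sum() + eps)
            resid = a * dG - dP
            R[y, x] = np.sqrt((resid ** 2).mean())
    return R
```

A large residual means the neighborhood cannot be explained by any linear guide-to-input mapping that passes through the measured pixel, so that pixel is flagged as a likely outlier; the paper's actual formulation may use weighted windows and multi-channel guides.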

Keywords

Anchored linear residual · Outlier detection · Guided refinement · Local linear assumption · Linear regression

Notes

Acknowledgements

This work was supported in part by the National Key Research and Development Program of China under Grant 2020YFB1312800 and in part by the National Natural Science Foundation of China under Grant U1909206.


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. BIC-ESAT, College of Engineering, Peking University, Beijing, People's Republic of China
  2. Institute of Automation, Chinese Academy of Sciences, Beijing, People's Republic of China
  3. The Department of Mechanical Engineering, Fuzhou University, Fuzhou, People's Republic of China
