
Color transfer method based on saliency features for color images

  • Regular Paper
  • Published in Optical Review (2024)

Abstract

With growing demands for higher image quality in film, video post-production, image restoration, art creation, and computer vision, color transfer between images has become an important research area. Building on previous color transfer techniques, this paper proposes a color transfer method for color images based on saliency features, with the aim of automatic color migration between an original image and a reference image. Transferring colors according to the saliency features of the input images avoids the unnatural colors that arise in the output when colors from different regions are mixed. First, the local variance of both the original and reference images is calculated and used as a temporary saliency feature map. A refined saliency feature map is then obtained through minimum filtering, binarization, dilation, and iteration. Next, color is transferred separately between the salient regions and between the non-salient regions of the original and reference images. To avoid pseudo-contours, the result is refined by base projection. Finally, the output image is obtained by fusing the base-projected image with the result of Reinhard’s method, so that the output retains naturalness and consistency. We conducted experiments on different types of images, including natural landscapes, buildings, and art paintings. The results show that the proposed method not only preserves the fine details of the original image but also produces fuller and more realistic colors.
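To make two of the building blocks mentioned in the abstract more concrete, the sketch below illustrates a local-variance saliency map and Reinhard-style per-channel statistics matching in Python. This is a minimal illustration, not the authors' implementation: the window size, threshold, and function names are assumptions, the matching is applied to raw channels rather than the decorrelated l-alpha-beta space used by Reinhard et al. [2], and the paper's refinement steps (minimum filtering, dilation, iteration), region-wise transfer, base projection, and fusion are not reproduced.

# Illustrative sketch only; names, window size, and threshold are assumptions.
import numpy as np
from scipy.ndimage import uniform_filter


def local_variance_map(gray, size=7):
    """Per-pixel variance in a size x size window: Var = E[x^2] - (E[x])^2."""
    g = gray.astype(np.float64)
    mean = uniform_filter(g, size)
    mean_sq = uniform_filter(g * g, size)
    return np.clip(mean_sq - mean * mean, 0.0, None)


def saliency_mask(gray, size=7, thresh_ratio=0.5):
    """Binarize the variance map into a rough salient / non-salient mask."""
    var = local_variance_map(gray, size)
    return var > thresh_ratio * var.mean()


def reinhard_transfer(src, ref):
    """Match each channel's mean and standard deviation to the reference,
    as in the global transfer of Reinhard et al. [2]; the original method
    performs this matching in the decorrelated l-alpha-beta space."""
    src = src.astype(np.float64)
    ref = ref.astype(np.float64)
    out = np.empty_like(src)
    for c in range(src.shape[-1]):
        s_mu, s_std = src[..., c].mean(), src[..., c].std() + 1e-8
        r_mu, r_std = ref[..., c].mean(), ref[..., c].std()
        out[..., c] = (src[..., c] - s_mu) * (r_std / s_std) + r_mu
    return out

In a pipeline of the kind described in the abstract, the mask produced by saliency_mask would be used to match the salient and non-salient regions of the original image against the corresponding regions of the reference image before the base-projection and fusion steps.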

Data availability

The code, data, and materials used or analysed during the current study are available from the corresponding author on reasonable request.

References

  1. Xie, B., Zhang, K., Liu, K.: Color transfer based on color classification. J. Phys.: Conf. Ser. 1601, 032028 (2020)

  2. Reinhard, E., Adhikhmin, M., Gooch, B., Shirley, P.: Color transfer between images. IEEE Comput. Graph. Appl. 21(5), 34–41 (2001)

  3. Pitié, F., Kokaram, A.C., Dahyot, R.: Automated colour grading using colour distribution transfer. Comput. Vis. Image Understand. 107(1–2), 123–137 (2007)

  4. Pouli, T., Reinhard, E.: Progressive color transfer for images of arbitrary dynamic range. Comput. Graph. 35(1), 67–80 (2011)

  5. Ueda, Y., Misawa, H., Koga, T., Suetake, N., Uchino, E.: IDT and color transfer-based color calibration for images taken by different cameras. J. Adv. Comput. Intell. Intell. Inform. 24(1), 123–133 (2020)

  6. Hwang, Y., Lee, J.-Y., So Kweon, I., Joo Kim, S.: Color transfer using probabilistic moving least squares. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 3342–3349 (2014)

  7. Grogan, M., Prasad, M., Dahyot, R.: L2 registration for colour transfer. In: 2015 23rd European Signal Processing Conference (EUSIPCO), IEEE, pp. 1–5 (2015)

  8. Grogan, M., Dahyot, R.: L2 divergence for robust colour transfer. Comput. Vis. Image Understand. 181, 39–49 (2019)

  9. Wu, Z., Xue, R.: Color transfer with salient features mapping via attention maps between images. IEEE Access 8, 104884–104892 (2020)

  10. Xu, D., Bao, S., Tanaka, G., Yang, C., Zuo, F.: A color calibration method based on color component projection for suppression of false color caused by iterative distribution transfer. J. Adv. Comput. Intell. Intell. Inform. 26(1), 88–96 (2022)

  11. Gu, C., Lu, X., Zhang, C.: Example-based color transfer with Gaussian mixture modeling. Pattern Recogn. 129, 108716 (2022)

  12. Tai, Y.-W., Jia, J., Tang, C.-K.: Local color transfer via probabilistic segmentation by expectation-maximization. In: 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’05), IEEE, vol. 1, pp. 747–754 (2005)

  13. Xiang, Y., Zou, B., Li, H.: Selective color transfer with multi-source images. Pattern Recogn. Lett. 30(7), 682–689 (2009)

  14. Liu, D., Jiang, Y., Pei, M., Liu, S.: Emotional image color transfer via deep learning. Pattern Recogn. Lett. 110, 16–22 (2018)

  15. Liu, S.: An overview of color transfer and style transfer for images and videos. arXiv preprint arXiv:2204.13339 (2022)

  16. Krizhevsky, A., Sutskever, I., Hinton, G.E.: ImageNet classification with deep convolutional neural networks. Commun. ACM 60(6), 84–90 (2017). https://doi.org/10.1145/3065386

  17. He, M., Liao, J., Yuan, L., Sander, P.V.: Neural color transfer between images. arXiv preprint arXiv:1710.00756 (2017)

  18. Chen, T.Q., Schmidt, M.: Fast patch-based style transfer of arbitrary style. arXiv preprint arXiv:1612.04337 (2016)

  19. Ulyanov, D., Vedaldi, A., Lempitsky, V.: Instance normalization: the missing ingredient for fast stylization. arXiv preprint arXiv:1607.08022 (2016)

  20. Zhu, J.-Y., Park, T., Isola, P., Efros, A.A.: Unpaired image-to-image translation using cycle-consistent adversarial networks. In: Proceedings of the IEEE International Conference on Computer Vision, pp. 2223–2232 (2017)

  21. Choi, Y., Choi, M., Kim, M., Ha, J.-W., Kim, S., Choo, J.: StarGAN: unified generative adversarial networks for multi-domain image-to-image translation. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 8789–8797 (2018)

  22. Luan, F., Paris, S., Shechtman, E., Bala, K.: Deep photo style transfer. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 4990–4998 (2017)

  23. Cheng, M.-M., Mitra, N.J., Huang, X., Torr, P.H., Hu, S.-M.: Global contrast based salient region detection. IEEE Trans. Pattern Anal. Mach. Intell. 37(3), 569–582 (2014)

  24. Zou, A., Shen, X., Zhang, X., Wu, Z.: Neutral color correction algorithm for color transfer between multicolor images. In: Advances in Graphic Communication, Printing and Packaging Technology and Materials: Proceedings of 2020 11th China Academic Conference on Printing and Packaging. Springer, pp. 176–182 (2021)

  25. Kullback, S., Leibler, R.A.: On information and sufficiency. Ann. Math. Stat. 22(1), 79–86 (1951)

  26. Wang, Z., Bovik, A.C., Sheikh, H.R., Simoncelli, E.P.: Image quality assessment: from error visibility to structural similarity. IEEE Trans. Image Process. 13(4), 600–612 (2004)


Author information

Corresponding author

Correspondence to Shi Bao.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Bao, S., Zhao, Y., Ji, Y. et al. Color transfer method based on saliency features for color images. Opt Rev (2024). https://doi.org/10.1007/s10043-024-00888-2
