
Map art style transfer with multi-stage framework

Published in Multimedia Tools and Applications

Abstract

We propose a multi-stage framework for creating stylized map art images. Existing techniques transfer style between photographs successfully, yet noise in the results and harmonization of the generated art images still need to be addressed. We tackle these issues with an algorithm that first identifies a suitable portrait for the map art application in an initial stage. A refinement strategy is then applied to produce final map art that meets these expectations. Beyond producing plausible results, the objective evaluation presented in this paper shows that our method interactively achieves better and more appealing map art results than those of other works. In addition, our method can also create stylized ocean or landscape paintings using our map art collage.
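
The abstract describes the framework only at a high level. As a rough illustration (not the authors' implementation), the sketch below shows how such a multi-stage pipeline could be wired together: an initial composition step that places a portrait into the map, a style-transfer routine supplied by the caller, and a refinement pass that suppresses noise and harmonizes colors against the style exemplar. The function names (`map_art_pipeline`, `refine`, `transfer_fn`) and the specific OpenCV/scikit-image operations are assumptions made for this sketch.

```python
# Illustrative sketch only (assumed libraries: OpenCV, NumPy, scikit-image);
# it mirrors the multi-stage idea -- compose, stylize, refine -- but is not
# the authors' implementation.
import cv2
import numpy as np
from skimage.exposure import match_histograms


def refine(stylized: np.ndarray, style_ref: np.ndarray) -> np.ndarray:
    """Refinement stage: suppress transfer noise, then harmonize colors
    against the style exemplar."""
    # Edge-preserving smoothing reduces high-frequency noise left by the
    # style-transfer stage without blurring region boundaries.
    smoothed = cv2.bilateralFilter(stylized, 9, 75, 75)
    # Matching the color distribution of the style exemplar keeps the
    # composited map art visually harmonized (scikit-image >= 0.19).
    harmonized = match_histograms(smoothed, style_ref, channel_axis=-1)
    return np.clip(harmonized, 0, 255).astype(np.uint8)


def map_art_pipeline(map_img, portrait_img, portrait_mask, style_img, transfer_fn):
    """Multi-stage skeleton. `transfer_fn(content, style)` is any
    style-transfer routine (e.g. a Gatys-style optimizer); its choice,
    like every name here, is an assumption for illustration."""
    # Stage 1: naive composition -- place the chosen portrait into the
    # map where the mask is set (all images share the same size).
    content = np.where(portrait_mask[..., None] > 0, portrait_img, map_img)
    # Stage 2: stylize the composed content image.
    stylized = transfer_fn(content, style_img)
    # Stage 3: refine for noise and color harmonization.
    return refine(stylized, style_img)
```

A caller would supply `transfer_fn` from any existing style-transfer implementation; keeping the refinement stage lightweight is one way such a pipeline could remain interactive, as the paper's evaluation emphasizes.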

Notes

  1. https://en.wikipedia.org/wiki/Non-photorealistic_rendering

  2. https://en.wikipedia.org/wiki/Kernel_(image_processing)
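
Footnote 2 points to the general notion of an image-processing kernel. Purely as a generic refresher (and not the specific kernel used in this work), a small convolution such as the one below is the kind of operation that smoothing or noise-suppression steps build on.

```python
# Generic example of applying a 3x3 averaging kernel by 2-D convolution;
# this is background for footnote 2, not the paper's kernel.
import numpy as np
from scipy.signal import convolve2d

image = np.random.rand(64, 64)           # stand-in grayscale image
kernel = np.ones((3, 3)) / 9.0           # normalized box (averaging) kernel
smoothed = convolve2d(image, kernel, mode="same", boundary="symm")
```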


Author information

Corresponding author

Correspondence to Tong-Yee Lee.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Shih, CY., Chen, YH. & Lee, TY. Map art style transfer with multi-stage framework. Multimed Tools Appl 80, 4279–4293 (2021). https://doi.org/10.1007/s11042-020-09788-4

