Journal of Computer Science and Technology, Volume 28, Issue 5, pp. 762–775

Stroke Style Analysis for Painterly Rendering

Regular Paper


We propose a novel method that automatically analyzes stroke-related artistic styles of paintings. A set of adaptive interfaces is also developed to connect the style analysis with existing painterly rendering systems, so that the specific artistic style of a template painting can be effectively transferred to an input photo with minimal effort. Unlike conventional texture-synthesis-based rendering techniques, which focus mainly on texture features, this work extracts, analyzes, and simulates the high-level style features expressed through artists' brush stroke techniques. Through experiments, user studies, and comparisons with ground truth, we demonstrate that the proposed style-oriented painting framework significantly reduces tedious parameter adjustment and allows amateur users to efficiently create desired artistic styles simply by specifying a template painting.


Keywords: non-photorealistic rendering; example-based rendering; style analysis; brush stroke technique






Copyright information

© Springer Science+Business Media New York & Science Press, China 2013

Authors and Affiliations

  1. School of Electronics and Information Engineering, Xi'an Jiaotong University, Xi'an, China
  2. Beijing Key Laboratory of Intelligent Information Technology, School of Computer Science and Technology, Beijing Institute of Technology, Beijing, China
  3. College of Engineering, Swansea University, Swansea, UK
