
Dating ancient paintings of Mogao Grottoes using deeply learnt visual codes

  • Research Paper
  • Published in Science China Information Sciences

Abstract

Cultural heritage is an asset shared by all the peoples of the world, and its preservation and inheritance are conducive to the progress of human civilization. In northwestern China, the world heritage site of the Mogao Grottoes contains a wealth of mural paintings depicting the historical cultures of ancient China. To study these cultures, one critical step is to date the mural paintings, i.e., to determine the era in which they were created. Until now, most mural paintings at the Mogao Grottoes have been dated by directly referring to texts within the murals or to historical documents; however, some remain undated because such reference materials are lacking. Considering that the drawing style of mural paintings changed over the course of history, and that this style can be learned and quantified from painting data, we formulate mural-painting dating as a drawing-style classification problem. Drawing styles can be expressed not only in color or curvature but also in implicit forms that have not yet been explicitly observed. To this end, in addition to sophisticated color and shape descriptors, a deep convolutional neural network is designed to encode these implicit drawing styles. A total of 3860 mural paintings collected from 194 grottoes with determined creation-era labels are used to train the classification model and build the dating method. In experiments, the proposed method is applied to seven mural paintings whose previous datings were controversial, and the new dating results are endorsed by Dunhuang experts.
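
To make the style-classification formulation concrete, the following sketch shows a minimal convolutional era classifier in PyTorch. It is an illustrative assumption, not the network described in the paper: the layer sizes, the 224x224 input resolution, and the placeholder num_eras class count are hypothetical, and the actual method further combines the learned codes with hand-crafted color and shape descriptors.

```python
# Illustrative sketch only: a small CNN that maps a mural-painting image to
# one of several creation-era classes. Layer sizes, class count, and input
# resolution are assumptions, not the architecture reported in the paper.
import torch
import torch.nn as nn

class EraClassifier(nn.Module):
    def __init__(self, num_eras: int = 5):  # num_eras: hypothetical placeholder
        super().__init__()
        # Stacked convolutional blocks learn increasingly abstract
        # "drawing-style" features; each block halves the spatial resolution.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),  # global average pooling -> (B, 128, 1, 1)
        )
        self.classifier = nn.Linear(128, num_eras)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        codes = self.features(x).flatten(1)  # (B, 128) visual style code
        return self.classifier(codes)        # per-era logits

# Usage: classify one 224x224 RGB painting patch (random data here).
if __name__ == "__main__":
    model = EraClassifier(num_eras=5)
    patch = torch.randn(1, 3, 224, 224)
    logits = model(patch)
    print(logits.shape, logits.argmax(dim=1))
```

A softmax over these logits yields a per-era probability; aggregating patch-level predictions (for example, by majority vote) is one plausible way to date a whole mural, though the paper's exact decision rule may differ.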

Acknowledgements

This work was supported by the National Basic Research Program of China (Grant No. 2012CB725303), the Major Program of the Key Research Institute on Humanities and Social Science of the Chinese Ministry of Education (Grant No. 16JJD870002), and the National Natural Science Foundation of China (Grant Nos. 91546106, 61301277). The authors would like to thank the Dunhuang Research Academy for providing the mural paintings of Dunhuang-P7, and Mr. Hui-Min WANG for helpful suggestions and discussions.

Author information

Corresponding author

Correspondence to Qin Zou.

About this article

Cite this article

Li, Q., Zou, Q., Ma, D. et al. Dating ancient paintings of Mogao Grottoes using deeply learnt visual codes. Sci. China Inf. Sci. 61, 092105 (2018). https://doi.org/10.1007/s11432-017-9308-x
