International Journal of Computer Vision, Volume 27, Issue 2, pp 107–126

Filters, Random Fields and Maximum Entropy (FRAME): Towards a Unified Theory for Texture Modeling

  • Song Chun Zhu
  • Yingnian Wu
  • David Mumford

Abstract

This article presents a statistical theory for texture modeling. The theory combines filtering theory and Markov random field modeling through the maximum entropy principle, and it interprets and clarifies many previous concepts and methods for texture analysis and synthesis from a unified point of view. Our theory characterizes the ensemble of images I with the same texture appearance by a probability distribution f(I) on a random field, and the objective of texture modeling is to make inference about f(I), given a set of observed texture examples.

In our theory, texture modeling consists of two steps. (1) A set of filters is selected from a general filter bank to capture features of the texture; these filters are applied to the observed texture images, and the histograms of the filtered images are extracted. These histograms are estimates of the marginal distributions of f(I). This step is called feature extraction. (2) The maximum entropy principle is employed to derive a distribution p(I), which is restricted to have the same marginal distributions as those in (1). This p(I) is considered an estimate of f(I). This step is called feature fusion. A stepwise algorithm is proposed to choose filters from a general filter bank. The resulting model, called FRAME (Filters, Random fields And Maximum Entropy), is a Markov random field (MRF) model, but with a much enriched vocabulary and hence much stronger descriptive ability than the previous MRF models used for texture modeling. A Gibbs sampler is adopted to synthesize texture images by drawing typical samples from p(I); the model is thus verified by checking whether the synthesized texture images have visual appearances similar to those of the texture images being modeled. Experiments on a variety of 1D and 2D textures are described to illustrate the theory and to demonstrate the performance of the algorithms. These experiments show that many textures previously considered to belong to different categories can be modeled and synthesized in a common framework.
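The two-step procedure described in the abstract can be sketched in code. The following is a minimal, hypothetical illustration, not the paper's implementation: it uses a toy three-filter bank (a delta filter plus two nearest-neighbour gradient filters, where the actual experiments use Gabor and Laplacian-of-Gaussian banks) on a tiny 8×8 lattice with 8 grey levels, and it crudely approximates the model expectation in the multiplier update with a single Gibbs sample per iteration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy filter bank (illustrative only): a delta filter, whose histogram is
# the intensity histogram, plus horizontal and vertical gradients.
FILTERS = [
    np.array([[1.0]]),
    np.array([[-1.0, 1.0]]),
    np.array([[-1.0], [1.0]]),
]
LEVELS, NBINS = 8, 8  # grey levels of the lattice, bins per histogram

def conv2_circular(img, k):
    """'Same'-size 2-D convolution with circular (torus) boundary."""
    out = np.zeros(img.shape)
    for di in range(k.shape[0]):
        for dj in range(k.shape[1]):
            out += k[di, dj] * np.roll(img, shift=(-di, -dj), axis=(0, 1))
    return out

def marginal_histograms(img):
    """Step (1), feature extraction: normalised histogram of each
    filtered image, an estimate of a marginal distribution of f(I)."""
    hs = []
    for k in FILTERS:
        resp = conv2_circular(img.astype(float), k)
        h, _ = np.histogram(resp, bins=NBINS, range=(-LEVELS, LEVELS))
        hs.append(h / max(h.sum(), 1))
    return np.stack(hs)  # shape (n_filters, NBINS)

def energy(img, lam):
    """Gibbs energy of the maximum-entropy model, U(I) = sum_k <lambda_k, H_k(I)>,
    so that p(I) is proportional to exp(-U(I))."""
    return float(np.sum(lam * marginal_histograms(img)))

def gibbs_sweep(img, lam):
    """One Gibbs-sampler sweep: visit each pixel and resample its grey
    level from the conditional distribution given the rest of the image."""
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            u = np.empty(LEVELS)
            for v in range(LEVELS):
                img[i, j] = v
                u[v] = energy(img, lam)
            p = np.exp(-(u - u.min()))
            img[i, j] = rng.choice(LEVELS, p=p / p.sum())
    return img

def fit_frame(obs, n_iters=3, eta=5.0):
    """Step (2), feature fusion: ascend the log-likelihood in the Lagrange
    multipliers, lam <- lam + eta * (H_synthesised - H_observed), with the
    expectation under p(I) approximated here by a single Gibbs sample."""
    h_obs = marginal_histograms(obs)
    lam = np.zeros_like(h_obs)
    syn = rng.integers(0, LEVELS, size=obs.shape)
    for _ in range(n_iters):
        syn = gibbs_sweep(syn, lam)
        lam += eta * (marginal_histograms(syn) - h_obs)
    return lam, syn

# Stand-in "observed texture" (random noise, purely for demonstration).
obs = rng.integers(0, LEVELS, size=(8, 8))
lam, syn = fit_frame(obs)
```

In the full algorithm the stepwise filter-pursuit loop would repeat this fit, each time adding the filter whose synthesized histogram deviates most from the observed one; the sketch above fixes the filter set for brevity.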

Keywords: texture modeling, texture analysis and synthesis, minimax entropy, maximum entropy, Markov random field, feature pursuit, visual learning



Copyright information

© Kluwer Academic Publishers 1998

Authors and Affiliations

  • Song Chun Zhu (1)
  • Yingnian Wu (2)
  • David Mumford (3)
  1. Department of Computer Science, Stanford University, Stanford
  2. Department of Statistics, University of Michigan, Ann Arbor
  3. Division of Applied Math, Brown University, Providence
