Support Vector Random Fields for Spatial Classification

  • Chi-Hoon Lee
  • Russell Greiner
  • Mark Schmidt
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3721)

Abstract

In this paper we propose Support Vector Random Fields (SVRFs), an extension of Support Vector Machines (SVMs) that explicitly models spatial correlations in multi-dimensional data. SVRFs are derived as Conditional Random Fields that take advantage of the generalization properties of SVMs. We also propose improvements to computing posterior probability distributions from SVMs, and present a local-consistency potential measure that encourages spatial continuity. SVRFs can be efficiently trained, converge quickly during inference, and can be trivially augmented with kernel functions. SVRFs are more robust to class imbalance than Discriminative Random Fields (DRFs), and are more accurate near edges. Our results on synthetic data and a real-world tumor detection task show the superiority of SVRFs over both SVMs and DRFs.
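The abstract mentions improved computation of posterior probability distributions from SVM outputs, which the random-field model then combines with a spatial-consistency term. A standard way to obtain such posteriors is Platt-style sigmoid calibration (reference 8 below): fit P(y=1 | f) = σ(w·f + c) to held-out SVM decision values f by minimizing cross-entropy. The sketch below is illustrative only; the function names, the plain gradient-descent fit, and the toy data are our assumptions, not the paper's actual procedure.

```python
import math

def platt_posterior(f, w, c):
    """Map an SVM decision value f to P(y=1 | f) = sigmoid(w*f + c)."""
    return 1.0 / (1.0 + math.exp(-(w * f + c)))

def fit_platt(decision_values, labels, lr=0.5, iters=5000):
    """Fit (w, c) by gradient descent on the cross-entropy loss.

    decision_values: raw SVM outputs f(x) on a held-out set.
    labels: binary targets (0 or 1) for the same points.
    """
    w, c = 0.0, 0.0
    n = len(decision_values)
    for _ in range(iters):
        gw = gc = 0.0
        for f, y in zip(decision_values, labels):
            p = 1.0 / (1.0 + math.exp(-(w * f + c)))
            gw += (p - y) * f / n   # d(loss)/dw, averaged over the set
            gc += (p - y) / n       # d(loss)/dc
        w -= lr * gw
        c -= lr * gc
    return w, c

# Toy held-out set: negatives get f < 0, positives f > 0.
fs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 0, 1, 1, 1]
w, c = fit_platt(fs, ys)
# Calibrated posteriors now increase monotonically with f.
for f in fs:
    print(f, platt_posterior(f, w, c))
```

In the random-field setting, these calibrated posteriors would play the role of the per-site (association) term, to be combined with the local-consistency potential during inference.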

References

  1. Lafferty, J., McCallum, A., Pereira, F.: Conditional random fields: Probabilistic models for segmenting and labeling sequence data. In: ICML 2001 (2001)
  2. Shawe-Taylor, J., Cristianini, N.: Kernel Methods for Pattern Analysis. Cambridge University Press, Cambridge (2004)
  3. Li, S.Z.: Markov Random Field Modeling in Image Analysis. Springer, Heidelberg (2001)
  4. Kumar, S., Hebert, M.: Discriminative random fields: A discriminative framework for contextual interaction in classification. In: ICCV 2003, pp. 1150–1157 (2003)
  5. Besag, J.: On the statistical analysis of dirty pictures. Journal of the Royal Statistical Society, Series B 48(3), 259–302 (1986)
  6. Kumar, S., Hebert, M.: Discriminative fields for modeling spatial dependencies in natural images. In: NIPS (2003)
  7. Fletcher, R.: Practical Methods of Optimization. John Wiley & Sons, Chichester (1987)
  8. Platt, J.: Probabilistic outputs for support vector machines and comparisons to regularized likelihood methods. MIT Press, Cambridge (2000)
  9. Joachims, T.: Making large-scale SVM learning practical. In: Schölkopf, B., Burges, C., Smola, A. (eds.) Advances in Kernel Methods – Support Vector Learning. MIT Press, Cambridge (1999)
  10. Gering, D.: Recognizing Deviations from Normalcy for Brain Tumor Segmentation. PhD thesis, MIT, Cambridge (2003)
  11. Zhang, J., Ma, K., Er, M., Chong, V.: Tumor segmentation from magnetic resonance imaging by learning via one-class support vector machine. In: Int. Workshop on Advanced Image Technology, pp. 207–211 (2004)
  12. Garcia, C., Moreno, J.: Kernel based method for segmentation and modeling of magnetic resonance images. In: Lemaître, C., Reyes, C.A., González, J.A. (eds.) IBERAMIA 2004. LNCS (LNAI), vol. 3315, pp. 636–645. Springer, Heidelberg (2004)
  13. Statistical Parametric Mapping (online), http://www.fil.ion.bpmf.ac.uk/spm/
  14. Hellier, P., Ashburner, J., Corouge, I., Barillot, C., Friston, K.: Inter-subject registration of functional and anatomical data using SPM. In: Dohi, T., Kikinis, R. (eds.) MICCAI 2002. LNCS, vol. 2489, pp. 587–590. Springer, Heidelberg (2002)
  15. Ashburner, J.: Another MRI bias correction approach. In: 8th Int. Conf. on Functional Mapping of the Human Brain, Sendai, Japan (2002)

Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Chi-Hoon Lee (1)
  • Russell Greiner (1)
  • Mark Schmidt (1)
  1. Department of Computing Science, University of Alberta, Edmonton, AB, Canada