Combining Simple Discriminators for Object Discrimination

  • Shyjan Mahamud
  • Martial Hebert
  • John Lafferty
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2352)

Abstract

We propose to combine simple discriminators for object discrimination under the maximum entropy framework or, equivalently, under the maximum likelihood framework for exponential families. The duality between the two frameworks allows us to relate two selection criteria for the discriminators that have been proposed in the literature. We illustrate our approach by combining nearest-prototype discriminators, which are simple to implement and widely applicable since they can be constructed in any feature space equipped with a distance function. For efficient run-time performance we adapt the work on “alternating trees” to multi-class discrimination tasks. We report results on a multi-class discrimination task in which combining discriminators from a variety of easy-to-construct feature spaces under our framework yields significant gains in performance.
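To make the combination concrete, the sketch below shows one way nearest-prototype discriminators from several feature spaces could be combined under an exponential-family (equivalently, maximum-entropy) model. This is a minimal illustrative sketch, not the authors' implementation: the ±1 feature encoding, the toy prototypes, and the hand-fixed weights `lambdas` are assumptions made for illustration; in the paper the weights are fit by maximum likelihood, and evaluation is organized with alternating trees for run-time efficiency.

```python
# Minimal sketch (not the authors' code): combining nearest-prototype
# discriminators under an exponential-family model
#     p(y | x)  ∝  exp( sum_j lambda_j * f_j(x, y) ),
# where feature f_j(x, y) is +1 if the prototype nearest to x under the
# j-th feature space's distance carries label y, and -1 otherwise.
# The weights lambda_j would normally be fit by maximum likelihood
# (equivalently, maximum entropy); here they are fixed by hand.
import numpy as np

def nearest_prototype_feature(x, prototypes, labels, dist):
    """Return f(x, y) for every class: +1 for the nearest prototype's label, -1 elsewhere."""
    distances = np.array([dist(x, p) for p in prototypes])
    winner = labels[np.argmin(distances)]
    classes = np.unique(labels)
    return np.where(classes == winner, 1.0, -1.0)  # one entry per class

def combined_posterior(x, discriminators, lambdas):
    """Exponential-family combination of the per-feature-space discriminators."""
    score = sum(lam * nearest_prototype_feature(x, protos, labels, dist)
                for lam, (protos, labels, dist) in zip(lambdas, discriminators))
    p = np.exp(score - score.max())  # numerically stabilised softmax
    return p / p.sum()

# Toy usage: the same two-class problem seen through two feature spaces,
# each defined only by a distance function (Euclidean and L1 here).
euclid = lambda a, b: np.linalg.norm(a - b)
l1 = lambda a, b: np.abs(a - b).sum()
protos = np.array([[0.0, 0.0], [1.0, 1.0]])
labels = np.array([0, 1])
discriminators = [(protos, labels, euclid), (protos, labels, l1)]
print(combined_posterior(np.array([0.2, 0.1]), discriminators, lambdas=[0.7, 0.3]))
```

Note that each additional discriminator requires only a distance function on its feature space, which is what makes nearest-prototype discriminators so easy to construct in practice.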

Keywords

Feature Space · Training Image · Voronoi Diagram · Image Space · Exponential Family

References

  1. Chen, S., Rosenfeld, R.: A Survey of Smoothing Techniques for ME Models. IEEE Transactions on Speech and Audio Processing 8(1) (2000)
  2. Freund, Y., Schapire, R.E.: A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting. Journal of Computer and System Sciences 55(1) (1997) 119–139
  3. Freund, Y., Mason, L.: The Alternating Decision Tree Algorithm. Proc. ICML (1999) 124–133
  4. Haralick, R.M.: Statistical and Structural Approaches to Texture. Proc. 4th Intl. Joint Conf. on Pattern Recognition (1979) 45–60
  5. Huttenlocher, D., Klanderman, G., Rucklidge, W.: Comparing Images Using the Hausdorff Distance. IEEE Transactions on Pattern Analysis and Machine Intelligence 15(9) (1993) 850–863
  6. Jaynes, E.T.: Information Theory and Statistical Mechanics. Physical Review 106 (1957) 620–630
  7. Lebanon, G., Lafferty, J.: Boosting and Maximum Likelihood for Exponential Models. Advances in Neural Information Processing Systems 14 (2001)
  8. Della Pietra, S., Della Pietra, V., Lafferty, J.: Inducing Features of Random Fields. IEEE Transactions on Pattern Analysis and Machine Intelligence 19(4) (1997)
  9. Schapire, R.E., Singer, Y.: Improved Boosting Algorithms Using Confidence-Rated Predictions. Machine Learning 37(3) (1999) 297–336
  10. Zhu, S.C., Wu, Y., Mumford, D.: Filters, Random Fields and Maximum Entropy (FRAME). International Journal of Computer Vision 27(2) (1998) 1–20

Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Shyjan Mahamud (1)
  • Martial Hebert (1)
  • John Lafferty (1)

  1. Dept. of Computer Science, Carnegie Mellon University, Pittsburgh
