Entropy and Margin Maximization for Structured Output Learning

  • Patrick Pletscher
  • Cheng Soon Ong
  • Joachim M. Buhmann
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6323)

Abstract

We consider the problem of training discriminative structured output predictors, such as conditional random fields (CRFs) and structured support vector machines (SSVMs). A generalized loss function is introduced, which jointly maximizes the entropy and the margin of the solution. The CRF and SSVM emerge as special cases of our framework. The probabilistic interpretation of large margin methods reveals insights about margin and slack rescaling. Furthermore, we derive the corresponding extensions for latent variable models, in which training operates on partially observed outputs. Experimental results for multiclass classification, linear-chain models, and multiple instance learning demonstrate that the generalized loss can improve the accuracy of the resulting classifiers.
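
The abstract does not spell out the form of the generalized loss, so the following is only a minimal illustrative sketch for the multiclass case. It assumes one common way to interpolate between the CRF log-loss and the SSVM hinge loss: a margin-augmented log-sum-exp controlled by a hypothetical inverse-temperature parameter `beta` (with `beta = 1` recovering a softmax-margin log-loss and large `beta` approaching the margin-rescaled hinge loss). The function name and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def generalized_loss(scores, y, beta=1.0, margin=1.0):
    """Illustrative multiclass interpolation between log-loss and hinge loss.

    scores : array of shape (K,) with per-class model scores <w, phi(x, y')>.
    y      : index of the true class.
    beta   : assumed inverse temperature; beta = 1 gives a softmax-margin
             log-loss (CRF-like), beta -> infinity approaches the
             margin-rescaled structured hinge loss (SSVM-like).
    margin : loss term Delta(y, y') added to the scores of competing classes.
    """
    K = scores.shape[0]
    delta = np.full(K, float(margin))
    delta[y] = 0.0  # no margin requirement against the true label itself
    augmented = beta * (scores + delta)
    # Numerically stable log-sum-exp over margin-augmented scores.
    m = augmented.max()
    logsumexp = m + np.log(np.exp(augmented - m).sum())
    return logsumexp / beta - scores[y]

# Example: beta = 1 behaves like a softmax-margin log-loss; beta = 50 is close
# to max_y'(score(y') + Delta(y, y')) - score(y), the structured hinge loss.
scores = np.array([1.2, 0.3, -0.5])
print(generalized_loss(scores, y=0, beta=1.0))
print(generalized_loss(scores, y=0, beta=50.0))
```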

Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Patrick Pletscher¹
  • Cheng Soon Ong¹
  • Joachim M. Buhmann¹

  1. Department of Computer Science, ETH Zürich, Switzerland