F-Measure Maximization in Multi-Label Classification with Conditionally Independent Label Subsets

  • Maxime Gasse
  • Alex Aussem
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9851)


We discuss a method to improve the exact F-measure maximization algorithm GFM, proposed in [2] for multi-label classification, under the assumption that the label set can be partitioned into conditionally independent subsets given the input features. If the labels were all independent, estimating only m parameters (m denoting the number of labels) would suffice to derive Bayes-optimal predictions in \(O(m^2)\) operations [10]. In the general case, GFM requires \(m^2 + 1\) parameters to solve the problem in \(O(m^3)\) operations. In this work, we show that the number of parameters can be reduced further, to \(m^2/n\) in the best case, when the label set can be partitioned into n conditionally independent subsets. As this label partition must be estimated from the data beforehand, we first use the procedure proposed in [4] to find such a partition, and then infer the required parameters locally within each label subset. These local parameters are aggregated and serve as input to GFM to form the Bayes-optimal prediction. A synthetic experiment shows that the reduction in the number of parameters brings significant performance benefits. The data and software related to this paper are available at
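For context, the GFM decision rule of [2], on which this work builds, can be sketched as follows. This is a minimal illustration, not the authors' implementation: it takes as input the \(m^2\) parameters \(P(y_i = 1, |y| = s)\) plus \(P(y = 0)\) and returns the prediction maximizing expected F1. The independent marginals `q` in the usage example are hypothetical, chosen small enough that the parameter matrix can be built exactly by enumeration.

```python
import itertools
import numpy as np

def gfm(P, p_zero):
    """Exact expected-F1 maximizer (GFM sketch, after Dembczynski et al. [2]).

    P:      (m, m) matrix with P[i, s-1] = P(y_i = 1, |y| = s).
    p_zero: P(y = 0), probability of the all-zero label vector.
    Returns (h, expected_F1)."""
    m = P.shape[0]
    # W[s-1, k-1] = 2 / (s + k): F1 weight when |y| = s and |h| = k.
    idx = np.arange(1, m + 1)
    W = 2.0 / (idx[:, None] + idx[None, :])
    Delta = P @ W  # Delta[i, k-1] = E[2 * y_i / (|y| + k)]
    # Start from the empty prediction: F1 = 1 iff y = 0, so E[F1] = p_zero.
    best_h, best_F = np.zeros(m, dtype=int), p_zero
    for k in range(1, m + 1):
        # For fixed |h| = k, E[F1] = sum of Delta[i, k-1] over predicted labels,
        # so the optimum takes the k labels with largest Delta.
        top = np.argsort(Delta[:, k - 1])[::-1][:k]
        F = Delta[top, k - 1].sum()
        if F > best_F:
            best_F = F
            best_h = np.zeros(m, dtype=int)
            best_h[top] = 1
    return best_h, best_F

# Usage: build P exactly for 3 independent labels with hypothetical marginals q.
q = np.array([0.8, 0.3, 0.1])
m = len(q)
P, p_zero = np.zeros((m, m)), 0.0
for y in itertools.product([0, 1], repeat=m):
    y = np.array(y)
    py = np.prod(np.where(y, q, 1 - q))  # P(y) under label independence
    s = int(y.sum())
    if s == 0:
        p_zero = py
    else:
        P[y == 1, s - 1] += py
h, F = gfm(P, p_zero)
```

Note that GFM itself only consumes the matrix P; the contribution sketched in the abstract concerns estimating P more economically, subset by subset, when the labels factorize into conditionally independent groups.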


Multi-label classification · F-measure · Bayes-optimal prediction · Label dependence



This work was funded by both the French state through the Nano 2017 investment program and the European Community through the European Nanoelectronics Initiative Advisory Council (ENIAC Joint Undertaking), under grant agreement no. 324271 (ENI.237.1.B2013).


  1. Dembczynski, K., Jachnik, A., Kotlowski, W., Waegeman, W., Hüllermeier, E.: Optimizing the F-measure in multi-label classification: plug-in rule approach versus structured loss minimization. In: ICML. JMLR Proceedings, vol. 28, pp. 1130–1138 (2013)
  2. Dembczynski, K., Waegeman, W., Cheng, W., Hüllermeier, E.: An exact algorithm for F-measure maximization. In: Shawe-Taylor, J., Zemel, R.S., Bartlett, P.L., Pereira, F.C.N., Weinberger, K.Q. (eds.) NIPS, pp. 1404–1412 (2011)
  3. Dembczynski, K., Waegeman, W., Cheng, W., Hüllermeier, E.: On label dependence and loss minimization in multi-label classification. Mach. Learn. 88(1–2), 5–45 (2012)
  4. Gasse, M., Aussem, A., Elghazel, H.: On the optimality of multi-label classification under subset zero-one loss for distributions satisfying the composition property. In: Bach, F.R., Blei, D.M. (eds.) ICML. JMLR Proceedings, vol. 37, pp. 2531–2539 (2015)
  5. Jansche, M.: A maximum expected utility framework for binary sequence labeling. In: Carroll, J.A., van den Bosch, A., Zaenen, A. (eds.) ACL. The Association for Computational Linguistics (2007)
  6. Luaces, O., Díez, J., Barranquero, J., del Coz, J.J., Bahamonde, A.: Binary relevance efficacy for multilabel classification. Prog. AI 1(4), 303–313 (2012)
  7. Smith, N., Tromble, R.: Sampling Uniformly from the Unit Simplex. Technical report, Johns Hopkins University (2004)
  8. Venables, W.N., Ripley, B.D.: Modern Applied Statistics with S, 4th edn. Springer, New York (2002). ISBN 0-387-95457-0
  9. Waegeman, W., Dembczynski, K., Jachnik, A., Cheng, W., Hüllermeier, E.: On the Bayes-optimality of F-measure maximizers. J. Mach. Learn. Res. 15(1), 3333–3388 (2014)
  10. Ye, N., Chai, K.M., Lee, W.S., Chieu, H.L.: Optimizing F-measure: a tale of two approaches. In: Langford, J., Pineau, J. (eds.) ICML 2012, pp. 289–296. Omnipress, New York (2012)

Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. LIRIS, UMR 5205, University of Lyon 1, Lyon, France
