Efficient Discovery of Expressive Multi-label Rules Using Relaxed Pruning

  • Yannik Klein
  • Michael Rapp (Email author)
  • Eneldo Loza Mencía
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11828)

Abstract

Being able to model correlations between labels is considered crucial in multi-label classification. Rule-based models make it possible to expose such dependencies, e.g., implications, subsumptions, or exclusions, in an interpretable and human-comprehensible manner. Although the number of possible label combinations increases exponentially with the number of available labels, it has been shown that rules with multiple labels in their heads, which are a natural form to model local label dependencies, can be induced efficiently by exploiting certain properties of rule evaluation measures and pruning the label search space accordingly. However, experiments have revealed that multi-label heads are unlikely to be learned by existing methods due to the restrictiveness of their pruning. To overcome this limitation, we propose a plug-in approach that relaxes the search space pruning used by existing methods in order to introduce a bias towards larger multi-label heads, resulting in more expressive rules. We further demonstrate the effectiveness of our approach empirically and show that it does not come with drawbacks in terms of training time or predictive performance.
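
To illustrate the idea behind relaxed pruning, the following Python sketch shows how a lift function that boosts the heuristic value of larger heads lets a greedy head search keep growing a multi-label head even when the raw evaluation score drops slightly. The names (lift, evaluate_head, find_best_head), the multiplicative lift, and the toy evaluation measure are illustrative assumptions, not the exact formulation from the paper.

```python
# Hypothetical sketch of relaxed pruning for multi-label rule heads.
# The lift definition and all function names are illustrative assumptions.

def lift(score: float, head_size: int, boost: float = 0.05) -> float:
    """Boost the heuristic value of larger heads to bias the search
    towards more expressive multi-label rules (assumed lift function)."""
    return score * (1.0 + boost * (head_size - 1))


def find_best_head(labels, evaluate_head, max_size=3, boost=0.05):
    """Greedy search over label combinations for a single rule body.

    evaluate_head(head) -> float is assumed to return the value of a
    rule evaluation measure for the given set of labels. Strict pruning
    would stop as soon as no extension improves the raw score; relaxed
    pruning compares *lifted* scores instead, so a slightly worse raw
    score is tolerated if the head grows larger.
    """
    best_head, best_lifted = None, float("-inf")
    current, candidates = [], list(labels)
    while candidates and len(current) < max_size:
        # Try to extend the current head by one more label.
        scored = [(evaluate_head(current + [l]), l) for l in candidates]
        score, label = max(scored)
        lifted = lift(score, len(current) + 1, boost)
        if lifted <= best_lifted:
            break  # relaxed pruning: even the lifted score cannot improve
        current.append(label)
        candidates.remove(label)
        best_head, best_lifted = list(current), lifted
    return best_head


if __name__ == "__main__":
    # Toy precision-like measure, made up for illustration only.
    coverage = {"A": 0.9, "B": 0.85, "C": 0.4}
    evaluate = lambda head: sum(coverage[l] for l in head) / len(head)
    print(find_best_head(["A", "B", "C"], evaluate))  # -> ['A', 'B']
```

In this toy example, a strict comparison on raw scores would stop after the single-label head {A} (0.875 < 0.9), whereas the lifted comparison accepts the larger head {A, B}, which is exactly the bias towards multi-label heads the relaxation is meant to introduce.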

Keywords

Multi-label classification · Rule learning · Label dependencies

Notes

Acknowledgments

This research was supported by the German Research Foundation (DFG) (grant number FU 580/11). Calculations for this research were conducted on the Lichtenberg high performance computer of the TU Darmstadt.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Yannik Klein (1)
  • Michael Rapp (1, Email author)
  • Eneldo Loza Mencía (1)

  1. Knowledge Engineering Group, TU Darmstadt, Darmstadt, Germany