
Multi-label classification of legislative contents with hierarchical label attention networks

Published in the International Journal on Digital Libraries.

Abstract

EuroVoc is a thesaurus maintained by the Publications Office of the European Union and used to describe and index legislative documents. The EuroVoc concepts are organized in a hierarchical structure comprising 21 domains, 127 micro-thesauri, and more than 6,700 detailed descriptors. The large number of concepts in the thesaurus makes the manual classification of legal documents highly costly. To facilitate this classification work, we present two main contributions. The first is a hierarchical deep learning model for classifying legal documents according to the EuroVoc thesaurus. Instead of training a separate classifier for each hierarchy level, our model predicts the three levels of the EuroVoc thesaurus simultaneously. Our second contribution is a new legal corpus for evaluating the classification of documents written in Portuguese. This corpus, named EUR-Lex PT, contains more than 220k documents labeled under the three EuroVoc hierarchical levels. Comparative experiments with other state-of-the-art models indicate that our approach achieves competitive results while also allowing predictions to be interpreted through attention weights.
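To make the multi-level prediction setup concrete, below is a minimal PyTorch sketch of a document encoder with word-level attention and three independent sigmoid heads, one per EuroVoc level (domains, micro-thesauri, descriptors). All layer names, dimensions, and hyperparameters are illustrative assumptions and not the architecture reported in the article; the sketch only shows how a single encoder can feed simultaneous multi-label outputs for the three levels while exposing attention weights for interpretation.

```python
# Illustrative sketch (not the authors' code): one encoder, three sigmoid heads,
# with attention weights returned so predictions can be inspected.
import torch
import torch.nn as nn

class ThreeLevelLabelClassifier(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, hidden=256,
                 n_domains=21, n_micro=127, n_descriptors=6700):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hidden, batch_first=True,
                              bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)           # word-level attention scores
        self.head_domain = nn.Linear(2 * hidden, n_domains)
        self.head_micro = nn.Linear(2 * hidden, n_micro)
        self.head_descriptor = nn.Linear(2 * hidden, n_descriptors)

    def forward(self, token_ids):
        states, _ = self.encoder(self.embed(token_ids))    # (batch, seq, 2*hidden)
        weights = torch.softmax(self.attn(states), dim=1)  # attention over tokens
        doc = (weights * states).sum(dim=1)                # weighted document vector
        # Independent sigmoids allow several labels per level (multi-label setting).
        return (torch.sigmoid(self.head_domain(doc)),
                torch.sigmoid(self.head_micro(doc)),
                torch.sigmoid(self.head_descriptor(doc)),
                weights)                                   # weights support interpretation

model = ThreeLevelLabelClassifier(vocab_size=50000)
p_dom, p_mic, p_desc, attn = model(torch.randint(0, 50000, (2, 120)))
print(p_dom.shape, p_mic.shape, p_desc.shape)  # (2, 21) (2, 127) (2, 6700)
```

At inference time, each level's probabilities can be thresholded independently, and the returned attention weights indicate which tokens contributed most to the document representation.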






Acknowledgements

This research was partially supported by Portuguese national funds through Fundação para a Ciência e a Tecnologia (FCT) under references UIDB/50021/2020 and SFRH/BD/145561/2019. We also gratefully acknowledge NVIDIA Corporation for the donation of the Titan Xp GPU used in our experiments, and Imprensa Nacional-Casa da Moeda (INCM). Finally, we thank Miguel Won for his contribution to the previous publication from which this article was derived.

Author information

Corresponding author

Correspondence to Danielle Caled.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Caled, D., Silva, M.J., Martins, B. et al. Multi-label classification of legislative contents with hierarchical label attention networks. Int J Digit Libr 23, 77–90 (2022). https://doi.org/10.1007/s00799-021-00307-w

