
Adding Thesaurus Information into Probabilistic Topic Models

  • Natalia Loukachevitch
  • Michael Nokel
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10415)

Abstract

In this paper, we present an approach to introducing thesaurus information into probabilistic topic models. The main idea of the approach is based on the assumption that the frequencies of semantically related words and phrases that co-occur in the same texts should be enhanced, so that they contribute more to the topics found in these texts. The experiments demonstrate that a direct implementation of this idea using WordNet synonyms or direct relations leads to severe degradation of the initial model. However, a correction of the initial assumption improves the model and makes it better than the initial model on several measures. Adding ngrams in a similar manner further improves the model.
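
The core idea can be sketched in a few lines of code. The following is a minimal illustration, assuming NLTK's WordNet interface and gensim's LDA implementation; the boost scheme, the helper names, and the toy documents are hypothetical and do not reproduce the authors' exact procedure. Note also that the sketch implements the direct version of the enhancement, which the abstract reports degrades the model unless the assumption is corrected.

    # Sketch: boost in-document counts of co-occurring WordNet synonyms,
    # then train a standard LDA model on the modified counts.
    # Requires: nltk.download('wordnet') and gensim installed.
    from itertools import combinations

    from nltk.corpus import wordnet as wn
    from gensim import corpora, models

    def are_related(w1, w2):
        # True if the two words share at least one WordNet synset
        # (a crude stand-in for "semantically related" here).
        return bool(set(wn.synsets(w1)) & set(wn.synsets(w2)))

    def enhance_counts(doc_tokens, boost=1):
        # Raise the frequency of related words that co-occur in this document.
        counts = {}
        for tok in doc_tokens:
            counts[tok] = counts.get(tok, 0) + 1
        for w1, w2 in combinations(sorted(counts), 2):
            if are_related(w1, w2):
                counts[w1] += boost
                counts[w2] += boost
        return counts

    # Toy corpus (hypothetical); real experiments would use full texts.
    docs = [["bank", "river", "water"], ["bank", "money", "loan"]]
    dictionary = corpora.Dictionary(docs)
    bows = [[(dictionary.token2id[w], c) for w, c in enhance_counts(d).items()]
            for d in docs]
    lda = models.LdaModel(bows, num_topics=2, id2word=dictionary)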

Keywords

Thesaurus · Multiword expression · Probabilistic topic models

Acknowledgments

This work was partially supported by the Russian National Foundation, grant No. 16-18-02074.


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Lomonosov Moscow State University, Moscow, Russia
  2. Yandex, Moscow, Russia
