Enriching Topic Models with DBpedia

  • Alexandru Todor
  • Wojciech Lukasiewicz
  • Tara Athan
  • Adrian Paschke
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10033)

Abstract

Traditional topic modeling approaches consider only the words in a document. By using an entity-topic modeling approach and including background knowledge about the entities, such as a person's occupation, an organization's location, or a musician's band, we can cluster related documents together more effectively and produce semantic topic models that can be represented in a knowledge base. In our approach we first reduce each text document to a set of entities and then enrich this set with background knowledge from DBpedia. Topic modeling is performed on the enriched set of entities, and various feature combinations are evaluated to determine which combination achieves the best classification precision or perplexity compared to word-based topic models alone.
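
As a rough illustration of the pipeline the abstract describes, the sketch below reduces each document to a set of linked entity URIs, enriches that set with background facts fetched from DBpedia over SPARQL (rdf:type is used as the example feature), and runs standard LDA over the enriched entity bags. This is a minimal sketch under stated assumptions, not the authors' implementation: the endpoint, the placeholder entities, the chosen property, and the use of SPARQLWrapper and gensim are all illustrative choices.

# Minimal sketch: enrich linked entities with DBpedia facts, then run LDA.
from SPARQLWrapper import SPARQLWrapper, JSON
from gensim import corpora, models

def enrich_entity(entity_uri):
    """Fetch background facts for one entity; rdf:type serves as an example feature."""
    sparql = SPARQLWrapper("https://dbpedia.org/sparql")  # assumed public endpoint
    sparql.setQuery(f"""
        SELECT ?type WHERE {{
            <{entity_uri}> a ?type .
            FILTER(STRSTARTS(STR(?type), "http://dbpedia.org/ontology/"))
        }}""")
    sparql.setReturnFormat(JSON)
    return [b["type"]["value"]
            for b in sparql.query().convert()["results"]["bindings"]]

# Each document has already been reduced to its linked entities
# (e.g. with DBpedia Spotlight); the entities below are placeholders.
docs_as_entities = [
    ["http://dbpedia.org/resource/Angela_Merkel",
     "http://dbpedia.org/resource/Berlin"],
    ["http://dbpedia.org/resource/Miles_Davis",
     "http://dbpedia.org/resource/Jazz"],
]

# Append each entity's background facts to its document's entity set.
enriched_docs = [ents + [t for e in ents for t in enrich_entity(e)]
                 for ents in docs_as_entities]

# Standard LDA over the enriched entity bags instead of raw words.
dictionary = corpora.Dictionary(enriched_docs)
corpus = [dictionary.doc2bow(doc) for doc in enriched_docs]
lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary, passes=10)
print(lda.print_topics())

Swapping rdf:type for other DBpedia properties (occupation, location, and so on), alone or in combination, would produce the different feature combinations whose classification precision and perplexity the paper compares.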

Keywords

Topic Model · Latent Dirichlet Allocation · Feature Combination · Cluster Accuracy · Unique Entity

Acknowledgments

This work has been partially supported by the "InnoProfile-Transfer Corporate Smart Content" project funded by the German Federal Ministry of Education and Research (BMBF) and the BMBF Innovation Initiative for the New German Länder - Entrepreneurial Regions.

Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  • Alexandru Todor (1)
  • Wojciech Lukasiewicz (1)
  • Tara Athan (1)
  • Adrian Paschke (1)
  1. Institute for Computer Science, Freie Universität Berlin, AG Corporate Semantic Web, Berlin, Germany
