ESWC 2017: The Semantic Web, pp. 337–352

Combining Word and Entity Embeddings for Entity Linking

  • Jose G. Moreno
  • Romaric Besançon
  • Romain Beaumont
  • Eva D’hondt
  • Anne-Laure Ligozat
  • Sophie Rosset
  • Xavier Tannier
  • Brigitte Grau
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10249)

Abstract

The correct identification of the link between an entity mention in a text and a known entity in a large knowledge base is an important step in both information retrieval and information extraction. The general approach for this task is to generate, for a given mention, a set of candidate entities from the base and, in a second step, determine which one is the best match. This paper proposes a novel method for the second step, based on the joint learning of embeddings for the words in the text and the entities in the knowledge base. By learning these embeddings in the same space, we arrive at a more conceptually grounded model that can be used for candidate selection based on the surrounding context. The relative improvement of this approach is experimentally validated on a recent benchmark corpus from the TAC-EDL 2015 evaluation campaign.
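As an informal illustration of the candidate-selection step described above (not the authors' exact model), the Python sketch below ranks candidate entities by cosine similarity between the averaged embedding of a mention's surrounding context words and each candidate's entity embedding, assuming words and entities have already been trained jointly in the same vector space. The function names, the "ENTITY/" key prefix, and the toy random vectors are illustrative assumptions, not part of the paper.

    import numpy as np

    def cosine(u, v):
        # Cosine similarity between two vectors (small epsilon avoids division by zero).
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

    def score_candidates(context_words, candidate_entities, embeddings):
        # Rank candidate entities for a mention by comparing each entity vector
        # to the averaged vector of the mention's context words.
        # `embeddings` is assumed to map both word tokens and entity identifiers
        # (e.g. "ENTITY/Paris") to vectors learned in the same space.
        context_vecs = [embeddings[w] for w in context_words if w in embeddings]
        if not context_vecs:
            return []
        context_vec = np.mean(context_vecs, axis=0)
        scored = [(e, cosine(context_vec, embeddings[e]))
                  for e in candidate_entities if e in embeddings]
        return sorted(scored, key=lambda x: x[1], reverse=True)

    # Toy usage with made-up vectors: choose the entity whose embedding best
    # matches the words around the mention "Paris".
    rng = np.random.default_rng(0)
    embeddings = {k: rng.normal(size=50) for k in
                  ["capital", "france", "seine", "ENTITY/Paris", "ENTITY/Paris_Hilton"]}
    print(score_candidates(["capital", "france", "seine"],
                           ["ENTITY/Paris", "ENTITY/Paris_Hilton"], embeddings))

In the actual system, such similarity scores would typically be combined with other evidence (e.g. mention-name matching or entity popularity) rather than used alone.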

Keywords

Entity linking · Linked data · Natural language processing and information retrieval

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Jose G. Moreno (1)
  • Romaric Besançon (2)
  • Romain Beaumont (3)
  • Eva D’hondt (3)
  • Anne-Laure Ligozat (3, 4)
  • Sophie Rosset (3)
  • Xavier Tannier (3, 5)
  • Brigitte Grau (3, 4)
  1. Université Paul Sabatier, IRIT, Toulouse, France
  2. CEA, LIST, Vision and Content Engineering Laboratory, Gif-sur-Yvette, France
  3. LIMSI, CNRS, Université Paris-Saclay, Orsay, France
  4. ENSIIE, Évry, France
  5. Univ. Paris-Sud, Orsay, France