%0 Journal Article
%1 Mulang2019-xa
%A Mulang, Isaiah Onando
%A Singh, Kuldeep
%A Vyas, Akhilesh
%A Shekarpour, Saeedeh
%A Vidal, Maria Esther
%A Lehmann, Jens
%A Auer, Sören
%D 2019
%I arXiv
%K
%T Encoding knowledge graph entity aliases in attentive neural network for Wikidata Entity Linking
%X Collaborative knowledge graphs such as Wikidata rely heavily on the crowd to author their information. Since the crowd is not bound to a standard protocol for assigning entity titles, the knowledge graph is populated with non-standard, noisy, long, or sometimes awkward titles. The issue of long, implicit, and non-standard entity representations is a challenge for Entity Linking (EL) approaches aiming for high precision and recall. The underlying KG is, in general, the source of target entities for EL approaches; however, it often contains other relevant information, such as aliases of entities (e.g., Obama and Barack Hussein Obama are aliases for the entity Barack Obama). EL models usually ignore such readily available entity attributes. In this paper, we examine the role of knowledge graph context in an attentive neural network approach for entity linking on Wikidata. Our approach contributes by exploiting sufficient context from a KG as a source of background knowledge, which is then fed into the neural network. This approach demonstrates merit in addressing challenges associated with entity titles (multi-word, long, implicit, case-sensitive). Our experimental study shows approximately 8\% improvement over the baseline approach and significantly outperforms an end-to-end approach for Wikidata entity linking.
@article{Mulang2019-xa,
abstract = {Collaborative knowledge graphs such as Wikidata rely heavily on the crowd to author their information. Since the crowd is not bound to a standard protocol for assigning entity titles, the knowledge graph is populated with non-standard, noisy, long, or sometimes awkward titles. The issue of long, implicit, and non-standard entity representations is a challenge for Entity Linking (EL) approaches aiming for high precision and recall. The underlying KG is, in general, the source of target entities for EL approaches; however, it often contains other relevant information, such as aliases of entities (e.g., Obama and Barack Hussein Obama are aliases for the entity Barack Obama). EL models usually ignore such readily available entity attributes. In this paper, we examine the role of knowledge graph context in an attentive neural network approach for entity linking on Wikidata. Our approach contributes by exploiting sufficient context from a KG as a source of background knowledge, which is then fed into the neural network. This approach demonstrates merit in addressing challenges associated with entity titles (multi-word, long, implicit, case-sensitive). Our experimental study shows approximately 8\% improvement over the baseline approach and significantly outperforms an end-to-end approach for Wikidata entity linking.},
added-at = {2024-09-10T11:56:37.000+0200},
author = {Mulang, Isaiah Onando and Singh, Kuldeep and Vyas, Akhilesh and Shekarpour, Saeedeh and Vidal, Maria Esther and Lehmann, Jens and Auer, S{\"o}ren},
biburl = {https://puma.scadsai.uni-leipzig.de/bibtex/2230f556f41ec3a1b91e58dc1bb55679d/scadsfct},
interhash = {68ac3ef726797b2637432cb1dbe73eb1},
intrahash = {230f556f41ec3a1b91e58dc1bb55679d},
keywords = {},
publisher = {arXiv},
timestamp = {2024-09-10T15:15:57.000+0200},
title = {Encoding knowledge graph entity aliases in attentive neural network for Wikidata Entity Linking},
year = 2019
}