2019 International Joint Conference on Neural Networks (IJCNN) | 2019

Enhance knowledge graph embedding via fake triples

 
 
 
 

Abstract


Embedding knowledge graphs (KGs) into continuous vector spaces is a focus of current research. Although previous works have achieved great success, most of them are based on the closed-world assumption: they only utilize knowledge that explicitly exists in KGs and ignore implicit knowledge. This paper tries to improve the performance of KG embedding models by adding implicit knowledge to them. We observe that when an entity occurs only as the head or only as the tail of all triples in a KG, its connectivity with other entities is limited, and its inverse triples can serve as implicit knowledge to enrich that connectivity; with this implicit knowledge, KG embedding models can learn more accurate semantics of the entities and relations in the KG. For this purpose, we introduce 'fake triples' (triples that theoretically should exist but do not appear in the knowledge graph explicitly) via dummy relations for zero in-degree and zero out-degree entities, enriching their connectivity and further improving the embedding models' performance. Extensive experiments on the entity alignment task and the link prediction task show that our approach achieves good results. On entity alignment, the entity alignment model with fake triples (EA+F) outperforms a number of state-of-the-art entity alignment models. On link prediction, our method achieves a better Mean Rank on FB15K. Our work shows that the performance of knowledge graph embedding can be improved by careful analysis of the dataset rather than by designing complex models.
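The abstract only sketches the construction, but the core idea can be illustrated with a minimal example. The sketch below (function name, dummy-relation naming scheme, and toy KG are illustrative assumptions, not the authors' code) adds an inverse "fake triple" via a dummy relation for each triple involving a zero in-degree entity (one that never appears as a tail) or a zero out-degree entity (one that never appears as a head):

```python
def add_fake_triples(triples):
    """Augment a KG with fake inverse triples for entities whose
    connectivity is one-sided (zero in-degree or zero out-degree).

    triples: list of (head, relation, tail) tuples.
    Returns the original triples plus the generated fake triples.
    """
    heads = {h for h, _, _ in triples}
    tails = {t for _, _, t in triples}
    zero_out_degree = tails - heads  # entities that never occur as a head
    zero_in_degree = heads - tails   # entities that never occur as a tail

    fakes = []
    for h, r, t in triples:
        if t in zero_out_degree or h in zero_in_degree:
            # Dummy inverse relation: the "_inv" suffix is an assumption.
            fakes.append((t, r + "_inv", h))
    return triples + fakes


# Toy KG: "Paris" has zero in-degree, "EU" has zero out-degree.
kg = [("Paris", "capital_of", "France"),
      ("France", "member_of", "EU")]
augmented = add_fake_triples(kg)
# Adds ("France", "capital_of_inv", "Paris") and
# ("EU", "member_of_inv", "France"), so both entities now appear
# on both sides of some triple.
```

The augmented triple set can then be fed to any standard embedding model (e.g., TransE) unchanged, since the fake triples are ordinary (head, relation, tail) tuples over an enlarged relation vocabulary.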

Pages 1-7
DOI 10.1109/IJCNN.2019.8852374
Language English
Journal 2019 International Joint Conference on Neural Networks (IJCNN)
