Knowl. Based Syst. | 2021

Knowledge graph embedding by translating in time domain space for link prediction

Abstract

Knowledge graph embedding, which aims to learn distributed representations of entities and relations, has proven to be an effective method for predicting missing links in knowledge graphs. Existing knowledge graph embedding models generally assume that the spaces in which head and tail entities reside have the same properties. However, head and tail entities can be objects of different types, so they should not be embedded in vector spaces with identical properties. In this paper, we propose a novel knowledge graph embedding model called TimE, which represents each entry of a head (or tail) entity embedding as a point in time-domain space and the corresponding entry of the tail (or head) entity embedding as a point in frequency-domain space. Specifically, TimE defines each relation as a composite operation consisting of a translation between entities and a diagonal projection matrix that projects entities into the time-domain space. In addition, we propose a cross-operation to model inverse and symmetric relations. Experimental results show that TimE not only outperforms existing state-of-the-art models on several large-scale benchmark datasets for the link prediction task, but also better captures the diverse distributions of entity embeddings under different relation patterns, and can model all relation patterns (including symmetry/antisymmetry, inversion, and composition).
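To make the abstract's description of the relation operation concrete (a translation between entities combined with a diagonal projection matrix), the sketch below shows one way such a score could be computed. It is a minimal illustration under stated assumptions: the function name `time_score`, the L1 norm, applying the projection to the head entity, and the negative-distance scoring convention are all illustrative guesses rather than the paper's exact formulation, and the proposed cross-operation for inverse and symmetric relations is not modeled here.

```python
import numpy as np

def time_score(h, r_trans, r_diag, t):
    """Hypothetical TimE-style score for a triple (h, r, t).

    h, t    : entity embeddings, shape (d,)
    r_trans : relation translation vector, shape (d,)   (assumed name)
    r_diag  : diagonal of the relation projection matrix, shape (d,)

    Higher scores mean more plausible triples (negative-distance
    convention, as in TransE-family models); the paper's exact score
    may differ.
    """
    # Diagonal projection: the elementwise product equals diag(r_diag) @ h.
    h_proj = r_diag * h
    # Translate the projected head and measure its L1 distance to the tail.
    return -np.linalg.norm(h_proj + r_trans - t, ord=1)

# Toy usage with random embeddings of dimension 8.
rng = np.random.default_rng(0)
d = 8
h, t = rng.normal(size=d), rng.normal(size=d)
r_trans, r_diag = rng.normal(size=d), rng.normal(size=d)
print(time_score(h, r_trans, r_diag, t))
```

Representing the projection as a diagonal matrix keeps the per-relation parameter count linear in the embedding dimension, which is why the elementwise product above suffices in place of a full matrix multiplication.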

Volume 212
Pages 106564
DOI 10.1016/j.knosys.2020.106564
Language English
Journal Knowl. Based Syst.
