5/17/2023

Reflection transformation

In recent decades, numerous companies and institutions have built a great quantity of knowledge graphs, such as DBpedia, WordNet, Freebase and YAGO3. With the help of the powerful logical reasoning ability of knowledge graphs, many AI-related fields have achieved promising successes, such as question answering, recommendation, and zero-shot learning. However, although the facts stored in existing knowledge graphs have reached the scale of billions, knowledge graphs still suffer from incompleteness, as it is impossible to list all the facts in the world. Therefore, completing knowledge graphs by predicting new facts from existing ones has attracted extensive attention in recent years. Link prediction is the most popular task for knowledge graph completion. Knowledge graph embedding (KGE), which aims to learn low-dimensional representations for both relations and entities, is a powerful approach to link prediction. More recently, researchers have conducted extensive research on KGE for the link prediction task, and many KGE models have been proposed, such as TransE, RESCAL, and ConvE.

Existing conventional KGE methods usually interpret each relation as a distance from head to tail or as a tensor decomposition, and define a scoring function to measure whether a given triplet is true or not. For example, TransE and RotatE, which belong to the distance-based family, interpret each relation as a translation distance and a rotation distance from head to tail, respectively. RESCAL and DistMult, which belong to the bilinear family, regard each relation as a full matrix and a diagonal matrix, respectively. Another KGE family, which applies deep neural networks, has also achieved promising successes, such as ConvE and ConvR. As reported in prior studies, the performance of existing distance-based KGE models is heavily influenced by their capability to model the mapping properties and connectivity patterns of relations.
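To make the contrast between these families concrete, here is a minimal sketch of the scoring functions mentioned above: TransE (relation as a translation), DistMult (relation as a diagonal matrix, i.e. elementwise weights), and RotatE (relation as an elementwise rotation in the complex plane). The toy embedding values are illustrative assumptions, not values from the paper.

```python
import numpy as np

def transe_score(h, r, t):
    """TransE: relation as a translation; true triples give distance near 0."""
    return -np.linalg.norm(h + r - t)

def distmult_score(h, r, t):
    """DistMult: relation as a diagonal matrix, i.e. elementwise weights."""
    return float(np.sum(h * r * t))

def rotate_score(h, r, t):
    """RotatE: relation as an elementwise rotation in the complex plane.
    h and t are complex vectors; r holds rotation angles."""
    rotation = np.exp(1j * r)  # unit-modulus complex rotations
    return -np.linalg.norm(h * rotation - t)

# Toy 4-dimensional embeddings (hypothetical values, for illustration only).
h = np.array([0.1, 0.2, 0.3, 0.4])
r = np.array([0.5, -0.1, 0.0, 0.2])
t = h + r  # a "perfect" TransE triple by construction

print(transe_score(h, r, t))  # → 0.0, since t is exactly the translated head
```

A higher (less negative) score indicates a more plausible triple; training pushes observed triples toward high scores and corrupted ones toward low scores.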
Due to the incompleteness of knowledge graphs, knowledge graph embedding (KGE) has become a key technique for automatically predicting missing facts in a knowledge graph. KGE aims to learn low-dimensional representations for both relations and entities. Although existing KGE models have achieved state-of-the-art (SOTA) performance, modeling and inferring relation connectivity patterns (such as symmetry/antisymmetry, inversion, and composition), as well as predicting complex relations (such as M-to-1, 1-to-M and M-to-M), remain great challenges. In this paper, we propose a new KGE model called ReflectE, which regards each relation as the normal vector of a relation-specific reflection hyperplane. Specifically, ReflectE regards the tail entity (or head entity) in a triple as the reflection of the head entity (or tail entity) on a relation-specific hyperplane. Therefore it can naturally model symmetric and inverse relations by reflection transformation. Furthermore, ReflectE models complex relations by learning a relation-specific dynamic reflection hyperplane. To evaluate the effectiveness of our proposed model ReflectE, we choose previous SOTA models as baselines and conduct the link prediction task on three popular datasets. Experimental results show that, compared with conventional distance-based KGE models, ReflectE achieves SOTA results for link prediction.

A knowledge graph usually stores large-scale factual triplets in the form of (head, relation, tail), where the entities head and tail represent nodes in the knowledge graph, and relation represents the directed edge between the two entities. In this paper, for simplicity, we abbreviate the triple as (h, r, t).
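The core operation described above can be sketched as a Householder reflection: with relation vector r taken as the (unit) normal of a hyperplane, the tail is the mirror image of the head across that hyperplane. Because a reflection is its own inverse, applying it twice recovers the original vector, which is exactly the property that lets reflections model symmetric and inverse relations. The embedding values below are hypothetical, for illustration only.

```python
import numpy as np

def reflect(x, n):
    """Reflect vector x across the hyperplane with normal n
    (Householder reflection: x - 2 (x . n) n, with n normalized)."""
    n = n / np.linalg.norm(n)
    return x - 2.0 * np.dot(x, n) * n

# Toy head embedding and relation normal (hypothetical values).
h = np.array([1.0, 2.0, 3.0])
r = np.array([0.0, 1.0, 0.0])

t = reflect(h, r)    # tail as the reflection of the head
back = reflect(t, r) # reflecting the tail again recovers the head

print(np.allclose(back, h))  # → True: reflections are self-inverse
```

The self-inverse property means the same transformation maps h to t and t back to h, so a triple (h, r, t) and its symmetric counterpart (t, r, h) are scored identically without any extra parameters.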