News Release

Exploring & exploiting high-order graph structure for sparse knowledge graph completion

Peer-Reviewed Publication

Higher Education Press

The proposed framework


Credit: Tao HE, Ming LIU, Yixin CAO, Zekun WANG, Zihao ZHENG, Bing QIN

Highly sparse knowledge graphs (KGs) pose a key challenge for the Knowledge Graph Completion (KGC) task. Because of this sparsity, entities lack enough first-order neighbors from which to learn their features, which causes a significant performance drop on KGC.

To address this problem, a research team led by Bing Qin published new research on 15 Feb 2025 in Frontiers of Computer Science, co-published by Higher Education Press and Springer Nature.

The team proposed a method that exploits higher-order neighbor subgraphs to compensate for the sparsity of the graphs. Compared with existing methods, it is the first to tackle the problem from the perspective of mining higher-order neighbors.

In this study, they first use reinforcement learning to mine query-related reasoning paths and then fully exploit these paths as high-order graph structure.
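The path-mining step can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: a trained RL policy would choose which outgoing edge to follow at each hop, whereas this sketch substitutes a random policy, and all function and variable names are hypothetical.

```python
# Sketch only: random walks stand in for an RL-learned policy that
# would pick query-relevant outgoing edges at each hop.
import random
from collections import defaultdict

def mine_paths(triples, head, max_hops=3, n_paths=5, seed=0):
    """Sample multi-hop reasoning paths starting from `head`."""
    rng = random.Random(seed)
    out_edges = defaultdict(list)
    for h, r, t in triples:
        out_edges[h].append((r, t))
    paths = []
    for _ in range(n_paths):
        node, path = head, []
        for _ in range(max_hops):
            if not out_edges[node]:  # dead end: stop this walk
                break
            r, t = rng.choice(out_edges[node])  # RL policy would act here
            path.append((node, r, t))
            node = t
        if path:
            paths.append(path)
    return paths

# Toy KG: two hops connect Alice to France.
kg = [("Alice", "born_in", "Paris"), ("Paris", "capital_of", "France")]
for p in mine_paths(kg, "Alice"):
    print(p)
```

In the paper's setting, the reward signal would score whether a walk answers the query, so the agent learns to prefer query-related paths rather than arbitrary ones.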

To this end, they propose two strategies for utilizing the higher-order structure during training. First, each mined reasoning path is compressed into a virtual edge to densify the sparse neighborhood, and the relation of each virtual edge is encoded from the relation sequence along the corresponding path; this alleviates the impact of high sparsity on Graph Neural Network (GNN) models. Second, the reasoning paths are generalized into rules and fed into a Markov Logic Network, from which the logical knowledge of the rules is distilled via a variational EM algorithm to guide the GNN model. The two strategies complement each other, and their benefits can be stacked. Experimental results demonstrate the effectiveness of the proposed framework on sparse KGs.
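The first strategy, compressing a reasoning path into a virtual edge, might look like this in miniature. This is an assumption-laden sketch, not the paper's code: `compress_path` and its string-joined relation label are stand-ins for the learned encoder that maps the relation sequence to a virtual-edge representation.

```python
# Sketch only: collapse a mined multi-hop path into one virtual edge.
# A real model would encode the relation sequence (e.g. with an RNN)
# into a vector; here a composite string label stands in for it.
from typing import List, Tuple

Triple = Tuple[str, str, str]  # (head, relation, tail)

def compress_path(path: List[Triple]) -> Triple:
    """Return a virtual edge linking the path's endpoints directly."""
    head = path[0][0]
    tail = path[-1][2]
    virtual_relation = "->".join(r for _, r, _ in path)
    return (head, virtual_relation, tail)

# A 2-hop path becomes one dense first-order neighbor edge.
path = [("Alice", "born_in", "Paris"), ("Paris", "capital_of", "France")]
print(compress_path(path))  # ('Alice', 'born_in->capital_of', 'France')
```

Adding such virtual edges gives sparse entities extra first-order neighbors, which is exactly what the GNN is missing on a sparse KG.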

DOI: 10.1007/s11704-023-3521-y

