
A lightweight knowledge reasoning method for large-scale knowledge graphs

2026 · model innovation · incremental · method

Han Zhang, Chengfang Zhang, B.L. Xu, Yuanbo Guo, Wenjun Zhou

The Journal of Supercomputing

https://doi.org/10.1007/s11227-026-08447-z
OpenAlex: W7143712393
URLs Found: 1
Internal Citations: 0
Authors: 5
Abstract Quality: usable
GPT-5.5 Abstract Analysis

Problems Identified (1)

GAT scalability limits for knowledge reasoning: Traditional Graph Attention Networks scale poorly to knowledge reasoning over very large graph data and impose heavy high-performance-computing requirements.

Proposed Solutions (3)

lightweight GAT for knowledge reasoning: The paper proposes a lightweight GAT model for knowledge reasoning.

inverse-index subgraph construction: The method uses inverse-index queries to select relevant triples from a large knowledge base and construct a target subgraph, so node representation learning runs on a small subgraph rather than the full graph.

distillation and quantization lightweighting: The method combines multi-level knowledge distillation with post-training parameter quantization to improve GAT computational efficiency while reducing parameter count and memory usage.
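The inverse-index subgraph construction step can be sketched as follows. The paper does not give its implementation, so the toy triples, the `build_subgraph` helper, and the one-hop expansion policy are all illustrative assumptions: an inverted index maps each entity to the triples that mention it, and a query expands from seed entities to pull out only the relevant subgraph.

```python
from collections import defaultdict

# Hypothetical toy knowledge base of (head, relation, tail) triples.
triples = [
    ("paris", "capital_of", "france"),
    ("france", "located_in", "europe"),
    ("berlin", "capital_of", "germany"),
    ("germany", "located_in", "europe"),
    ("paris", "has_landmark", "eiffel_tower"),
]

# Inverted index: entity -> ids of triples that mention it.
index = defaultdict(set)
for i, (h, _, t) in enumerate(triples):
    index[h].add(i)
    index[t].add(i)

def build_subgraph(seed_entities, hops=1):
    """Expand from the seed entities for `hops` rounds via the inverted
    index and return the induced subgraph as a list of triples."""
    frontier = set(seed_entities)
    selected = set()
    for _ in range(hops):
        hit = set()
        for e in frontier:
            hit |= index.get(e, set())
        new = hit - selected
        selected |= new
        # Next frontier: entities introduced by the newly selected triples.
        frontier = {x for i in new for x in (triples[i][0], triples[i][2])}
    return [triples[i] for i in sorted(selected)]

sub = build_subgraph({"paris"})
```

A downstream GAT would then learn node representations only on `sub`, which is the point of the selection step: the attention computation never touches triples outside the queried neighborhood.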

Results (3)

improved real-time computing efficiency

superior benchmark performance

reduced model size and memory usage
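The reduced model size comes partly from post-training parameter quantization. The paper's exact scheme is not described here, so the following is a minimal sketch assuming symmetric per-tensor int8 quantization, which stores weights in a quarter of the float32 footprint at the cost of a bounded rounding error:

```python
import numpy as np

def quantize_int8(w):
    """Symmetric post-training quantization of a float32 weight tensor
    to int8 with a single per-tensor scale."""
    m = float(np.max(np.abs(w)))
    scale = m / 127.0 if m > 0 else 1.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximate float32 tensor for inference."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((64, 64)).astype(np.float32)  # mock GAT weight
q, s = quantize_int8(w)
w_hat = dequantize(q, s)
# int8 storage is 4x smaller than float32, and per-element error
# is bounded by half a quantization step (0.5 * s).
```

Per-tensor symmetric quantization is the simplest variant; per-channel scales or the paper's multi-level distillation would further shape the accuracy/size trade-off.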

Research Domain

knowledge graph reasoning; graph neural networks
