A lightweight knowledge reasoning method for large-scale knowledge graphs
Han Zhang, Chengfang Zhang, B.L. Xu, Yuanbo Guo, Wenjun Zhou
The Journal of Supercomputing
Problems Identified (1)
GAT scalability limits for knowledge reasoning: Traditional Graph Attention Networks (GATs) scale poorly when applied to knowledge reasoning over very large graphs, and their computational cost imposes high-performance computing requirements.
Proposed Solutions (3)
lightweight GAT for knowledge reasoning: The paper proposes a lightweight GAT model for knowledge reasoning.
inverse-index subgraph construction: The method uses inverse (inverted) index queries to select relevant triples from a large knowledge base and construct a target subgraph, enabling efficient node representation learning.
distillation and quantization lightweighting: The method integrates multi-level knowledge distillation and post-training parameter quantization to improve GAT computational efficiency while reducing parameter count and memory usage.
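The inverse-index subgraph construction step can be illustrated with a minimal sketch. The paper does not publish its implementation, so the entity names, the `build_subgraph` helper, and the hop-based expansion strategy below are illustrative assumptions, not the authors' code:

```python
from collections import defaultdict

# Toy knowledge base of (head, relation, tail) triples; entity names
# are hypothetical examples, not taken from the paper's datasets.
triples = [
    ("Paris", "capital_of", "France"),
    ("France", "located_in", "Europe"),
    ("Berlin", "capital_of", "Germany"),
    ("Germany", "located_in", "Europe"),
]

# Inverted index: entity -> ids of triples that mention it,
# so triples around a query entity are found without a full scan.
index = defaultdict(set)
for i, (h, _, t) in enumerate(triples):
    index[h].add(i)
    index[t].add(i)

def build_subgraph(seeds, hops=1):
    """Select triples around the seed entities via inverted-index lookups."""
    frontier, selected = set(seeds), set()
    for _ in range(hops):
        hit = set()
        for entity in frontier:
            hit |= index.get(entity, set())
        new = hit - selected
        selected |= new
        # Expand the frontier with entities from the newly selected triples.
        frontier = {x for i in new for x in (triples[i][0], triples[i][2])}
    return [triples[i] for i in sorted(selected)]

print(build_subgraph(["Paris"], hops=2))
```

The resulting small subgraph, rather than the full knowledge base, would then be fed to the GAT for node representation learning.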
Results (3)
improved real-time computing efficiency
superior benchmark performance
reduced model size and memory usage
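The reduced model size stems largely from post-training parameter quantization. As a rough sketch of why this shrinks memory, the following shows symmetric int8 quantization of a float weight vector; the `quantize_int8` helper and the sample weights are illustrative assumptions, not the paper's scheme:

```python
def quantize_int8(weights):
    """Map float weights to int8 codes in [-127, 127] plus one float scale."""
    scale = max(abs(w) for w in weights) / 127.0
    codes = [round(w / scale) for w in weights]
    return codes, scale

def dequantize(codes, scale):
    """Recover approximate float weights from the int8 codes."""
    return [c * scale for c in codes]

w = [0.4, -1.27, 0.03, 1.27]
codes, scale = quantize_int8(w)
# Each weight now needs 1 byte instead of 4 (float32): roughly 4x smaller,
# at the cost of a small rounding error on each value.
print(codes, scale)
```

Applied after training, this kind of quantization trades a small accuracy loss for the memory and latency reductions the results report.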
Research Domain
knowledge graph reasoning; graph neural networks