Adaptive knowledge distillation based structure-text embedding integrating for knowledge graph completion
Chao Li, You Lv, Xiaolong Wei, Lianqiu Wei, Jianguang Zhang
PLoS ONE
Problems Identified (2)
Structure-text integration for KGC: Knowledge graph completion requires effective integration of structural and description information, since the two sources have complementary weaknesses and better integration improves downstream utility.
Multi-semantic entity transfer gap: Existing embedding-level coupling approaches do not transfer the multi-semantic entity knowledge learned by structure models to PLMs, limiting integration quality.
Proposed Solutions (2)
AKD-KGC: AKD-KGC adds an Adaptive Knowledge Distillation teaching-learning procedure during feature integration to transfer structural multi-semantic knowledge and enhance structure-text integration for KGC.
Structural-model-guided integration: The framework integrates structural and textual features at the embedding level while using structural models to guide the integration model’s prediction behavior and supervise PLM weighting.
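The teaching-learning idea above can be illustrated with a minimal distillation-loss sketch: a structural teacher's soft predictions supervise the text-based student, with a per-sample weight derived from teacher confidence standing in for the adaptive component. All function names and the confidence-based weighting here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, T=1.0):
    # Temperature-scaled softmax along the last axis.
    z = x / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def adaptive_kd_loss(teacher_logits, student_logits, T=2.0):
    """Per-sample KL(teacher || student) at temperature T, weighted by the
    teacher's confidence (its max probability) so that reliable structural
    predictions teach the PLM-based student more strongly.
    Hypothetical sketch -- not AKD-KGC's exact objective."""
    p = softmax(teacher_logits, T)   # teacher soft targets (structure model)
    q = softmax(student_logits, T)   # student soft predictions (text model)
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1)
    w = softmax(teacher_logits, 1.0).max(axis=-1)  # adaptive per-sample weight
    return float((w * kl).mean() * T * T)          # T^2 rescales gradients
```

When the student matches the teacher the loss is zero; it grows as their candidate-entity distributions diverge, with confident teacher predictions contributing more.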
Results (2)
State-of-the-art KGC performance:
Code and datasets released:
Research Domain
Knowledge graph completion; knowledge graph embeddings; knowledge distillation