
Adaptive knowledge distillation based structure-text embedding integrating for knowledge graph completion

2026 · model innovation · incremental · method

Chao (Conductor) Li, You Lv, Xiaolong Wei, Lianqiu Wei, Jianguang Zhang

PLoS ONE

https://doi.org/10.1371/journal.pone.0344363
OpenAlex: W7138853926
URLs Found: 3
Internal Citations: 0
Authors: 5
Abstract Quality: usable
GPT-5.5 Abstract Analysis

Problems Identified (2)

Structure-text integration for KGC: Knowledge graph completion requires effective integration of structural and description information to improve downstream utility and address complementary weaknesses.

Multi-semantic entity transfer gap: Existing embedding-level coupling approaches do not transfer multi-semantic entity knowledge learned from structure models to PLMs, limiting integration quality.

Proposed Solutions (2)

AKD-KGC: AKD-KGC adds an Adaptive Knowledge Distillation teaching-learning procedure during feature integration to transfer structural multi-semantic knowledge and enhance structure-text integration for KGC.

Structural-model-guided integration: The framework integrates structural and textual features at the embedding level while using structural models to guide the integration model's prediction behavior and supervise PLM weighting.
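The distillation-guided integration described in the two solutions above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the concatenate-and-project fusion, the temperature value, and the "adaptive" weight (the teacher's confidence on the gold entity) are all assumed details chosen to make the idea concrete.

```python
import numpy as np

def softmax(z, t=1.0):
    """Temperature-scaled softmax over a 1-D score vector."""
    z = np.asarray(z, dtype=float) / t
    z = z - z.max()  # stabilize the exponentials
    e = np.exp(z)
    return e / e.sum()

def fuse(struct_emb, text_emb, w):
    """Embedding-level integration: project the concatenated structural
    and textual (PLM) entity embeddings into a shared space."""
    return w @ np.concatenate([struct_emb, text_emb])

def akd_loss(student_scores, teacher_scores, gold, temperature=2.0, eps=1e-12):
    """Adaptive knowledge-distillation loss for one link-prediction query.

    Blends cross-entropy on the gold entity with a KL term that pulls the
    integration model's (student's) score distribution toward the structural
    teacher's. The blend weight alpha rises with the teacher's confidence on
    the gold entity -- one plausible 'adaptive' scheme, assumed here."""
    p_t = softmax(teacher_scores, temperature)
    p_s = softmax(student_scores, temperature)
    ce = -np.log(softmax(student_scores)[gold] + eps)
    alpha = float(p_t[gold])  # trust the teacher more when it ranks gold highly
    kd = (temperature ** 2) * float(np.sum(p_t * np.log((p_t + eps) / (p_s + eps))))
    return (1.0 - alpha) * ce + alpha * kd
```

A student that already agrees with a confident teacher incurs mostly the (small) KL term, while a weak teacher hands supervision back to the hard labels; the fused embedding would feed whatever scoring head the integration model uses.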

Results (2)

State-of-the-art KGC performance

Code and datasets released

Research Domain

Knowledge graph completion; knowledge graph embeddings; knowledge distillation
