ATOM: AdapTive and OptiMized dynamic temporal knowledge graph construction using LLMs
Association for Computational Linguistics 2026, Ludovic Moncla, Khalid Benabdeslem, Rémy Cazabet, Pierre Cléau, Yassir Lairgi
Underline Science Inc.
Problems Identified (3)
Static KG temporal inflexibility: Traditional static knowledge graph construction overlooks dynamic, time-sensitive real-world data and limits adaptability to continuous change.
Few-shot KG extraction instability: Recent zero- or few-shot KG construction approaches can be unstable across multiple runs.
Incomplete fact coverage: Recent zero- or few-shot approaches can miss key facts.
Proposed Solutions (3)
ATOM TKG construction: ATOM is a few-shot, scalable approach that builds and continuously updates temporal knowledge graphs from unstructured text.
Atomic fact decomposition: ATOM splits input documents into minimal, self-contained atomic facts to improve extraction exhaustivity and stability.
Parallel atomic KG merging: ATOM derives atomic KGs from the atomic facts and merges them in parallel.
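The decompose–extract–merge pipeline described above can be sketched in a few lines. This is a hypothetical illustration, not ATOM's implementation: the paper uses an LLM for decomposition and extraction, whereas here a naive sentence split and a one-quadruple-per-fact extractor stand in for those calls, and the merge is a plain union of temporal quadruples (ATOM additionally resolves entities and relations across atomic KGs). All function names and the sample document are invented for the sketch.

```python
from concurrent.futures import ThreadPoolExecutor
from functools import reduce

def decompose(document: str) -> list[str]:
    # Stand-in for ATOM's LLM-based atomic fact decomposition:
    # split the document into minimal, self-contained statements.
    return [s.strip() for s in document.split(".") if s.strip()]

def extract_atomic_kg(fact: str, timestamp: str) -> set:
    # Stand-in for LLM extraction: emit one
    # (subject, relation, object, time) quadruple per atomic fact.
    words = fact.split()
    return {(words[0], " ".join(words[1:-1]), words[-1], timestamp)}

def merge(kg_a: set, kg_b: set) -> set:
    # In this sketch, merging atomic KGs is a union of quadruples;
    # ATOM's merge also deduplicates entities and relations.
    return kg_a | kg_b

doc = "Alice founded AcmeCorp. Bob joined AcmeCorp"
facts = decompose(doc)
with ThreadPoolExecutor() as pool:
    # Atomic KGs are independent, so extraction parallelizes cleanly.
    atomic_kgs = list(pool.map(lambda f: extract_atomic_kg(f, "2026"), facts))
tkg = reduce(merge, atomic_kgs, set())
```

Because each atomic fact is processed independently, new documents can be folded into the existing temporal KG with the same merge step, which is what makes the approach incremental and scalable.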
Results (3)
High exhaustivity and stability
Baseline outperformance
Low-latency scalable deployment
Research Domain
Temporal knowledge graph construction from unstructured text