Confidential — Stefan Michaelcheck Only

An Interpretable Few-Shot Text Classification Model Based on Graph Neural Networks and Knowledge Graphs

Tags: 2026 · model innovation · novel method

Zixuan Zhou (周子璇)

Computer Science and Application

DOI: https://doi.org/10.12677/csa.2026.161010
OpenAlex: W7123884091
URLs Found: 1
Internal Citations: 0
Authors: 1
Abstract Quality: usable
GPT-5.5 Abstract Analysis

Problems Identified (2)

few-shot data scarcity: Existing few-shot text classification methods must learn from very few labeled examples, creating a data bottleneck that limits model performance.

insufficient model interpretability: Existing few-shot text classification methods offer little insight into why a particular classification decision was made.

Proposed Solutions (4)

ARExplainer: The paper proposes ARExplainer, an interpretable few-shot text classification method using data and reasoning augmentation.

LLM-based sample augmentation: The method leverages the generalization ability of a large language model to expand and diversify the training samples.
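
The paper does not specify the augmentation prompts or the model used; as a rough sketch under those assumptions, label-preserving paraphrases of each support example can be requested from an LLM. `call_llm` below is a stand-in stub for any chat-completion API, and `augment_samples` is a hypothetical helper name:

```python
def call_llm(prompt: str) -> str:
    # Stub: a real implementation would query an LLM endpoint here.
    # This stand-in just echoes the text portion of the prompt.
    return "Paraphrase: " + prompt.split("Text: ", 1)[1]

def augment_samples(text: str, label: str, n: int = 3) -> list[dict]:
    """Generate n label-preserving paraphrases of one support example."""
    prompt_tmpl = (
        "Rewrite the following {label} example in different words, "
        "keeping its meaning and label unchanged.\nText: {text}"
    )
    augmented = []
    for _ in range(n):
        paraphrase = call_llm(prompt_tmpl.format(label=label, text=text))
        augmented.append({"text": paraphrase, "label": label})
    return augmented

examples = augment_samples("The movie was a delight.", "positive", n=2)
```

Each generated sample keeps the original label, so the expanded support set can be fed to the few-shot classifier unchanged.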

knowledge-graph reasoning engine: The method constructs a knowledge graph-driven reasoning engine that combines a Graph Attention Network to extract verifiable symbolic reasoning paths.
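
The paper gives no implementation details for the reasoning engine, but the idea of combining graph attention with path extraction can be sketched as follows: score each knowledge-graph edge with a single GAT-style attention head, then greedily walk the highest-attention edges to read out a symbolic reasoning path. The toy graph, node features, and attention vector `a` here are illustrative assumptions, not values from the paper:

```python
import math

def leaky_relu(x, slope=0.2):
    return x if x > 0 else slope * x

def attention_weights(graph, features, a):
    """Per-node softmax over neighbor scores, GAT-style (single head)."""
    weights = {}
    for u, nbrs in graph.items():
        scores = {
            v: leaky_relu(sum(w * f for w, f in zip(a, features[u] + features[v])))
            for v in nbrs
        }
        z = sum(math.exp(s) for s in scores.values())
        weights[u] = {v: math.exp(s) / z for v, s in scores.items()}
    return weights

def extract_path(graph, weights, start, steps=2):
    """Greedily follow the strongest attention edge at each hop."""
    path = [start]
    for _ in range(steps):
        nbrs = weights.get(path[-1])
        if not nbrs:
            break
        path.append(max(nbrs, key=nbrs.get))
    return path

graph = {"goal": ["soccer", "politics"], "soccer": ["sports"], "politics": ["news"]}
features = {"goal": [1.0, 0.0], "soccer": [0.9, 0.1],
            "politics": [0.1, 0.9], "sports": [1.0, 0.2], "news": [0.0, 1.0]}
weights = attention_weights(graph, features, a=[0.5, -0.5, 0.5, -0.5])
path = extract_path(graph, weights, "goal")  # → ["goal", "soccer", "sports"]
```

The returned path is an explicit edge sequence through the knowledge graph, which is what makes the reasoning verifiable: each hop can be checked against the graph rather than inferred from opaque weights.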

prompt-based explanation generation: The method uses a prompt-based explanation generator to produce concise and clear natural language explanations.
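
The exact prompt template is not given in the abstract; a minimal sketch, assuming the extracted reasoning path and the predicted label are slotted into a fixed template before being sent to an LLM, might look like this (`build_explanation_prompt` is a hypothetical helper name):

```python
def build_explanation_prompt(text: str, label: str, path: list[str]) -> str:
    """Assemble the input, prediction, and reasoning path into a prompt
    asking an LLM for a one-sentence natural-language explanation."""
    chain = " -> ".join(path)
    return (
        f"Input: {text}\n"
        f"Predicted label: {label}\n"
        f"Knowledge-graph evidence: {chain}\n"
        "Explain the prediction in one clear sentence, citing the evidence."
    )

prompt = build_explanation_prompt(
    "The striker scored a stunning goal.", "sports", ["goal", "soccer", "sports"]
)
```

Because the evidence chain comes from the symbolic reasoning path rather than from the LLM itself, the generated explanation stays grounded in verifiable graph edges.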

Results (2)

improved 1-shot classification performance

more human-understandable explanations

Research Domain

Few-shot text classification / interpretable NLP
