Knowledge-augmented Self-training of A Question Rewriter for Conversational Knowledge Base Question Answering
Topics: Coreference, Ellipsis (linguistics)
DOI: 10.18653/v1/2022.findings-emnlp.133
Publication Date: 2023-08-04T20:21:02Z
AUTHORS (8)
ABSTRACT
The recent rise of conversational applications such as online customer service systems and intelligent personal assistants has promoted the development of conversational knowledge base question answering (ConvKBQA). Different from traditional single-turn KBQA, ConvKBQA usually explores multi-turn questions around a topic, where ellipsis and coreference pose great challenges to KBQA systems, which require self-contained questions. In this paper, we propose a rewrite-and-reason framework that first produces a full-fledged rewritten question based on the conversation history and then reasons out the answer with existing single-turn KBQA models. To overcome the absence of rewriting supervision signals, we introduce a knowledge-augmented self-training mechanism that transfers the question rewriter from another dataset and adapts it to the current knowledge base. Our rewriter is decoupled from the subsequent QA process, which makes it easy to unite with either retrieval-based or semantic parsing-based KBQA models. Experiment results demonstrate the effectiveness of our method, and a new state-of-the-art result is achieved. The code is available now.
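The decoupled rewrite-and-reason pipeline can be illustrated with a toy sketch. Note the assumptions: the paper trains a neural question rewriter via knowledge-augmented self-training, whereas `rewrite_question` below is only a hypothetical regex stand-in, and `answer` is a dictionary lookup standing in for any single-turn KBQA backend.

```python
import re

def rewrite_question(question: str, topic_entity: str) -> str:
    """Naively resolve coreference: swap pronouns for the conversation's
    topic entity to make the follow-up question self-contained.
    (Hypothetical stand-in for the paper's trained rewriter.)"""
    return re.sub(r"\b(he|she|it|they)\b", topic_entity,
                  question, flags=re.IGNORECASE)

def answer(question: str, kb: dict) -> str:
    """Stand-in for any single-turn KBQA model, retrieval-based or
    semantic parsing-based; the rewriter never needs to know which."""
    return kb.get(question, "unknown")

# Toy conversation: earlier turns established the topic entity.
kb = {"Where was Albert Einstein born?": "Ulm"}
topic = "Albert Einstein"
follow_up = "Where was he born?"  # elliptical, context-dependent question

self_contained = rewrite_question(follow_up, topic)
print(self_contained)             # Where was Albert Einstein born?
print(answer(self_contained, kb)) # Ulm
```

Because the rewriter only maps conversational questions to self-contained ones, the QA component behind it can be swapped freely, which is the decoupling the abstract emphasizes.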