Dialogue State Distillation Network with Inter-slot Contrastive Learning for Dialogue State Tracking
DOI: 10.1609/aaai.v37i11.26620
Publication Date: 2023-06-27
AUTHORS (8)
ABSTRACT
In task-oriented dialogue systems, Dialogue State Tracking (DST) aims to extract users' intentions from the dialogue history. Currently, most existing approaches suffer from error propagation and are unable to dynamically select relevant information when utilizing previous dialogue states. Moreover, the relations between the updates of different slots provide vital clues for DST. However, the existing approaches rely only on predefined graphs to indirectly capture the relations. In this paper, we propose a Dialogue State Distillation Network (DSDN) to utilize relevant information of previous dialogue states and migrate the gap of utilization between training and testing. Thus, it can dynamically exploit previous dialogue states and avoid introducing error propagation simultaneously. Further, an inter-slot contrastive learning loss is proposed to effectively capture the slot co-update relations from the dialogue context. Experiments are conducted on the widely used MultiWOZ 2.0 and 2.1 datasets. The experimental results show that our proposed model achieves state-of-the-art performance for DST.
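
The inter-slot contrastive objective described above can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch rendering, not the authors' released code: it treats pairs of slots whose values are updated in the same turn as positives in an InfoNCE-style loss. The tensor shapes, the temperature value, and the co_update mask are assumptions made here for illustration only.

    # Minimal sketch (assumed, not the paper's implementation) of an
    # inter-slot contrastive loss: slots updated together in a turn are
    # positives for each other; all remaining slot pairs are negatives.
    import torch
    import torch.nn.functional as F

    def inter_slot_contrastive_loss(slot_repr: torch.Tensor,
                                    co_update: torch.Tensor,
                                    temperature: float = 0.1) -> torch.Tensor:
        # slot_repr: (num_slots, hidden) slot representations for one turn.
        # co_update: (num_slots, num_slots) boolean mask; co_update[i, j] is
        # True when slots i and j are both updated in the current turn.
        z = F.normalize(slot_repr, dim=-1)               # unit-norm embeddings
        sim = z @ z.t() / temperature                    # pairwise similarities
        n = sim.size(0)
        self_mask = torch.eye(n, dtype=torch.bool, device=sim.device)
        sim = sim.masked_fill(self_mask, float('-inf'))  # exclude self-pairs
        # Log-softmax over each anchor slot's similarities to all other slots.
        log_prob = sim - torch.logsumexp(sim, dim=-1, keepdim=True)
        pos = co_update & ~self_mask
        pos_count = pos.sum(dim=-1)
        has_pos = pos_count > 0                          # anchors with a partner
        if not has_pos.any():
            return sim.new_zeros(())                     # no co-updates this turn
        # Average negative log-probability of positive pairs per anchor.
        loss = -(log_prob.masked_fill(~pos, 0.0).sum(dim=-1)[has_pos]
                 / pos_count[has_pos]).mean()
        return loss

Under this reading, each anchor slot pulls the representations of its co-updated slots closer and pushes the rest apart, which is one plausible way to capture slot co-update relations from the dialogue context as the abstract describes.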