An Effective Domain Adaptive Post-Training Method for BERT in Response Selection
FOS: Computer and information sciences
Computer Science - Machine Learning
0209 industrial biotechnology
Computer Science - Computation and Language
02 engineering and technology
Computation and Language (cs.CL)
Machine Learning (cs.LG)
DOI:
10.21437/interspeech.2020-2153
Publication Date:
2020-10-27T09:22:11Z
AUTHORS (6)
ABSTRACT
We focus on multi-turn response selection in a retrieval-based dialog system. In this paper, we utilize the powerful pre-trained language model Bi-directional Encoder Representations from Transformers (BERT) for a multi-turn dialog system and propose a highly effective post-training method on a domain-specific corpus. Although BERT is easily adopted to various NLP tasks and outperforms previous baselines on each task, it still has limitations if a task corpus is too focused on a certain domain. Post-training on a domain-specific corpus (e.g., Ubuntu Corpus) helps the model train contextualized representations of words that do not appear in a general corpus (e.g., English Wikipedia). Experimental results show that our approach achieves new state-of-the-art performance on two response selection benchmarks (i.e., Ubuntu Corpus V1, Advising Corpus), with performance improvements of 5.9% and 6% in R10@1.
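The abstract describes continuing to train BERT on a domain-specific corpus so that it learns contextualized representations of domain vocabulary before fine-tuning for response selection. Below is a minimal sketch of such domain-adaptive post-training with masked language modeling, using the Hugging Face Transformers library; the corpus path, hyperparameters, and the MLM-only objective are illustrative assumptions, not the authors' exact setup (the paper's post-training may combine additional objectives).

```python
# Sketch: domain-adaptive post-training of BERT via masked language modeling (MLM)
# on a domain-specific corpus. File names and hyperparameters are assumptions.
from transformers import (
    BertForMaskedLM,
    BertTokenizerFast,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

MODEL_NAME = "bert-base-uncased"
DOMAIN_CORPUS = "ubuntu_dialogs.txt"  # hypothetical path: one dialog utterance per line

tokenizer = BertTokenizerFast.from_pretrained(MODEL_NAME)
model = BertForMaskedLM.from_pretrained(MODEL_NAME)

# Load the raw domain corpus and tokenize it.
dataset = load_dataset("text", data_files={"train": DOMAIN_CORPUS})["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# Randomly mask 15% of tokens; the collator builds MLM labels on the fly.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="bert-post-trained-ubuntu",
    per_device_train_batch_size=32,
    num_train_epochs=3,          # assumed; the number of post-training steps is a tuning choice
    learning_rate=3e-5,
    save_steps=10_000,
)

Trainer(model=model, args=args, train_dataset=tokenized, data_collator=collator).train()

# The post-trained checkpoint in `bert-post-trained-ubuntu` can then be fine-tuned
# for response selection, e.g. as a classifier over (dialog context, candidate response) pairs.
```

The design intuition is that words frequent in Ubuntu or Advising dialogs but rare in general text (e.g., Wikipedia) only receive useful contextual representations once BERT has seen them during this intermediate post-training stage.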