Simplify the Usage of Lexicon in Chinese NER
DOI:
10.18653/v1/2020.acl-main.528
Publication Date:
2020-07-29T14:14:43Z
AUTHORS (5)
ABSTRACT
Recently, many works have tried to augment the performance of Chinese named entity recognition (NER) using word lexicons. As a representative, Lattice-LSTM has achieved new benchmark results on several public Chinese NER datasets. However, Lattice-LSTM has a complex model architecture. This limits its application in many industrial areas where real-time NER responses are needed. In this work, we propose a simple but effective method for incorporating the word lexicon into the character representations. This method avoids designing a complicated sequence modeling architecture, and for any neural NER model, it requires only a subtle adjustment of the character representation layer to introduce the lexicon information. Experimental studies on four benchmark Chinese NER datasets show that our method achieves an inference speed up to 6.15 times faster than that of state-of-the-art methods, along with better performance. The experimental results also show that the proposed method can be easily incorporated with pre-trained models like BERT.
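The core idea described in the abstract, augmenting each character's representation with information from lexicon words that contain it, can be sketched as follows. This is an illustrative approximation, not the authors' exact implementation: for each character, matched lexicon words are grouped into B/M/E/S (Begin/Middle/End/Single) sets, each set is mean-pooled (the paper weights words by frequency, which is simplified here), and the pooled vectors are concatenated to form features to append to the character embedding. The toy lexicon, embedding dimension, and all function names are assumptions.

```python
import numpy as np

EMB_DIM = 4
rng = np.random.default_rng(0)

# Toy lexicon with random word embeddings (illustrative assumption).
lexicon = {w: rng.normal(size=EMB_DIM)
           for w in ["南京", "南京市", "长江", "长江大桥", "大桥"]}

def bmes_sets(sentence):
    """For each character position, collect the lexicon words matching the
    sentence into B(egin)/M(iddle)/E(nd)/S(ingle) sets."""
    sets = [{"B": [], "M": [], "E": [], "S": []} for _ in sentence]
    for i in range(len(sentence)):
        for j in range(i + 1, len(sentence) + 1):
            word = sentence[i:j]
            if word not in lexicon:
                continue
            if len(word) == 1:
                sets[i]["S"].append(word)
            else:
                sets[i]["B"].append(word)        # word begins at i
                sets[j - 1]["E"].append(word)    # word ends at j-1
                for k in range(i + 1, j - 1):    # interior characters
                    sets[k]["M"].append(word)
    return sets

def lexicon_features(sentence):
    """Mean-pool each B/M/E/S set per character (zeros when a set is empty)
    and concatenate, yielding a (len(sentence), 4 * EMB_DIM) feature matrix
    that would be appended to the character embeddings."""
    feats = []
    for s in bmes_sets(sentence):
        parts = []
        for key in ("B", "M", "E", "S"):
            vecs = [lexicon[w] for w in s[key]]
            parts.append(np.mean(vecs, axis=0) if vecs else np.zeros(EMB_DIM))
        feats.append(np.concatenate(parts))
    return np.stack(feats)

feats = lexicon_features("南京市长江大桥")
print(feats.shape)  # (7, 16)
```

Because the lexicon information enters only through this per-character feature vector, the downstream sequence model (LSTM, Transformer, or BERT) needs no architectural change, which is what enables the reported inference speedup over lattice-structured models.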
CITATIONS (183)