
Relationship extraction
DOI: 10.18653/v1/w19-1908 Publication Date: 2019-07-21T17:37:55Z
ABSTRACT
Classic methods for clinical temporal relation extraction focus on relational candidates within a sentence. On the other hand, the break-through Bidirectional Encoder Representations from Transformers (BERT) model is trained on large quantities of arbitrary spans of contiguous text rather than on sentences. In this study, we aim to build a sentence-agnostic framework for the task of CONTAINS temporal relation extraction. We establish a new state-of-the-art result for the task, 0.684F in-domain (a 0.055-point improvement) and 0.565F cross-domain (a 0.018-point improvement), by fine-tuning BERT, pre-training domain-specific models on instances with WordPiece-compatible encodings, and augmenting the labeled data with automatically generated "silver" instances.
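The sentence-agnostic idea in the abstract can be sketched as follows: instead of restricting a relation candidate to one sentence, take a raw text window that covers both mentions (e.g. an EVENT and a time expression) and mark each with special tokens that a WordPiece tokenizer can keep as distinct symbols. This is a minimal, hypothetical sketch; the function name, marker scheme, and window size are assumptions, not the paper's exact preprocessing.

```python
def build_instance(text, span1, span2, window=100):
    """Build a sentence-agnostic CONTAINS candidate instance.

    span1 and span2 are (start, end) character offsets of the two
    mentions. Hypothetical sketch: the paper's actual marker scheme
    and context-window policy may differ.
    """
    # order the two mentions by position in the text
    (s1, e1), (s2, e2) = sorted([span1, span2])
    # take a fixed character window around the pair, ignoring
    # sentence boundaries (the "sentence-agnostic" framing)
    left = max(0, s1 - window)
    right = min(len(text), e2 + window)
    # wrap each mention in marker tokens so the tokenizer treats
    # the arguments as distinct, recoverable symbols
    return (
        text[left:s1] + "[a1] " + text[s1:e1] + " [/a1]"
        + text[e1:s2] + "[a2] " + text[s2:e2] + " [/a2]"
        + text[e2:right]
    )
```

A classifier head on top of BERT would then be fine-tuned on such marked instances, with the marker strings registered as additional special tokens so WordPiece does not split them.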