Investigation of improving the pre-training and fine-tuning of BERT model for biomedical relation extraction
KEYWORDS
Relation extraction; Biomedical text mining; Benchmarking; Domain adaptation; Training set
DOI: 10.1186/s12859-022-04642-w
Publication Date: 2022-04-04T08:05:54Z
AUTHORS (2)
ABSTRACT
Recently, automatically extracting biomedical relations has been a significant subject in research due to the rapid growth of biomedical literature. Since their adaptation to the biomedical domain, transformer-based BERT models have produced leading results on many biomedical natural language processing tasks. In this work, we explore approaches to improve the BERT model for relation extraction tasks in both the pre-training and fine-tuning stages of its application. In the pre-training stage, we add another level of adaptation on sub-domain data to bridge the gap between domain knowledge and task-specific knowledge. We also propose methods to incorporate the normally ignored knowledge in the last layer of BERT during fine-tuning.
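The abstract's fine-tuning idea can be illustrated schematically: standard BERT fine-tuning typically feeds only the [CLS] vector of the last layer to the classifier, discarding the other token representations of that layer. A minimal, dependency-free sketch of one way to incorporate them (the mean-pooling and concatenation here are illustrative assumptions, not the authors' exact method):

```python
def combined_features(last_hidden_state):
    """Combine the [CLS] vector with a mean-pool over all token
    vectors of BERT's final layer, instead of using [CLS] alone.

    last_hidden_state: list of token vectors (lists of floats);
    index 0 is the [CLS] token, following BERT's convention.
    NOTE: illustrative sketch only, not the paper's implementation.
    """
    cls_vec = last_hidden_state[0]
    dim = len(cls_vec)
    n = len(last_hidden_state)
    # Mean-pool every token vector of the last layer, per dimension.
    mean_vec = [sum(tok[d] for tok in last_hidden_state) / n
                for d in range(dim)]
    # Concatenate: a classifier head would then see both views.
    return cls_vec + mean_vec

# Toy example: 3 tokens, hidden size 2.
hidden = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
print(combined_features(hidden))  # -> [1.0, 2.0, 3.0, 4.0]
```

The concatenated vector doubles the classifier's input width, so the relation-classification head would be sized accordingly.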