BERTabaporu: Assessing a Genre-specific Language Model for Portuguese NLP
DOI:
10.26615/978-954-452-092-2_024
Publication Date:
2023-11-14T15:33:46Z
AUTHORS (10)
ABSTRACT
Transformer-based language models such as Bidirectional Encoder Representations from Transformers (BERT) are now mainstream in the NLP field, but extensions to languages other than English, to new domains, and/or to more specific text genres are still in demand. In this paper we introduce BERTabaporu, a BERT model pre-trained on Twitter data in the Brazilian Portuguese language. The model is shown to outperform the best-known general-purpose Portuguese language models in three Twitter-related tasks, making it a potentially useful resource for Portuguese NLP in general.
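The abstract's contribution rests on BERT's masked-language-model pre-training objective. As a minimal sketch of that mechanism, the snippet below runs a tiny, randomly-initialized BERT through a masked-LM forward pass with the Hugging Face transformers API; the small configuration is an assumption chosen for illustration and is not the BERTabaporu checkpoint itself.

```python
import torch
from transformers import BertConfig, BertForMaskedLM

# Tiny, randomly-initialized BERT (assumed toy sizes, not the real checkpoint)
config = BertConfig(
    vocab_size=1000,        # toy vocabulary
    hidden_size=64,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=128,
)
model = BertForMaskedLM(config)

# A fake batch of token ids standing in for tokenized tweets
input_ids = torch.randint(0, 1000, (1, 16))

with torch.no_grad():
    out = model(input_ids)

# One logit per vocabulary entry for every token position
print(tuple(out.logits.shape))  # (1, 16, 1000)
```

In actual use one would load the released pre-trained weights (e.g. via `AutoModelForMaskedLM.from_pretrained(...)` with the authors' published model identifier) rather than a random initialization.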