TSSuBERT: Tweet Stream Summarization Using BERT

Keywords: Salience, Popularity, Deep Neural Networks
DOI: 10.48550/arxiv.2106.08770 Publication Date: 2021-01-01
ABSTRACT
The development of deep neural networks and the emergence of pre-trained language models such as BERT have increased performance on many NLP tasks. However, these models have not reached the same popularity for tweet summarization, which can probably be explained by the lack of existing collections for training and evaluation. Our contribution in this paper is twofold: (1) we introduce a large dataset for Twitter event summarization, and (2) we propose a model to automatically summarize huge tweet streams. This extractive model combines, in an original way, pre-trained language model and vocabulary frequency-based representations to predict tweet salience. An additional advantage is that it adapts the size of the output summary according to the input stream. We conducted experiments using two different collections, and promising results are observed in comparison with state-of-the-art baselines.
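The extractive, adaptive-length idea described in the abstract can be sketched as follows. This is a minimal illustration using only frequency-based salience with a stream-relative cutoff; the function names, the scoring formula, and the mean-salience threshold are all illustrative assumptions, not the authors' actual BERT-based model.

```python
# Sketch: frequency-based tweet salience with an adaptive summary size.
# All names and formulas here are illustrative assumptions; the paper's
# model additionally uses pre-trained language model representations.
from collections import Counter
from statistics import mean
import math

def tf_salience(tweet, doc_freq, n_tweets):
    """Illustrative salience: mean TF-IDF-style weight of the tweet's tokens."""
    tokens = tweet.lower().split()
    if not tokens:
        return 0.0
    return sum(math.log(1 + n_tweets / (1 + doc_freq[t])) for t in tokens) / len(tokens)

def summarize_stream(tweets):
    """Extract tweets whose salience is at least the stream's mean salience.

    Because the cutoff is relative to the stream itself, the summary length
    adapts to the input stream instead of being fixed in advance.
    """
    doc_freq = Counter()
    for tw in tweets:
        doc_freq.update(set(tw.lower().split()))
    scores = {tw: tf_salience(tw, doc_freq, len(tweets)) for tw in tweets}
    cutoff = mean(scores.values())
    return [tw for tw in scores if scores[tw] >= cutoff]
```

In this sketch, rare-vocabulary tweets score above common chatter, so informative tweets are kept while filler such as "lol" falls below the stream-relative cutoff.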