Context-aware Natural Language Generation with Recurrent Neural Networks

Natural Language Generation
DOI: 10.48550/arxiv.1611.09900 · Publication Date: November 2016
ABSTRACT
This paper studies generating natural language in particular contexts or situations. We propose two novel approaches that encode the contexts into a continuous semantic representation and then decode that representation into text sequences with recurrent neural networks. During decoding, the context information is attended to through a gating mechanism, which addresses the problem of long-range dependency caused by lengthy sequences. We evaluate the effectiveness of the proposed approaches on user review data, in which rich informative contexts are available; two of these contexts, sentiments and products, are selected for evaluation. Experiments show that the fake reviews generated by our approaches are very natural: in a deception-detection study, human judges misclassified more than 50% of them as real reviews, and an existing state-of-the-art detection algorithm misclassified more than 90%.
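To make the architecture in the abstract concrete, here is a minimal sketch of the general idea: a discrete context (e.g., a sentiment or product id) is embedded into a continuous vector, a recurrent network decodes the token sequence, and a learned gate mixes the context vector into the recurrent state at every step so the context remains visible late in long sequences. This is an illustrative assumption of the design, not the authors' released code; all module names, dimensions, and the exact gating form are hypothetical.

```python
import torch
import torch.nn as nn

class ContextGatedDecoder(nn.Module):
    def __init__(self, vocab_size, n_contexts, emb_dim=128, hid_dim=256):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, emb_dim)
        # Encode a discrete context id into a continuous semantic vector.
        self.ctx_emb = nn.Embedding(n_contexts, hid_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Gate decides, per step and per dimension, how much context to mix in.
        self.gate = nn.Linear(2 * hid_dim, hid_dim)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, tokens, context_ids):
        # tokens: (batch, seq_len) token ids; context_ids: (batch,)
        ctx = self.ctx_emb(context_ids)              # (batch, hid_dim)
        h, _ = self.rnn(self.tok_emb(tokens))        # (batch, seq_len, hid_dim)
        ctx_seq = ctx.unsqueeze(1).expand_as(h)      # broadcast over time steps
        g = torch.sigmoid(self.gate(torch.cat([h, ctx_seq], dim=-1)))
        # Gated mixture of recurrent state and context counteracts the
        # long-range dependency problem the abstract mentions: the context
        # signal is re-injected at every decoding step.
        mixed = g * h + (1 - g) * ctx_seq
        return self.out(mixed)                       # next-token logits

if __name__ == "__main__":
    model = ContextGatedDecoder(vocab_size=1000, n_contexts=5)
    logits = model(torch.randint(0, 1000, (2, 12)), torch.tensor([0, 3]))
    print(logits.shape)  # torch.Size([2, 12, 1000])
```

At generation time one would sample or greedily pick the next token from these logits and feed it back in; conditioning on a different context id (a different sentiment or product) steers the generated review accordingly.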