Using Large Language Models for Zero-Shot Natural Language Generation from Knowledge Graphs

Keywords: Text generation, Knowledge graph, Natural language understanding, Zero-shot learning, Natural language generation
DOI: 10.48550/arXiv.2307.07312
Publication Date: 2023-01-01
ABSTRACT
In any system that uses structured knowledge graph (KG) data as its underlying knowledge representation, KG-to-text generation is a useful tool for turning parts of the graph into text that can be understood by humans. Recent work has shown that models that make use of pretraining on large amounts of text data can perform well on the KG-to-text task even with relatively small sets of training data on the specific graph-to-text task. In this paper, we build on this concept by using large language models to perform zero-shot generation based on nothing but the model's understanding of the triple structure from what it can read. We show that ChatGPT achieves near state-of-the-art performance on some measures of the WebNLG 2020 challenge, but falls behind on others. Additionally, we compare factual, counter-factual and fictional statements, and show that there is a significant connection between what the LLM already knows about the data it is parsing and the quality of the output text.
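
As a rough illustration of the zero-shot setup the abstract describes, the sketch below verbalises a small set of WebNLG-style (subject, predicate, object) triples by prompting a chat model. The prompt wording, the example triples, and the use of the OpenAI Python client are assumptions made for illustration; the paper's exact prompts and evaluation pipeline are not reproduced here.

    # Minimal sketch of zero-shot KG-to-text generation with an LLM.
    # Assumes the OpenAI Python client (openai>=1.0) and an API key in
    # the OPENAI_API_KEY environment variable. The prompt text is a
    # hypothetical illustration, not the prompt used in the paper.
    from openai import OpenAI

    client = OpenAI()

    # A WebNLG-style set of (subject, predicate, object) triples.
    triples = [
        ("Alan_Bean", "occupation", "Test_pilot"),
        ("Alan_Bean", "mission", "Apollo_12"),
    ]

    # Render the triples as plain text and ask the model to verbalise
    # them, relying only on the model's own understanding of the
    # triple structure (no task-specific fine-tuning).
    triple_text = "\n".join(f"({s} | {p} | {o})" for s, p, o in triples)
    prompt = (
        "Convert the following knowledge graph triples into one or two "
        "fluent English sentences:\n" + triple_text
    )

    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # the paper evaluates ChatGPT
        messages=[{"role": "user", "content": prompt}],
    )
    print(response.choices[0].message.content)

Because no examples of the graph-to-text mapping are supplied, the quality of the output depends on what the model already knows about the entities in the triples, which is the effect the paper probes with its factual, counter-factual, and fictional conditions.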