GraphText: Graph Reasoning in Text Space

Keywords: Natural language understanding, Graph traversal
DOI: 10.48550/arxiv.2310.01089
Publication Date: 2023-01-01
ABSTRACT
Large Language Models (LLMs) have gained the ability to assimilate human knowledge and facilitate natural language interactions with both humans and other LLMs. However, despite their impressive achievements, LLMs have not made significant advancements in the realm of graph machine learning. This limitation arises because graphs encapsulate distinct relational data, making it challenging to transform them into natural language that LLMs understand. In this paper, we bridge this gap with a novel framework, GraphText, that translates graphs into natural language. GraphText derives a graph-syntax tree for each graph that encapsulates both the node attributes and inter-node relationships. Traversal of the tree yields a text sequence, which is then processed by an LLM to treat graph tasks as text generation tasks. Notably, GraphText offers multiple advantages. It introduces training-free graph reasoning: even without training on graph data, GraphText with ChatGPT can achieve performance on par with, or surpassing, that of supervised-trained graph neural networks through in-context learning (ICL). Furthermore, GraphText paves the way for interactive graph reasoning, allowing both humans and LLMs to communicate with the model seamlessly using natural language. These capabilities underscore the vast, yet-to-be-explored potential of LLMs in the domain of graph machine learning.
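The core idea above, deriving a tree over a node's attributes and relations and traversing it to produce a text sequence, can be sketched as follows. This is an illustrative assumption of how such a traversal might look, not the paper's actual implementation; the `SyntaxNode` structure, labels, and indented output format are all hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SyntaxNode:
    """One node of a hypothetical graph-syntax tree."""
    label: str                       # e.g. "node A", "attribute", "neighbors"
    text: str = ""                   # leaf content, if any
    children: List["SyntaxNode"] = field(default_factory=list)

def traverse(node: SyntaxNode, depth: int = 0) -> str:
    """Depth-first traversal that flattens the tree into indented text."""
    line = "  " * depth + node.label + (f": {node.text}" if node.text else "")
    return "\n".join([line] + [traverse(c, depth + 1) for c in node.children])

# Tiny citation-graph example: describe target node A via its attribute
# and its one-hop neighbors, yielding a text sequence an LLM could consume.
tree = SyntaxNode("node A", children=[
    SyntaxNode("attribute", "topic=neural networks"),
    SyntaxNode("neighbors", children=[
        SyntaxNode("node B", "topic=optimization"),
        SyntaxNode("node C", "topic=neural networks"),
    ]),
])

prompt = traverse(tree)
print(prompt)
```

The resulting indented text encodes both node attributes and inter-node relations in a purely textual form, which is what lets a graph task be posed to an LLM as ordinary text generation.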