Ontology-Guided, Hybrid Prompt Learning for Generalization in Knowledge Graph Question Answering
Knowledge graph
DOI:
10.48550/arxiv.2502.03992
Publication Date:
2025-02-06
AUTHORS (4)
ABSTRACT
Most existing Knowledge Graph Question Answering (KGQA) approaches are designed for a specific KG, such as Wikidata, DBpedia or Freebase. Due to the heterogeneity of underlying graph schemas, topologies and assertions, most KGQA systems cannot be transferred to unseen Knowledge Graphs (KGs) without resource-intensive training data. We present OntoSCPrompt, a novel Large Language Model (LLM)-based approach with a two-stage architecture that separates semantic parsing from KG-dependent interactions. OntoSCPrompt first generates a SPARQL query structure (including SPARQL keywords such as SELECT, ASK and WHERE, with placeholders for missing tokens) and then fills the placeholders with KG-specific information. To enhance understanding of the underlying KG, we present an ontology-guided, hybrid prompt learning strategy that integrates the KG ontology into the learning process of hybrid prompts (e.g., discrete and continuous vectors). We also present several task-specific decoding strategies to ensure the correctness and executability of the generated SPARQL queries in both stages. Experimental results demonstrate that OntoSCPrompt performs as well as SOTA approaches without retraining on a number of KGQA datasets such as CWQ, WebQSP and LC-QuAD 1.0 in a resource-efficient manner, and that it can generalize to domain-specific KGs like DBLP-QuAD and CoyPu. Code: \href{https://github.com/LongquanJiang/OntoSCPrompt}{https://github.com/LongquanJiang/OntoSCPrompt}
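The two-stage idea described above can be sketched as follows. This is a minimal, hypothetical illustration (the function and placeholder names are assumptions, not the paper's actual implementation): stage one produces a KG-agnostic SPARQL structure with placeholder tokens, and stage two grounds those placeholders in the entities and relations of a specific KG.

```python
def fill_structure(structure: str, grounding: dict) -> str:
    """Stage 2 sketch: replace placeholder tokens (e.g. [ENT0], [REL0])
    in a KG-agnostic SPARQL structure with KG-specific IRIs."""
    query = structure
    for token, iri in grounding.items():
        query = query.replace(token, iri)
    return query

# Stage 1 output (illustrative): KG-independent query structure.
structure = "SELECT ?x WHERE { [ENT0] [REL0] ?x . }"

# Stage 2 input (illustrative): grounding against DBpedia.
grounding = {
    "[ENT0]": "<http://dbpedia.org/resource/Berlin>",
    "[REL0]": "<http://dbpedia.org/ontology/country>",
}

print(fill_structure(structure, grounding))
```

In the actual system the structure and the grounding are both produced by an LLM guided by ontology-aware prompts; the sketch only shows how separating the two stages keeps the query skeleton independent of any particular KG.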