LIMP: Large Language Model Enhanced Intent-aware Mobility Prediction

DOI: 10.48550/arxiv.2408.12832 Publication Date: 2024-08-23
ABSTRACT
Human mobility prediction is essential for applications like urban planning and transportation management, yet it remains challenging due to the complex, often implicit, intentions behind human behavior. Existing models predominantly focus on spatiotemporal patterns, paying less attention to the underlying intentions that govern movements. Recent advancements in large language models (LLMs) offer a promising alternative research angle by integrating commonsense reasoning into mobility prediction. However, this is a non-trivial problem because LLMs are not natively built for mobility intention inference, and they also face scalability issues and integration difficulties with spatiotemporal models. To address these challenges, we propose a novel LIMP (LLMs for Intent-aware Mobility Prediction) framework. Specifically, LIMP introduces an "Analyze-Abstract-Infer" (A2I) agentic workflow to unleash the LLM's commonsense reasoning power for mobility intention inference. Besides, we design an efficient fine-tuning scheme to transfer this reasoning power from a commercial LLM to a smaller-scale, open-source language model, ensuring LIMP's scalability to millions of mobility records. Moreover, we introduce a transformer-based intention-aware mobility prediction model to effectively harness the intention inference ability of the LLM. Evaluated on two real-world datasets, LIMP significantly outperforms baseline models, demonstrating improved accuracy in next-location prediction and effective intention inference. The interpretability of intention-aware prediction highlights our framework's potential for real-world applications. Codes and data can be found at https://github.com/tsinghua-fib-lab/LIMP .
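The abstract describes the intention-aware predictor only at a high level; as an illustration of the kind of model it refers to, below is a minimal PyTorch sketch of a transformer-based next-location predictor that fuses LLM-inferred intention labels with location and time-of-day embeddings. All class names, dimensions, and the fixed intention vocabulary are assumptions made for this sketch, not the authors' implementation (see the linked repository for that).

```python
# Hedged sketch (not the authors' code): a transformer encoder over a visit
# sequence, where each visit carries a location id, a time-slot id, and an
# intention id inferred upstream by an LLM. Sizes and names are assumptions.
import torch
import torch.nn as nn


class IntentionAwareNextLocationModel(nn.Module):
    def __init__(self, num_locations, num_intentions=10, d_model=128,
                 nhead=4, num_layers=2, num_time_slots=48):
        super().__init__()
        self.loc_emb = nn.Embedding(num_locations, d_model)
        self.time_emb = nn.Embedding(num_time_slots, d_model)
        self.intent_emb = nn.Embedding(num_intentions, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, num_locations)

    def forward(self, loc_ids, time_ids, intent_ids):
        # loc_ids / time_ids / intent_ids: (batch, seq_len) integer tensors;
        # intent_ids are the per-visit intentions produced by the LLM stage.
        x = self.loc_emb(loc_ids) + self.time_emb(time_ids) + self.intent_emb(intent_ids)
        h = self.encoder(x)               # (batch, seq_len, d_model)
        return self.head(h[:, -1])        # logits over the next location

# Tiny usage example with random ids.
model = IntentionAwareNextLocationModel(num_locations=500)
loc = torch.randint(0, 500, (2, 16))
t = torch.randint(0, 48, (2, 16))
intent = torch.randint(0, 10, (2, 16))
print(model(loc, t, intent).shape)  # torch.Size([2, 500])
```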