Generative Pretrained Hierarchical Transformer for Time Series Forecasting
Generative model
DOI: 10.1145/3637528.3671855
Publication Date: 2024-08-25
ABSTRACT
Recent efforts have been dedicated to enhancing time series forecasting accuracy by introducing advanced network architectures and self-supervised pretraining strategies. Nevertheless, existing approaches still exhibit two critical drawbacks. Firstly, these methods often rely on a single dataset for training, limiting the model's generalizability due to the restricted scale of the training data. Secondly, the one-step generation schema is widely followed, which necessitates a customized forecasting head, overlooks temporal dependencies in the output series, and leads to increased training costs under different horizon length settings. To address these issues, we propose a novel generative pretrained hierarchical transformer architecture for forecasting, named GPHT. There are two key designs in GPHT. On one hand, we advocate constructing a mixed dataset under the channel-independent assumption for pretraining our model, comprising various datasets from diverse data scenarios. This approach significantly expands the scale of the training data, allowing our model to uncover commonalities in time series data and facilitating improved transfer to specific datasets. On the other hand, GPHT employs an auto-regressive forecasting approach, effectively modeling temporal dependencies in the output series. Importantly, no customized forecasting head is required, enabling a single model to forecast at arbitrary horizon length settings. We conduct sufficient experiments on eight datasets, comparing against mainstream self-supervised pretraining models and supervised models. The results demonstrate that GPHT surpasses the baseline models across various fine-tuning and zero/few-shot learning settings in the traditional long-term forecasting task. We make our codes publicly available at https://github.com/icantnamemyself/GPHT.
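The mixed-dataset construction is easiest to see in code. Below is a minimal sketch (not the authors' implementation; the function name and the random arrays standing in for real datasets are purely illustrative) of pooling heterogeneous multivariate datasets into one univariate pretraining corpus under the channel-independent assumption: every channel of every dataset is treated as a separate univariate series.

```python
# Sketch only: pool channels from datasets of different shapes/scales
# into a single univariate pretraining corpus.
import numpy as np

def to_univariate_windows(series_2d, window):
    """Split a (time, channels) array into per-channel, non-overlapping windows."""
    windows = []
    for ch in range(series_2d.shape[1]):
        channel = series_2d[:, ch]
        # per-channel standardization so series from different datasets are comparable
        channel = (channel - channel.mean()) / (channel.std() + 1e-8)
        for start in range(0, len(channel) - window + 1, window):
            windows.append(channel[start:start + window])
    return windows

# hypothetical stand-ins for real datasets (e.g., 7-channel and 21-channel sources)
datasets = [np.random.randn(1000, 7), np.random.randn(5000, 21)]
mixed_corpus = [w for d in datasets for w in to_univariate_windows(d, window=96)]
print(len(mixed_corpus))  # one pooled corpus for pretraining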
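The auto-regressive design can likewise be sketched. The toy model below (assumed names and hyperparameters; not GPHT itself, and positional encoding is omitted for brevity) predicts one patch of the series at a time and feeds each prediction back as input, so a single shared next-patch head serves any forecast horizon.

```python
# Sketch only: auto-regressive patch-by-patch forecasting with one shared head,
# so the same trained weights handle arbitrary horizon lengths.
import torch
import torch.nn as nn

class TinyARForecaster(nn.Module):
    def __init__(self, patch_len=16, d_model=64, n_layers=2):
        super().__init__()
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, patch_len)  # shared next-patch head

    def forward(self, patches):            # patches: (batch, n_patches, patch_len)
        h = self.encoder(self.embed(patches))
        return self.head(h[:, -1])         # predict the next patch: (batch, patch_len)

    @torch.no_grad()
    def forecast(self, patches, horizon):
        steps = -(-horizon // self.patch_len)   # ceil(horizon / patch_len)
        out = []
        for _ in range(steps):
            nxt = self(patches)                 # one decoding step
            out.append(nxt)
            # feed the prediction back in as the newest context patch
            patches = torch.cat([patches, nxt.unsqueeze(1)], dim=1)
        return torch.cat(out, dim=1)[:, :horizon]  # trim to the exact horizon

model = TinyARForecaster()
context = torch.randn(8, 6, 16)   # 8 series, 6 context patches of length 16
print(model.forecast(context, horizon=100).shape)  # torch.Size([8, 100])
```

Because the horizon only determines how many decoding steps run, the same weights serve a 96-step and a 720-step forecast alike, which is what removes the need for horizon-specific heads and their retraining cost.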