Data-Efficient Operator Learning via Unsupervised Pretraining and In-Context Learning
DOI:
10.48550/arxiv.2402.15734
Publication Date:
2024-02-24
AUTHORS (6)
ABSTRACT
Recent years have witnessed the promise of coupling machine learning methods with physical, domain-specific insight for solving scientific problems based on partial differential equations (PDEs). However, being data-intensive, these methods still require a large amount of PDE data. This reintroduces the need for expensive numerical solutions, partially undermining the original goal of avoiding costly simulations. In this work, seeking data efficiency, we design unsupervised pretraining and in-context operator learning. To reduce the need for training data with simulated solutions, we pretrain neural operators on unlabeled PDE data using reconstruction-based proxy tasks. To improve out-of-distribution performance, we further assist neural operators in flexibly leveraging in-context learning, without incurring extra training costs or architectural designs. Extensive empirical evaluations on a diverse set of PDEs demonstrate that our method is highly data-efficient, more generalizable, and even outperforms conventional vision-pretrained models.
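A minimal sketch of what reconstruction-based unsupervised pretraining of a neural operator on unlabeled PDE snapshots could look like. The architecture (a small convolutional stand-in), the patch-masking corruption, the masking ratio, and the data shapes are illustrative assumptions, not the authors' exact method.

```python
# Hypothetical sketch: pretrain a neural-operator-like model to reconstruct
# unlabeled PDE fields from masked inputs (a reconstruction-based proxy task).
# All hyperparameters and the architecture are illustrative assumptions.
import torch
import torch.nn as nn

class TinyOperator(nn.Module):
    """A small convolutional stand-in for a neural operator on 2D PDE fields."""
    def __init__(self, channels=1, width=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, width, 3, padding=1), nn.GELU(),
            nn.Conv2d(width, width, 3, padding=1), nn.GELU(),
            nn.Conv2d(width, channels, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

def random_patch_mask(x, ratio=0.5, patch=8):
    """Zero out a random subset of non-overlapping patches (proxy-task corruption)."""
    b, _, h, w = x.shape
    keep = (torch.rand(b, 1, h // patch, w // patch, device=x.device) > ratio).float()
    keep = keep.repeat_interleave(patch, dim=2).repeat_interleave(patch, dim=3)
    return x * keep

def pretrain(model, loader, epochs=1, lr=1e-3, device="cpu"):
    """Train the operator to reconstruct full snapshots from masked ones (no labels)."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    model.to(device).train()
    for _ in range(epochs):
        for fields in loader:                 # fields: (B, 1, H, W) unlabeled snapshots
            fields = fields.to(device)
            corrupted = random_patch_mask(fields)
            loss = nn.functional.mse_loss(model(corrupted), fields)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model

if __name__ == "__main__":
    # Synthetic random fields stand in for real unlabeled PDE snapshots.
    data = torch.randn(64, 1, 32, 32)
    loader = torch.utils.data.DataLoader(data, batch_size=8)
    model = pretrain(TinyOperator(), loader)
```

In this reading, the pretrained weights would then be fine-tuned on a small labeled set of input-solution pairs, which is where the claimed data efficiency would show up; the in-context component is a separate inference-time mechanism and is not covered by this sketch.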