Provable Sample-Efficient Transfer Learning Conditional Diffusion Models via Representation Learning
DOI: 10.48550/arXiv.2502.04491
Publication Date: 2025-02-06
AUTHORS (4)
ABSTRACT
While conditional diffusion models have achieved remarkable success in various applications, they require abundant data to train from scratch, which is often infeasible in practice. To address this issue, transfer learning has emerged as an essential paradigm in small-data regimes. Despite its empirical success, the theoretical underpinnings of transfer learning remain unexplored. In this paper, we take a first step towards understanding its sample efficiency through the lens of representation learning. Inspired by practical training procedures, we assume that there exists a low-dimensional representation of conditions shared across all tasks. Our analysis shows that, with a representation well learned from source tasks, the sample complexity of target tasks can be reduced substantially. In addition, we investigate the implications of our results for several real-world applications of conditional diffusion models. Numerical experiments are also conducted to verify our results.
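To make the shared-representation assumption concrete, below is a minimal PyTorch sketch of the two-stage transfer-learning recipe the abstract describes: a condition encoder is learned jointly with a denoiser on abundant source-task data, then frozen while only a new denoiser is fit on the small target task. Everything here is an illustrative assumption rather than the authors' implementation: the module names (`SharedConditionEncoder`, `ScoreNet`), the noise schedule, the data loaders (`source_loader`, `target_loader`), and all hyperparameters are hypothetical.

```python
# Hypothetical sketch of transfer learning for conditional diffusion models
# via a shared low-dimensional condition representation. NOT the paper's
# implementation; all names and hyperparameters are illustrative.
import torch
import torch.nn as nn

class SharedConditionEncoder(nn.Module):
    """Maps a high-dimensional condition y to a low-dimensional code h(y),
    assumed to be shared across all source and target tasks."""
    def __init__(self, cond_dim: int, rep_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(cond_dim, 128), nn.SiLU(), nn.Linear(128, rep_dim)
        )

    def forward(self, y):
        return self.net(y)

class ScoreNet(nn.Module):
    """Task-specific denoiser conditioned on (x_t, t, h(y))."""
    def __init__(self, x_dim: int, rep_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + rep_dim + 1, 256), nn.SiLU(), nn.Linear(256, x_dim)
        )

    def forward(self, x_t, t, h):
        return self.net(torch.cat([x_t, h, t[:, None]], dim=-1))

def denoising_loss(score_net, encoder, x0, y):
    """Standard noise-prediction loss with the encoded condition."""
    t = torch.rand(x0.shape[0])                     # uniform diffusion time
    noise = torch.randn_like(x0)
    alpha = torch.cos(0.5 * torch.pi * t)[:, None]  # illustrative schedule
    sigma = torch.sin(0.5 * torch.pi * t)[:, None]
    x_t = alpha * x0 + sigma * noise                # noised sample
    pred = score_net(x_t, t, encoder(y))
    return ((pred - noise) ** 2).mean()

# Stage 1: learn the shared encoder on abundant source-task data.
encoder = SharedConditionEncoder(cond_dim=32, rep_dim=4)
source_score = ScoreNet(x_dim=16, rep_dim=4)
opt = torch.optim.Adam([*encoder.parameters(), *source_score.parameters()], lr=1e-3)
for x0, y in source_loader:                         # assumed DataLoader
    opt.zero_grad()
    denoising_loss(source_score, encoder, x0, y).backward()
    opt.step()

# Stage 2: freeze the encoder; fit only a new denoiser on the (small)
# target task, which is where the sample-complexity saving arises.
for p in encoder.parameters():
    p.requires_grad_(False)
target_score = ScoreNet(x_dim=16, rep_dim=4)
opt_t = torch.optim.Adam(target_score.parameters(), lr=1e-3)
for x0, y in target_loader:                         # assumed DataLoader
    opt_t.zero_grad()
    denoising_loss(target_score, encoder, x0, y).backward()
    opt_t.step()
```

Under this reading, the saving on the target task comes from reusing, rather than relearning, the low-dimensional condition representation: only the comparatively small denoiser must be fit from the target data.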