HYPO: Hyperspherical Out-of-Distribution Generalization
FOS: Computer and information sciences
Computer Science - Machine Learning
Machine Learning (cs.LG)
DOI:
10.48550/arxiv.2402.07785
Publication Date:
2024-02-12
AUTHORS (4)
ABSTRACT
Out-of-distribution (OOD) generalization is critical for machine learning models deployed in the real world. However, achieving this can be fundamentally challenging, as it requires the ability to learn invariant features across different domains or environments. In this paper, we propose a novel framework HYPO (HYPerspherical OOD generalization) that provably learns domain-invariant representations in a hyperspherical space. In particular, our algorithm is guided by intra-class variation and inter-class separation principles -- ensuring that features from the same class (across training domains) are closely aligned with their class prototypes, while different class prototypes are maximally separated. We further provide theoretical justifications on how the prototypical objective improves the OOD generalization bound. Through extensive experiments on challenging OOD benchmarks, we demonstrate that our approach outperforms competitive baselines and achieves superior performance. Code is available at https://github.com/deeplearning-wisc/hypo.
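The two principles in the abstract (low intra-class variation, high inter-class separation on the hypersphere) can be illustrated with a minimal numpy sketch. This is an assumed, simplified rendering based only on the abstract's description; the names (`hypo_style_loss`, `tau`) and exact loss form are illustrative and may differ from the paper's actual objective.

```python
import numpy as np

def normalize(x, axis=-1, eps=1e-12):
    # Project vectors onto the unit hypersphere.
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

def hypo_style_loss(features, labels, prototypes, tau=0.1):
    """Illustrative prototypical hyperspherical objective (not the paper's exact loss).

    variation:  pulls each normalized feature toward its class prototype,
                encouraging low intra-class variation across domains.
    separation: pushes distinct class prototypes apart on the hypersphere,
                encouraging high inter-class separation.
    """
    z = normalize(features)      # (n, d) unit-norm embeddings
    mu = normalize(prototypes)   # (c, d) unit-norm class prototypes

    # Intra-class variation: maximize cosine similarity to own prototype.
    sim_to_own = np.sum(z * mu[labels], axis=1)   # (n,)
    variation = -np.mean(sim_to_own / tau)

    # Inter-class separation: penalize similarity between distinct prototypes.
    proto_sim = mu @ mu.T                         # (c, c)
    c = mu.shape[0]
    off_diag = proto_sim[~np.eye(c, dtype=bool)]
    separation = np.log(np.mean(np.exp(off_diag / tau)))

    return variation + separation
```

Features tightly clustered around their prototypes yield a lower loss than randomly scattered ones, mirroring the alignment/separation trade-off the paper's analysis formalizes.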