PLACE Dropout: A Progressive Layer-wise and Channel-wise Dropout for Domain Generalization

DOI: 10.1145/3624015 · Publication Date: 2023-09-13
ABSTRACT
Domain generalization (DG) aims to learn a generic model from multiple observed source domains that generalizes well to arbitrary unseen target domains without further training. The major challenge in DG is that the model inevitably faces a severe overfitting issue due to the domain gap between the source and target domains. To mitigate this problem, some dropout-based methods have been proposed to resist overfitting by discarding part of the representation of the intermediate layers. However, we observe that most of these methods only conduct the dropout operation in some specific layers, leading to an insufficient regularization effect on the model. We argue that applying dropout at multiple layers can produce stronger regularization effects, which could alleviate the overfitting problem more adequately than previous layer-specific methods. In this article, we develop a novel progressive layer-wise and channel-wise dropout for DG, which randomly selects one layer and then randomly selects its channels to conduct dropout. Particularly, the proposed method can generate a variety of data variants to better deal with the overfitting issue. We also provide a theoretical analysis of our method and prove that it can effectively reduce the generalization error bound. Besides, we leverage a progressive scheme to increase the dropout ratio with the training progress, which can gradually boost the difficulty of training the model to enhance its robustness. Extensive experiments on three standard benchmark datasets have demonstrated that our method outperforms several state-of-the-art methods. Our code is available at https://github.com/lingeringlight/PLACEdropout .
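To make the mechanism described above concrete, the following is a minimal PyTorch sketch of the two ideas the abstract names: per forward pass, one randomly chosen layer receives channel-wise dropout, and the dropout ratio grows with training progress. This is an illustrative reconstruction, not the authors' implementation (see the repository above); the class names, the linear ratio schedule, and the toy network are assumptions.

```python
import random
import torch
import torch.nn as nn

class ChannelDropout(nn.Module):
    """Zeroes whole channels of a feature map and rescales survivors,
    similar to nn.Dropout2d but with an externally supplied ratio."""
    def forward(self, x, ratio):
        if not self.training or ratio <= 0:
            return x
        # Bernoulli keep-mask over channels, broadcast across H and W.
        keep = torch.rand(x.size(0), x.size(1), 1, 1, device=x.device) >= ratio
        return x * keep / (1.0 - ratio)

class LayerwiseChannelDropoutNet(nn.Module):
    """Toy CNN (hypothetical): each forward pass applies channel-wise
    dropout after one randomly selected block (layer-wise selection)."""
    def __init__(self, num_classes=7, max_ratio=0.33):
        super().__init__()
        self.blocks = nn.ModuleList([
            nn.Sequential(nn.Conv2d(c_in, c_out, 3, padding=1),
                          nn.BatchNorm2d(c_out), nn.ReLU(inplace=True))
            for c_in, c_out in [(3, 32), (32, 64), (64, 128)]
        ])
        self.dropout = ChannelDropout()
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                  nn.Linear(128, num_classes))
        self.max_ratio = max_ratio  # assumed cap on the dropout ratio

    def forward(self, x, progress=0.0):
        # Progressive scheme (assumed linear): ratio grows with
        # training progress in [0, 1], gradually raising difficulty.
        ratio = self.max_ratio * progress
        # Layer-wise selection: pick a single block for this pass.
        target = random.randrange(len(self.blocks)) if self.training else -1
        for i, block in enumerate(self.blocks):
            x = block(x)
            if i == target:
                x = self.dropout(x, ratio)
        return self.head(x)
```

A typical training loop would pass the schedule in explicitly, e.g. `logits = model(images, progress=epoch / num_epochs)`, so that early epochs see almost no dropout and later epochs approach the maximum ratio.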