The Utility of Feature Reuse: Transfer Learning in Data-Starved Regimes

Keywords: transfer learning, feature reuse, training data, feature learning
DOI: 10.48550/arxiv.2003.04117
Publication Date: 2020-01-01
ABSTRACT
The use of transfer learning with deep neural networks has become increasingly widespread for deploying well-tested computer vision systems to newer domains, especially those with limited datasets. We describe a use case for a data-starved regime, having fewer than 100 labeled target samples. We evaluate the effectiveness of convolutional feature extraction and fine-tuning of overparameterized models with respect to the size of the training data, as well as their generalization performance on data with covariate shift, or out-of-distribution (OOD) data. Our experiments demonstrate that both overparameterization and feature reuse contribute to the successful application of transfer learning in training image classifiers in data-starved regimes. We provide visual explanations to support our findings and conclude that transfer learning enhances the performance of CNN architectures in data-starved regimes.
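The abstract contrasts convolutional feature extraction (reusing a frozen pretrained backbone and training only a small head on the scarce target labels) with fine-tuning. As a rough, self-contained illustration of the feature-extraction idea — not the authors' actual method — the toy NumPy sketch below stands in a fixed random projection for the pretrained CNN backbone and trains only a linear head on fewer than 100 synthetic target samples; all names, sizes, and the synthetic data are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "target" dataset: fewer than 100 labeled samples, mirroring the
# data-starved regime described in the abstract (synthetic stand-in).
n, d = 80, 32
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (X @ w_true > 0).astype(float)

# "Pretrained backbone": a frozen feature map. In a real transfer-learning
# setup this would be a CNN trained on a large source domain; a fixed
# random projection plus ReLU stands in for it here.
W_frozen = rng.normal(size=(d, 64)) / np.sqrt(d)

def features(x):
    # Frozen: W_frozen is never updated during target training.
    return np.maximum(x @ W_frozen, 0.0)

# Feature extraction: train only a linear head on the frozen features.
Phi = features(X)
w_head = np.zeros(Phi.shape[1])
lr = 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(Phi @ w_head)))  # sigmoid predictions
    grad = Phi.T @ (p - y) / n                 # logistic-loss gradient
    w_head -= lr * grad                        # only the head moves

acc = np.mean(((Phi @ w_head) > 0) == (y > 0.5))
print(f"train accuracy with frozen features: {acc:.2f}")
```

Fine-tuning would differ only in also updating the backbone parameters (here `W_frozen`) during target training, typically with a smaller learning rate.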