Interleaving Learning, with Application to Neural Architecture Search
KEYWORDS
Interleaving
Incremental Learning
Transfer of learning
DOI:
10.48550/arxiv.2103.07018
Publication Date:
2021-01-01
AUTHORS (2)
ABSTRACT
Interleaving learning is a human learning technique in which a learner interleaves the study of multiple topics, which increases long-term retention and improves the ability to transfer learned knowledge. Inspired by this human learning strategy, in this paper we explore whether the interleaving methodology is beneficial for improving the performance of machine learning models as well. We propose a novel machine learning framework referred to as interleaving learning (IL). In our framework, a set of models collaboratively learn a shared data encoder in an interleaving fashion: the encoder is trained by model 1 for a while, then passed to model 2 for further training, then to model 3, and so on; after being trained by all models, the encoder returns to model 1 and is trained again, then moves to model 2, model 3, etc. This process repeats for multiple rounds. Our framework is based on multi-level optimization consisting of multiple inter-connected learning stages. An efficient gradient-based algorithm is developed to solve the multi-level optimization problem. We apply interleaving learning to search neural architectures for image classification on CIFAR-10, CIFAR-100, and ImageNet. The effectiveness of our method is strongly demonstrated by experimental results.
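The interleaving schedule described above can be sketched as a round-robin loop: one shared encoder is trained by each of K models in turn, and the whole pass repeats for several rounds. The sketch below is a minimal illustration of that schedule only; the Encoder and train_encoder_with_model names are illustrative assumptions, and the actual training steps, multi-level optimization, and gradient-based solver from the paper are not reproduced here.

```python
# Minimal sketch of the interleaving (round-robin) training schedule.
# The classes and step logic are illustrative stand-ins, not the
# paper's implementation.

class Encoder:
    """Shared data encoder; here it just records which model updated it."""
    def __init__(self):
        self.updates = []

def train_encoder_with_model(encoder, model_id, steps):
    """Stand-in for 'model k trains the shared encoder for a while'."""
    for _ in range(steps):
        encoder.updates.append(model_id)

def interleaving_learning(num_models, num_rounds, steps_per_visit):
    """Train one shared encoder by models 0..K-1 in turn, for several rounds."""
    encoder = Encoder()
    for _ in range(num_rounds):           # repeat the full pass over all models
        for k in range(num_models):       # model 1, 2, ..., K in sequence
            train_encoder_with_model(encoder, k, steps_per_visit)
    return encoder

enc = interleaving_learning(num_models=3, num_rounds=2, steps_per_visit=2)
print(enc.updates)  # → [0, 0, 1, 1, 2, 2, 0, 0, 1, 1, 2, 2]
```

The printed sequence shows the defining property of interleaving: each model's training of the encoder is interleaved with the others across rounds, rather than completed in one contiguous block.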