Slowly Expanding Neural Network for Class Incremental Learning
DOI:
10.2139/ssrn.4659403
Publication Date:
2023-12-09T19:17:50Z
AUTHORS (5)
ABSTRACT
Deep learning models often rapidly forget the knowledge of old classes when they are continually updated to learn new classes. To alleviate such catastrophic forgetting, a state-of-the-art approach freezes the learned feature extractor to preserve old knowledge and introduces an additional network for each new task. This method handles the issue effectively, but at the price of rapid model expansion. In this paper, we propose a novel continual learning framework, called SEIL, with a much Slower Expansion rate and better Incremental Learning performance than its counterparts. Specifically, instead of introducing an entire feature extractor per task, the majority of parameters are shared during learning, and only the remaining lightweight modules are expanded for each task. An out-of-distribution (OOD) technique is also applied to two auxiliary classifiers to obtain more class-balanced predictions. The proposed method achieves state-of-the-art results on multiple benchmark datasets under standard settings. The source code will be released publicly.
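The expansion scheme described in the abstract can be illustrated with a minimal PyTorch sketch. This is not the authors' SEIL implementation: the class and method names (SlowlyExpandingNet, add_task), the bottleneck adapter design, and all dimensions are illustrative assumptions, and the OOD-based auxiliary classifiers are omitted. The sketch only shows the core idea of sharing one trunk across tasks while adding a small module per incremental task.

# Minimal sketch (assumed design, not the authors' code): most parameters
# live in a shared trunk; each new task adds only a lightweight adapter and
# a classifier head, so the model expands far more slowly than duplicating
# the whole feature extractor per task.
import torch
import torch.nn as nn

class SlowlyExpandingNet(nn.Module):
    """Shared trunk + one small adapter and classifier head per task."""

    def __init__(self, feat_dim: int = 256, adapter_dim: int = 32):
        super().__init__()
        # Shared feature extractor: the vast majority of parameters are
        # here and are reused for every incremental task.
        self.trunk = nn.Sequential(
            nn.Flatten(),
            nn.Linear(28 * 28, feat_dim),
            nn.ReLU(),
        )
        self.feat_dim = feat_dim
        self.adapter_dim = adapter_dim
        self.adapters = nn.ModuleList()  # lightweight per-task modules
        self.heads = nn.ModuleList()     # per-task classifiers

    def add_task(self, num_new_classes: int) -> None:
        # Expansion step: only a small bottleneck adapter and a linear
        # head are added, instead of an entire new feature extractor.
        self.adapters.append(nn.Sequential(
            nn.Linear(self.feat_dim, self.adapter_dim),
            nn.ReLU(),
            nn.Linear(self.adapter_dim, self.feat_dim),
        ))
        self.heads.append(nn.Linear(self.feat_dim, num_new_classes))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        shared = self.trunk(x)
        # Each task's adapter refines the shared feature; concatenating
        # the per-task logits covers all classes seen so far.
        logits = [head(shared + adapter(shared))
                  for adapter, head in zip(self.adapters, self.heads)]
        return torch.cat(logits, dim=1)

# Usage: two incremental tasks of 10 classes each.
model = SlowlyExpandingNet()
model.add_task(num_new_classes=10)  # task 1
model.add_task(num_new_classes=10)  # task 2: only a tiny module is added
out = model(torch.randn(4, 1, 28, 28))
print(out.shape)  # torch.Size([4, 20])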