Decision Boundary-aware Knowledge Consolidation Generates Better Instance-Incremental Learner

DOI: 10.48550/arXiv.2406.03065 Publication Date: 2024-06-05
ABSTRACT
Instance-incremental learning (IIL) focuses on learning continually with data of the same classes. Compared to class-incremental learning (CIL), IIL is seldom explored because it suffers less from catastrophic forgetting (CF). However, besides retaining knowledge, in real-world deployment scenarios where the class space is always predefined, continual and cost-effective model promotion with the potential unavailability of previous data is a more essential demand. Therefore, we first define a new and more practical IIL setting as promoting the model's performance besides resisting CF with only new observations. Two issues have to be tackled in this setting: 1) the notorious catastrophic forgetting because of no access to old data, and 2) broadening the existing decision boundary to new observations because of concept drift. To tackle these problems, our key insight is to moderately broaden the decision boundary to fail cases while retaining the old boundary. Hence, we propose a novel decision boundary-aware distillation method that consolidates knowledge to the teacher to ease the student's learning of new knowledge. We also establish benchmarks on the existing datasets Cifar-100 and ImageNet. Notably, extensive experiments demonstrate that the teacher model can be a better incremental learner than the student model, which overturns previous knowledge distillation-based methods that treat the student as the main role.
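The abstract describes the method only at a high level. Below is a minimal, hypothetical PyTorch sketch of the stated idea, assuming a teacher-student pair in which (a) a distillation term retains the old decision boundary everywhere except on the teacher's fail cases, so the student may moderately broaden the boundary there, and (b) knowledge is consolidated into the teacher via an exponential moving average of the student's weights, making the teacher the incrementally promoted model. The function name, the fail-case mask, the loss weighting, and the EMA update are illustrative assumptions, not the paper's exact algorithm.

    import copy

    import torch
    import torch.nn.functional as F

    def boundary_aware_distillation_step(student, teacher, x, y, optimizer,
                                         ema_momentum=0.999, temperature=2.0):
        # Hypothetical one-step sketch: the student fits the new observations,
        # a masked distillation term retains the teacher's (old) decision
        # boundary except on its fail cases, and the student's knowledge is
        # consolidated into the teacher by an exponential moving average (EMA).
        student.train()
        teacher.eval()
        with torch.no_grad():
            t_logits = teacher(x)
            # "Fail cases": new observations the old boundary misclassifies.
            fail_mask = (t_logits.argmax(dim=1) != y).float()

        s_logits = student(x)
        ce = F.cross_entropy(s_logits, y)  # learn from the new observations

        # Per-sample KL distillation; zeroed on fail cases so the boundary
        # may moderately broaden there, retained everywhere else.
        kd = F.kl_div(
            F.log_softmax(s_logits / temperature, dim=1),
            F.softmax(t_logits / temperature, dim=1),
            reduction="none",
        ).sum(dim=1)
        kd = ((1.0 - fail_mask) * kd).mean() * temperature ** 2

        loss = ce + kd
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        # Consolidate the student's new knowledge into the teacher, which
        # is the model that is ultimately evaluated and deployed.
        with torch.no_grad():
            for t_p, s_p in zip(teacher.parameters(), student.parameters()):
                t_p.mul_(ema_momentum).add_(s_p, alpha=1.0 - ema_momentum)
        return loss.item()

Under these assumptions, each new-data phase would initialize the teacher as a copy of the current model (e.g., teacher = copy.deepcopy(student)), run this step over the new observations, and then deploy the consolidated teacher rather than the student.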