Incremental Scene Classification Using Dual Knowledge Distillation and Classifier Discrepancy on Natural and Remote Sensing Images

DOI: 10.3390/electronics13030583 Publication Date: 2024-01-31T14:56:34Z
ABSTRACT
Conventional deep neural networks face challenges in handling the increasing amount of information in real-world scenarios where it is impractical to gather all training data at once. Incremental learning, also known as continual learning, provides a solution for lightweight and sustainable learning with neural networks. However, incremental learning encounters issues such as "catastrophic forgetting" and the "stability–plasticity dilemma". To address these challenges, this study proposes a two-stage method. In the first stage, dual knowledge distillation is introduced, including feature-map-based and response-based distillation. This approach prevents the model from excessively favoring new tasks during training, thus addressing catastrophic forgetting. In the second stage, an out-of-distribution dataset is incorporated to calculate the discrepancy loss between multiple classifiers. By maximizing the discrepancy loss and minimizing the cross-entropy loss, the model improves classification accuracy on new tasks. The proposed method is evaluated on the CIFAR100 and RESISC45 benchmark datasets and compared with existing approaches. Experimental results demonstrate an overall accuracy improvement of 6.9% and a 5.1% reduction in the forgetting rate after adding nine consecutive tasks. These findings indicate that the proposed method effectively mitigates catastrophic forgetting and offers a viable approach to incremental image classification on natural and remote sensing images.
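The loss terms named in the abstract follow standard forms from the knowledge distillation and classifier-discrepancy literature: response-based distillation is typically a KL divergence between temperature-softened teacher and student outputs, feature-map-based distillation a mean squared error between intermediate features, and classifier discrepancy an L1 distance between two classifier heads' predictions. The sketch below illustrates these standard forms only; all function names are hypothetical and the exact formulations and weightings used in the paper may differ.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def response_distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Response-based KD: KL(teacher || student) on softened class probabilities.

    The teacher is the model frozen before the new task; matching its soft
    outputs discourages drift on old classes (mitigating forgetting).
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def feature_distillation_loss(student_feat, teacher_feat):
    """Feature-map-based KD: mean squared error between intermediate features."""
    n = len(student_feat)
    return sum((s - t) ** 2 for s, t in zip(student_feat, teacher_feat)) / n

def classifier_discrepancy(probs_a, probs_b):
    """L1 discrepancy between two classifier heads' output distributions.

    In the second stage this would be maximized on out-of-distribution
    samples while cross-entropy is minimized on in-distribution data.
    """
    return sum(abs(a - b) for a, b in zip(probs_a, probs_b))
```

For example, two heads that agree perfectly give a discrepancy of 0, while fully opposed one-hot predictions over two classes give the maximum value of 2; identical teacher and student logits yield zero distillation loss.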