Class-Distribution-Aware Calibration for Long-Tailed Visual Recognition

FOS: Computer and information sciences; Computer Vision and Pattern Recognition (cs.CV)
DOI: 10.48550/arxiv.2109.05263
Publication Date: 2021-01-01
ABSTRACT
Despite impressive accuracy, deep neural networks are often miscalibrated and tend to produce overly confident predictions. Recent techniques like temperature scaling (TS) and label smoothing (LS) are effective in obtaining a well-calibrated model by smoothing the logits and the hard labels with scalar factors, respectively. However, a uniform TS or LS factor may not be optimal for calibrating models trained on a long-tailed dataset, where the model produces overly confident probabilities for high-frequency classes. In this study, we propose class-distribution-aware TS (CDA-TS) and LS (CDA-LS) by incorporating class frequency information into calibration in the context of a long-tailed distribution. In CDA-TS, the scalar temperature value is replaced with a CDA temperature vector encoded with class frequency to compensate for over-confidence. Similarly, CDA-LS uses a vector of smoothing factors and flattens the hard labels according to their corresponding class frequency. We also integrate the CDA temperature vector with the distillation loss, which reduces miscalibration in self-distillation (SD). We empirically show that CDA-TS and CDA-LS can accommodate an imbalanced data distribution, yielding superior performance in both calibration error and predictive accuracy. We also observe that SD on an extremely imbalanced dataset is less effective in terms of calibration performance. Code is available at https://github.com/mobarakol/Class-Distribution-Aware-TS-LS.
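The abstract describes replacing the single scalar temperature and smoothing factor with per-class vectors derived from class frequency. The PyTorch sketch below illustrates that idea only; the specific mapping from class counts to the temperature and smoothing vectors (the `cda_temperature` and `cda_label_smoothing` helpers, `base_t`, `base_eps`) is an assumption made for illustration and is not the authors' released implementation (see the linked repository for that).

```python
# Minimal sketch of class-distribution-aware temperature scaling (CDA-TS) and
# label smoothing (CDA-LS). The frequency-to-vector encodings below are
# assumptions chosen only to illustrate the idea: head (frequent) classes,
# which tend to be over-confident, receive a stronger adjustment than tail classes.
import torch
import torch.nn.functional as F


def cda_temperature(class_counts: torch.Tensor, base_t: float = 1.5) -> torch.Tensor:
    """Per-class temperature vector: frequent classes are softened more (assumed encoding)."""
    freq = class_counts.float() / class_counts.sum()
    return base_t * (1.0 + freq / freq.max())


def cda_ts_probs(logits: torch.Tensor, temperature: torch.Tensor) -> torch.Tensor:
    """Apply the class-wise temperature vector to the logits before the softmax."""
    return F.softmax(logits / temperature, dim=-1)


def cda_label_smoothing(targets: torch.Tensor, class_counts: torch.Tensor,
                        base_eps: float = 0.1) -> torch.Tensor:
    """Soft labels whose smoothing strength depends on the target class frequency."""
    num_classes = class_counts.numel()
    freq = class_counts.float() / class_counts.sum()
    eps = base_eps * (freq / freq.max())       # assumed per-class smoothing factors
    eps_t = eps[targets].unsqueeze(1)          # smoothing factor for each sample's class
    one_hot = F.one_hot(targets, num_classes).float()
    return one_hot * (1.0 - eps_t) + eps_t / num_classes


if __name__ == "__main__":
    counts = torch.tensor([5000, 500, 50])     # toy long-tailed class counts
    logits = torch.randn(4, 3)
    targets = torch.tensor([0, 1, 2, 0])

    probs = cda_ts_probs(logits, cda_temperature(counts))
    soft_labels = cda_label_smoothing(targets, counts)
    # Cross-entropy against the class-distribution-aware soft labels.
    loss = -(soft_labels * torch.log(probs + 1e-12)).sum(dim=1).mean()
    print(probs, soft_labels, loss)
```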