Neural Mean Discrepancy for Efficient Out-of-Distribution Detection
DOI: 10.48550/arxiv.2104.11408
Publication Date: 2021-01-01
AUTHORS (6)
ABSTRACT
Various approaches have been proposed for out-of-distribution (OOD) detection by augmenting models, input examples, training sets, and optimization objectives. Deviating from existing work, we propose a simple hypothesis that standard off-the-shelf models may already contain sufficient information about the training set distribution which can be leveraged for reliable OOD detection. Our empirical study on validating this hypothesis, which measures the model's activation means over OOD and in-distribution (ID) mini-batches, surprisingly finds that the activation means of OOD mini-batches consistently deviate more from those of the training data. In addition, the training data's activation means can be computed offline efficiently or retrieved from batch normalization layers as a 'free lunch'. Based upon this observation, we propose a novel metric called Neural Mean Discrepancy (NMD), which compares neural means of the input examples and the training data. Leveraging the simplicity of NMD, we build an efficient OOD detector that computes neural means with a standard forward pass followed by a lightweight classifier. Extensive experiments show that NMD outperforms state-of-the-art OOD approaches across multiple datasets and model architectures in terms of both detection accuracy and computational cost.