Approximations to the Fisher Information Metric of Deep Generative Models for Out-Of-Distribution Detection
Generative model
Fisher information
DOI: 10.48550/arxiv.2403.01485
Publication Date: 2024-03-03
AUTHORS (4)
ABSTRACT
Likelihood-based deep generative models such as score-based diffusion models and variational autoencoders are state-of-the-art machine learning models for approximating high-dimensional distributions of data such as images, text, or audio. One of the many downstream tasks they can be naturally applied to is out-of-distribution (OOD) detection. However, seminal work by Nalisnick et al., which we reproduce, showed that these models consistently infer higher log-likelihoods for OOD data than for the data they were trained on, marking an open problem. In this work, we analyse using the gradient of a data point with respect to the parameters of the model for OOD detection, based on the simple intuition that OOD data should have larger gradient norms than training data. We formalise measuring the size of this gradient via the Fisher information metric. We show that the Fisher information matrix (FIM) has large absolute diagonal values, motivating the use of chi-square distributed, layer-wise gradient norms as features. We combine these features to make a simple, model-agnostic and hyperparameter-free method for OOD detection which estimates their joint density for a given data point. We find the layer-wise gradient norms to be weakly correlated, rendering their combined usage informative, and prove that they satisfy the principle of (data representation) invariance. Our empirical results indicate that this method outperforms the Typicality test for most deep generative models and image dataset pairings.
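The abstract's pipeline (per-layer gradient norms of the log-likelihood as features, then a joint density fit over those features) can be illustrated with a toy sketch. This is not the paper's implementation: it replaces the deep generative model with a factorised unit-variance Gaussian whose per-"layer" mean vectors stand in for parameter groups, and fits a full-covariance Gaussian to the log squared norms as a stand-in for the joint density estimate. All names (`layerwise_grad_sq_norms`, `ood_score`, the layer sizes) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "generative model": factorised unit-variance Gaussian whose mean vector
# is split into per-"layer" parameter groups (stand-in for network layers).
layer_sizes = [4, 8, 16]
mus = [rng.normal(0.0, 1.0, s) for s in layer_sizes]

def layerwise_grad_sq_norms(x_layers):
    """Squared L2 norm of d log p(x)/d mu_l per layer (closed form here:
    the gradient w.r.t. a unit-variance Gaussian mean is x - mu)."""
    return np.array([np.sum((x - mu) ** 2) for x, mu in zip(x_layers, mus)])

def sample(n, shift=0.0):
    """Draw n points; a nonzero shift produces out-of-distribution data."""
    return [[mu + shift + rng.normal(0.0, 1.0, len(mu)) for mu in mus]
            for _ in range(n)]

# Fit a full-covariance Gaussian to log squared norms of in-distribution data,
# so weak correlations between layers are exploited rather than ignored.
train_feats = np.log([layerwise_grad_sq_norms(x) for x in sample(2000)])
mean = train_feats.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(train_feats, rowvar=False))

def ood_score(x_layers):
    """Higher = more OOD: Mahalanobis distance of the feature vector,
    i.e. negative log-density up to an additive constant."""
    z = np.log(layerwise_grad_sq_norms(x_layers)) - mean
    return 0.5 * z @ cov_inv @ z

in_scores = [ood_score(x) for x in sample(500)]
ood_scores = [ood_score(x) for x in sample(500, shift=3.0)]
print(np.mean(ood_scores) > np.mean(in_scores))  # prints True
```

Shifted data has larger per-layer gradient norms, so its log-norm features fall far from the fitted density and receive higher scores; the same scoring logic is hyperparameter-free in the sense that only held-out in-distribution data is needed to fit the feature density.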