Quantifying Model Uncertainty in Inverse Problems via Bayesian Deep Gradient Descent

DOI: 10.48550/arxiv.2007.09971 Publication Date: 2020-01-01
ABSTRACT
Recent advances in reconstruction methods for inverse problems leverage powerful data-driven models, e.g., deep neural networks. These techniques have demonstrated state-of-the-art performance on several imaging tasks, but they often do not provide uncertainty estimates on the obtained reconstruction. In this work, we develop a scalable, data-driven, knowledge-aided computational framework to quantify model uncertainty via Bayesian neural networks. The approach builds on, and extends, deep gradient descent, a recently developed greedy iterative training scheme, and recasts it within a probabilistic framework. Scalability is achieved by a hybrid architecture: only the last layer of each block is Bayesian, while the others remain deterministic, and by the greedy training scheme. The framework is showcased on one representative medical imaging modality, viz. computed tomography with either sparse-view or limited-angle data, and exhibits competitive performance with respect to state-of-the-art benchmarks, e.g., total variation, deep gradient descent, and learned primal-dual.
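The structure described in the abstract can be sketched on a toy problem: an unrolled ("deep") gradient descent, where each block applies a gradient step on the data-fit term, and the Bayesian part of each block is sampled at inference time so that repeated forward passes yield a distribution over reconstructions. This is a minimal illustrative sketch, not the authors' implementation: the learned network in each block is reduced to a single step size, and its "Bayesian last layer" is mimicked by a hypothetical Gaussian posterior over that step size.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear inverse problem: y = A x_true + noise
n, m = 20, 15
A = rng.normal(size=(n, m))
x_true = rng.normal(size=m)
y = A @ x_true + 0.01 * rng.normal(size=n)

def grad_data_fit(x):
    # Gradient of the data-fit term 0.5 * ||A x - y||^2
    return A.T @ (A @ x - y)

def bayesian_dgd(num_blocks=10, num_samples=50):
    """Unrolled gradient descent. Each block's 'last layer' is replaced by a
    step size drawn from a hypothetical learned Gaussian posterior, so the
    spread of repeated forward passes serves as an uncertainty estimate."""
    step_mean, step_std = 0.01, 0.002  # stand-in for a learned posterior
    samples = []
    for _ in range(num_samples):
        x = np.zeros(m)
        for _ in range(num_blocks):
            step = rng.normal(step_mean, step_std)  # Bayesian component
            x = x - step * grad_data_fit(x)         # deterministic block
        samples.append(x)
    samples = np.stack(samples)
    # Pixelwise mean as the reconstruction, std as the uncertainty map
    return samples.mean(axis=0), samples.std(axis=0)

x_mean, x_std = bayesian_dgd()
```

In the paper's setting each block is a neural network acting on the current iterate and the gradient, and only its final layer carries a weight posterior; the sketch keeps just the iterate-update skeleton and the sample-to-quantify-uncertainty pattern.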