Calibrated bootstrap for uncertainty quantification in regression models

Keywords: Uncertainty Quantification, Ensemble Learning, Sensitivity Analysis
DOI: 10.48550/arxiv.2105.13303 Publication Date: 2021-01-01
ABSTRACT
Obtaining accurate estimates of machine learning model uncertainties on newly predicted data is essential for understanding the accuracy of the model and whether its predictions can be trusted. A common approach to such uncertainty quantification is to estimate the variance from an ensemble of models, which are often generated by the generally applicable bootstrap method. In this work, we demonstrate that the direct bootstrap ensemble standard deviation is not an accurate estimate of uncertainty and propose a calibration method that dramatically improves its accuracy. We demonstrate the effectiveness of this calibration on both synthetic data and physical datasets from the field of Materials Science and Engineering. The approach is motivated by applications in biological science but is quite general and should apply to a wide range of regression models.
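The abstract's pipeline can be sketched in a few lines: fit an ensemble on bootstrap resamples, take the ensemble standard deviation as a raw uncertainty estimate, then rescale it on a held-out calibration set. This is a minimal illustrative sketch, not the paper's exact procedure; the synthetic data, the linear model, and the one-parameter rescaling are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative synthetic 1-D regression data: y = 2x + Gaussian noise.
x = rng.uniform(-1.0, 1.0, 200)
y = 2.0 * x + rng.normal(0.0, 0.3, x.size)

# Split into a training set and a held-out calibration set.
x_tr, y_tr = x[:150], y[:150]
x_cal, y_cal = x[150:], y[150:]

def bootstrap_ensemble(x, y, n_models=50):
    """Fit an ensemble of linear models, one per bootstrap resample."""
    coeffs = []
    for _ in range(n_models):
        idx = rng.integers(0, x.size, x.size)  # resample with replacement
        coeffs.append(np.polyfit(x[idx], y[idx], deg=1))
    return np.array(coeffs)

def predict(coeffs, x):
    """Ensemble mean prediction and per-point ensemble standard deviation."""
    preds = np.array([np.polyval(c, x) for c in coeffs])
    return preds.mean(axis=0), preds.std(axis=0)

coeffs = bootstrap_ensemble(x_tr, y_tr)

# Direct (uncalibrated) bootstrap standard deviation on the calibration set.
mu_cal, sd_cal = predict(coeffs, x_cal)

# One-parameter recalibration (a simple stand-in for the paper's method):
# scale the ensemble std by a factor a so the standardized residuals
# (y_i - mu_i) / (a * sd_i) have unit variance on the calibration set.
a = np.sqrt(np.mean(((y_cal - mu_cal) / sd_cal) ** 2))
sd_calibrated = a * sd_cal
```

By construction the rescaled uncertainties satisfy the unit-variance condition on the calibration set; on new data they remain an approximation, which is why the paper evaluates calibration quality on separate test sets.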