Robust and Provably Monotonic Networks
Keywords: Robustness, Normalization
DOI: 10.48550/arxiv.2112.00038
Publication Date: 2021-01-01
AUTHORS (3)
ABSTRACT
The Lipschitz constant of the map between input and output space represented by a neural network is a natural metric for assessing the robustness of the model. We present a new method to constrain the Lipschitz constant of dense deep learning models that can also be generalized to other architectures. The method relies on a simple weight normalization scheme during training that ensures the Lipschitz constant of every layer is below an upper limit specified by the analyst. A monotonic residual connection can then be used to make the model monotonic in any subset of its inputs, which is useful in scenarios where domain knowledge dictates such a dependence. Examples can be found in algorithmic fairness requirements or, as presented here, in the classification of the decays of subatomic particles produced at the CERN Large Hadron Collider. Our normalization is minimally constraining and allows the underlying architecture to maintain higher expressiveness compared to other techniques that aim to either control the Lipschitz constant or ensure monotonicity. We show how the algorithm was used to train a powerful, robust, and interpretable discriminator for heavy-flavor-quark decays, which has been adopted for use as the primary data-selection algorithm in the LHCb real-time data-processing system in the current LHC data-taking period known as Run 3. In addition, our algorithm has achieved state-of-the-art performance on benchmarks in medicine, finance, and other applications.
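The abstract describes two ingredients: a per-layer weight normalization that caps each layer's Lipschitz constant, and a residual term over the chosen inputs that turns the Lipschitz bound into a monotonicity guarantee. Below is a minimal PyTorch sketch of that idea, assuming an ℓ∞ operator-norm bound on each weight matrix and a rescaling step applied after each optimizer update; the class names, the fixed three-layer architecture, and the ReLU activation are illustrative choices, not the paper's exact implementation.

```python
import torch
import torch.nn as nn


class LipschitzLinear(nn.Linear):
    """Dense layer whose weights are rescaled so that the l-infinity operator
    norm (max absolute row sum) stays below a user-chosen bound `lip`."""

    def __init__(self, in_features, out_features, lip=1.0):
        super().__init__(in_features, out_features)
        self.lip = lip

    def normalize(self):
        with torch.no_grad():
            norm = self.weight.abs().sum(dim=1).max()  # l-infinity operator norm
            if norm > self.lip:
                self.weight *= self.lip / norm  # rescale only if the bound is exceeded


class MonotonicLipschitzNet(nn.Module):
    """g(x) + lam * sum of the selected inputs. Because |dg/dx_i| <= lam for
    every input, the residual term makes the output monotonically increasing
    in the inputs listed in `monotone_idx`."""

    def __init__(self, n_in, n_hidden, monotone_idx, lam=1.0):
        super().__init__()
        self.monotone_idx = monotone_idx
        self.lam = lam
        per_layer = lam ** (1.0 / 3.0)  # product of 3 per-layer bounds equals lam
        self.layers = nn.ModuleList([
            LipschitzLinear(n_in, n_hidden, per_layer),
            LipschitzLinear(n_hidden, n_hidden, per_layer),
            LipschitzLinear(n_hidden, 1, per_layer),
        ])
        self.act = nn.ReLU()  # 1-Lipschitz activation (illustrative choice)

    def forward(self, x):
        h = x
        for layer in self.layers[:-1]:
            h = self.act(layer(h))
        g = self.layers[-1](h)
        residual = self.lam * x[:, self.monotone_idx].sum(dim=1, keepdim=True)
        return g + residual

    def enforce_lipschitz(self):
        for layer in self.layers:
            layer.normalize()
```

In a training loop one would call `model.enforce_lipschitz()` after each `optimizer.step()` so that every layer, and hence the network `g`, respects the analyst-specified bound; the hypothetical `monotone_idx` argument selects the subset of inputs in which the output is guaranteed to be monotonic.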