A Modulation Layer to Increase Neural Network Robustness Against Data Quality Issues
Topics: Robustness, Imputation (statistics)
DOI: 10.48550/arxiv.2107.08574
Publication Date: 2021-01-01
AUTHORS (7)
ABSTRACT
Data missingness and quality are common problems in machine learning, especially for high-stakes applications such as healthcare. Developers often train machine learning models on carefully curated datasets using only high-quality data; however, this reduces the utility of the trained models in production environments. We propose a novel neural network modification to mitigate the impacts of low-quality and missing data, which involves replacing the fixed weights of a fully-connected layer with a function of an additional input. This is inspired by neuromodulation in biological neural networks, where the cortex can up- and down-regulate inputs based on their reliability and the presence of other data. In testing, with reliability scores as the modulating signal, models with modulating layers were found to be more robust against degradation of data quality, including missingness. These models are superior to imputation, as they save training time by completely skipping the imputation process and further allow the introduction of quality measures that imputation cannot handle. Our results suggest that explicitly accounting for reduced information quality in a fully-connected layer can enable the deployment of artificial intelligence systems in real-time applications.
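The core mechanism described in the abstract, replacing the fixed weights of a fully-connected layer with a function of an additional reliability input, can be illustrated with a short sketch. The following is a minimal PyTorch rendering under assumptions of our own: the ModulatedLinear class, the sigmoid gating network, and the use of an observation mask as the reliability signal are illustrative choices, not the authors' published implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ModulatedLinear(nn.Module):
    """Fully-connected layer whose effective weights depend on a second,
    per-feature reliability input (a sketch of the idea, not the paper's code)."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.base = nn.Linear(in_features, out_features)
        # Hypothetical modulation function: maps reliability scores to
        # per-feature gates in (0, 1); the paper's exact form may differ.
        self.gate = nn.Sequential(nn.Linear(in_features, in_features), nn.Sigmoid())

    def forward(self, x: torch.Tensor, reliability: torch.Tensor) -> torch.Tensor:
        # Gating each input feature is equivalent to rescaling the
        # corresponding column of the weight matrix for every sample,
        # so unreliable or missing features are down-regulated.
        g = self.gate(reliability)
        return F.linear(x * g, self.base.weight, self.base.bias)


# Usage: zero-fill missing values instead of imputing them, and pass a
# 0/1 observation mask (or any other quality score) as the modulating signal.
x = torch.randn(8, 16)
mask = (torch.rand(8, 16) > 0.3).float()   # 1 = observed, 0 = missing
layer = ModulatedLinear(16, 4)
y = layer(x * mask, mask)
print(y.shape)  # torch.Size([8, 4])
```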