Ensemble Neural Networks (ENN): A gradient-free stochastic method
DOI:
10.1016/j.neunet.2018.11.009
Publication Date:
2018-12-03
AUTHORS (4)
ABSTRACT
In this study, an efficient stochastic gradient-free method, the ensemble neural network (ENN), is developed. In the ENN, the optimization process relies on covariance matrices rather than derivatives. The covariance matrices are calculated by the ensemble randomized maximum likelihood algorithm (EnRML), which is an inverse modeling method. The ENN is able to simultaneously provide estimations and perform uncertainty quantification, since it is built under a Bayesian framework. It is also robust to small training data sizes because the ensemble of realizations essentially enlarges the dataset. This constitutes a desirable characteristic, especially for real-world engineering applications. In addition, the ENN does not require the calculation of gradients, which enables the use of complicated neuron models and loss functions in neural networks. We experimentally demonstrate the benefits of the proposed model, in particular showing that the ENN performs much better than traditional Bayesian neural networks (BNN). The EnRML is a substitution for gradient-based optimization algorithms, which means that it can be directly combined with the feed-forward process in other existing (deep) neural networks, such as convolutional neural networks (CNN) and recurrent neural networks (RNN), broadening the future applications of the ENN.
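The abstract describes the covariance-based, gradient-free update only at a high level. The sketch below is a minimal illustration of that general idea: a small feed-forward network is trained with an iterative ensemble-smoother update (an ES-MDA-style relative of EnRML) instead of backpropagation. It is not the authors' implementation; the toy data, network architecture, ensemble size, noise level, and iteration schedule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(w, X):
    """Tiny one-hidden-layer network; w is a flat parameter vector (assumed layout)."""
    n_in, n_hid = X.shape[1], 8
    W1 = w[:n_in * n_hid].reshape(n_in, n_hid)
    b1 = w[n_in * n_hid : n_in * n_hid + n_hid]
    w2 = w[n_in * n_hid + n_hid : n_in * n_hid + 2 * n_hid]
    b2 = w[-1]
    return np.tanh(X @ W1 + b1) @ w2 + b2

# Synthetic regression data (stand-in for a real training set).
X = rng.uniform(-1, 1, size=(40, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.05 * rng.standard_normal(40)

n_params = 2 * 8 + 8 + 8 + 1            # parameters of the tiny network
n_ens, obs_std = 200, 0.05              # ensemble size and assumed data-noise level
W = 0.3 * rng.standard_normal((n_ens, n_params))   # prior ensemble of weight vectors

n_iter = 8                              # ES-MDA-style iterations
alpha = n_iter                          # constant noise inflation, sum(1/alpha) = 1
for _ in range(n_iter):
    D = np.stack([forward(w, X) for w in W])        # ensemble predictions, (n_ens, n_data)
    W_anom = W - W.mean(0)                          # parameter anomalies
    D_anom = D - D.mean(0)                          # prediction anomalies
    C_wd = W_anom.T @ D_anom / (n_ens - 1)          # weight/prediction cross-covariance
    C_dd = D_anom.T @ D_anom / (n_ens - 1)          # prediction auto-covariance
    K = C_wd @ np.linalg.inv(C_dd + alpha * obs_std**2 * np.eye(len(y)))  # Kalman-like gain
    y_pert = y + np.sqrt(alpha) * obs_std * rng.standard_normal((n_ens, len(y)))
    W = W + (y_pert - D) @ K.T                      # gradient-free ensemble update

# Ensemble mean gives the point estimate; ensemble spread gives the uncertainty.
pred = np.stack([forward(w, X) for w in W])
print("train RMSE:", np.sqrt(np.mean((pred.mean(0) - y) ** 2)))
print("mean predictive std:", pred.std(0).mean())
```

In this sketch the ensemble mean serves as the point estimate and the ensemble spread as the uncertainty estimate, mirroring the simultaneous estimation and uncertainty quantification described in the abstract; the forward pass is left untouched, which reflects the claim that the covariance-based update can be combined with the feed-forward process of other network architectures.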