On the approximation of functions by tanh neural networks
Keywords: Activation function, Function approximation
DOI:
10.1016/j.neunet.2021.08.015
Publication Date:
2021-08-19T17:47:18Z
ABSTRACT
We derive bounds on the error, in high-order Sobolev norms, incurred in the approximation of Sobolev-regular as well as analytic functions by neural networks with the hyperbolic tangent activation function. These bounds provide explicit estimates of the approximation error with respect to the size of the networks. We show that tanh neural networks with only two hidden layers suffice to approximate functions at comparable or better rates than much deeper ReLU neural networks.
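The sketch below is an illustrative numerical check, not the explicit construction analyzed in the paper: it trains a two-hidden-layer tanh network on the analytic target f(x) = sin(pi*x) and reports an empirical L2 error. The width, learning rate, number of training steps, and choice of target are assumptions made for illustration only.

```python
# A minimal sketch (assumed setup, not the authors' construction): a two-hidden-layer
# tanh network fitted to a smooth target, with an empirical L2 error as a rough proxy
# for the approximation error discussed in the abstract.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Two hidden layers with tanh activations, matching the depth highlighted in the paper.
width = 32  # illustrative width
model = nn.Sequential(
    nn.Linear(1, width), nn.Tanh(),
    nn.Linear(width, width), nn.Tanh(),
    nn.Linear(width, 1),
)

# Training data: the analytic target f(x) = sin(pi * x) on [-1, 1].
x = torch.linspace(-1.0, 1.0, 512).unsqueeze(1)
y = torch.sin(torch.pi * x)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(5000):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

# Empirical L2 error on a finer grid (a proxy only; the paper's bounds are in Sobolev norms
# and are proved constructively, not obtained by training).
x_test = torch.linspace(-1.0, 1.0, 4096).unsqueeze(1)
with torch.no_grad():
    l2_err = torch.sqrt(torch.mean((model(x_test) - torch.sin(torch.pi * x_test)) ** 2))
print(f"approximate L2 error: {l2_err.item():.2e}")
```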