Neural Operator: Graph Kernel Network for Partial Differential Equations

DOI: 10.48550/arxiv.2003.03485 Publication Date: 2020-01-01
ABSTRACT
The classical development of neural networks has been primarily for mappings between a finite-dimensional Euclidean space and a set of classes, or between two finite-dimensional Euclidean spaces. The purpose of this work is to generalize neural networks so that they can learn mappings between infinite-dimensional spaces (operators). The key innovation in our work is that a single set of network parameters, within a carefully designed network architecture, may be used to describe mappings between infinite-dimensional spaces and between different finite-dimensional approximations of those spaces. We formulate approximation of the infinite-dimensional mapping by composing nonlinear activation functions and a class of integral operators. The kernel integration is computed by message passing on graph networks. This approach has substantial practical consequences, which we will illustrate in the context of mappings between input data to partial differential equations (PDEs) and their solutions. In this context, such learned networks can generalize among different approximation methods for the PDE (such as finite difference and finite element methods) and among approximations corresponding to different underlying levels of resolution and discretization. Experiments confirm that the proposed graph kernel network does have the desired properties and shows competitive performance compared to state-of-the-art solvers.
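The abstract describes an architecture that composes nonlinear activation functions with kernel integral operators, where each integral is approximated by message passing over a graph of spatial sample points. The paper's reference implementation is not reproduced here; the following is a minimal PyTorch sketch under stated assumptions, with all class names, widths, and the edge-feature convention (concatenated coordinates of the two endpoints) chosen for illustration only.

```python
import torch
import torch.nn as nn


class KernelIntegralLayer(nn.Module):
    """One kernel-integration step, approximated by message passing on a graph.

    Updates node states v(x) with
        v_{t+1}(x) = relu( W v_t(x) + mean_{y in N(x)} kappa(x, y) v_t(y) ),
    where kappa is a small neural network producing a width x width matrix
    from the edge features of the pair (x, y).
    """

    def __init__(self, width: int, edge_dim: int, kernel_width: int = 64):
        super().__init__()
        self.width = width
        self.local = nn.Linear(width, width)  # pointwise term W v_t(x)
        self.kernel = nn.Sequential(          # kappa(x, y): edge features -> matrix
            nn.Linear(edge_dim, kernel_width), nn.ReLU(),
            nn.Linear(kernel_width, width * width),
        )

    def forward(self, v, edge_index, edge_attr):
        # v:          (num_nodes, width)    current node states
        # edge_index: (2, num_edges)        (source, target) node indices
        # edge_attr:  (num_edges, edge_dim) e.g. coordinates of x and y concatenated
        src, dst = edge_index
        kappa = self.kernel(edge_attr).view(-1, self.width, self.width)
        msg = torch.bmm(kappa, v[src].unsqueeze(-1)).squeeze(-1)  # kappa(x, y) v_t(y)

        # Sum messages per target node, then divide by the neighbor count:
        # a Monte Carlo estimate of the kernel integral over the neighborhood.
        agg = torch.zeros_like(v).index_add_(0, dst, msg)
        deg = torch.zeros(v.size(0), device=v.device)
        deg = deg.index_add_(0, dst, torch.ones_like(dst, dtype=v.dtype)).clamp_(min=1.0)
        return torch.relu(self.local(v) + agg / deg.unsqueeze(-1))


class GraphKernelNetwork(nn.Module):
    """Lift input features, apply several kernel-integration layers, project to output."""

    def __init__(self, in_dim: int, out_dim: int, width: int = 32,
                 edge_dim: int = 4, depth: int = 4):
        super().__init__()
        self.lift = nn.Linear(in_dim, width)
        self.layers = nn.ModuleList(
            [KernelIntegralLayer(width, edge_dim) for _ in range(depth)]
        )
        self.project = nn.Linear(width, out_dim)

    def forward(self, node_features, edge_index, edge_attr):
        v = self.lift(node_features)
        for layer in self.layers:
            v = layer(v, edge_index, edge_attr)
        return self.project(v)


if __name__ == "__main__":
    # Toy shape check on a random graph over 100 sample points in 2D:
    # input = (coordinates, coefficient value), output = scalar solution value.
    xy = torch.rand(100, 2)
    a = torch.rand(100, 1)
    edge_index = torch.randint(0, 100, (2, 500))
    edge_attr = torch.cat([xy[edge_index[0]], xy[edge_index[1]]], dim=-1)
    model = GraphKernelNetwork(in_dim=3, out_dim=1)
    u = model(torch.cat([xy, a], dim=-1), edge_index, edge_attr)
    print(u.shape)  # torch.Size([100, 1])
```

Averaging over a node's neighbors plays the role of a Monte Carlo approximation of the kernel integral; because the kernel network acts on point coordinates rather than on a fixed grid, the same trained parameters can in principle be evaluated on graphs built from different resolutions and discretizations, which is the property the abstract highlights.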