Kernel embedding of maps for sequential Bayesian inference: The variational mapping particle filter

DOI: 10.48550/arXiv.1805.11380
Publication Date: 2018-01-01
ABSTRACT
In this work, a novel sequential Monte Carlo filter is introduced which aims at efficient sampling of high-dimensional state spaces with a limited number of particles. Particles are pushed forward from the prior to the posterior density using a sequence of mappings that minimizes the Kullback-Leibler divergence between the posterior and the intermediate densities. The sequence of mappings represents a gradient flow. A key ingredient of the mappings is that they are embedded in a reproducing kernel Hilbert space, which allows for a practical algorithm. The embedding provides a direct means to calculate the gradient of the Kullback-Leibler divergence, leading to quick convergence with well-known gradient-based stochastic optimization algorithms. Evaluation of the method is conducted on the chaotic Lorenz-63 system, the Lorenz-96 system, a coarse prototype of atmospheric dynamics, and an epidemic model that describes cholera dynamics. No resampling is required in the mapping particle filter, even for long recursive sequences. The number of effective particles remains close to the total number of particles in all the experiments.
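The update described in the abstract, a gradient flow on the Kullback-Leibler divergence embedded in a reproducing kernel Hilbert space, can be illustrated with a short sketch. The snippet below is not the authors' implementation: it assumes an RBF kernel, a fixed step size, and a toy Gaussian posterior, and uses the Stein-variational form of the kernel-embedded KL gradient as one plausible reading of the abstract; all function names and parameters are illustrative.

```python
# Minimal sketch of one kernel-embedded mapping step (illustrative assumptions,
# not details taken from the paper): particles are moved along an RKHS-embedded
# estimate of the KL gradient toward the posterior.
import numpy as np

def rbf_kernel(X, h):
    """Return the RBF kernel matrix K and its gradient w.r.t. the first argument."""
    diff = X[:, None, :] - X[None, :, :]           # (N, N, d): x_j - x_i
    sq = np.sum(diff ** 2, axis=-1)                # (N, N)
    K = np.exp(-sq / (2.0 * h ** 2))
    grad_K = -diff / h ** 2 * K[:, :, None]        # d k(x_j, x_i) / d x_j
    return K, grad_K

def mapping_step(X, grad_log_post, step=0.05, h=1.0):
    """One gradient-flow update of all particles under the kernel embedding."""
    N = X.shape[0]
    K, grad_K = rbf_kernel(X, h)
    # phi(x_i) = (1/N) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (K @ grad_log_post(X) + grad_K.sum(axis=0)) / N
    return X + step * phi

# Toy example: push prior samples toward a Gaussian posterior N(mu, sigma^2 I).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    mu, sigma = np.array([2.0, -1.0]), 0.5
    grad_log_post = lambda X: -(X - mu) / sigma ** 2
    X = rng.normal(size=(200, 2))                  # particles drawn from the prior
    for _ in range(300):
        X = mapping_step(X, grad_log_post)
    print("particle mean ≈", X.mean(axis=0), " target:", mu)
```

In this toy run the prior particles drift toward the assumed posterior without any resampling step, which mirrors the behaviour the abstract highlights, although the kernel choice, step size, and number of iterations here are purely illustrative.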