Learning Cycle-Consistent Cooperative Networks via Alternating MCMC Teaching for Unsupervised Cross-Domain Translation

Keywords: generative model; image translation
DOI: 10.1609/aaai.v35i12.17249 Publication Date: 2022-09-08T19:39:43Z
ABSTRACT
This paper studies the unsupervised cross-domain translation problem by proposing a generative framework, in which the probability distribution of each domain is represented by a cooperative network that consists of an energy-based model and a latent variable model. The use of cooperative networks enables maximum likelihood learning by MCMC teaching, where the energy-based model seeks to fit the data distribution of its domain and distills its knowledge to the latent variable model via MCMC. Specifically, in the MCMC teaching process, the latent variable model, parameterized by an encoder-decoder, maps examples from the source domain to the target domain, while the energy-based model further refines the mapped results by Langevin revision such that the revised results match the target domain in terms of statistical properties, which are defined by the learned energy function. For the purpose of building up a correspondence between two unpaired domains, the proposed framework simultaneously learns a pair of cooperative networks with cycle consistency, accounting for a two-way alternating MCMC teaching. Experiments show that the framework is useful for unsupervised image-to-image translation and image sequence translation.
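The Langevin revision described above can be illustrated with a small numerical sketch. The toy quadratic energy, the step size, and the step count below are illustrative assumptions, not the paper's learned energy function: a trained energy-based model would supply `energy` and its gradient, and the "mapped results" would come from the encoder-decoder rather than being initialized by hand.

```python
import numpy as np

# Hypothetical "target domain" statistics for a toy quadratic energy.
MU = np.array([2.0, -1.0])

def energy(x):
    """Toy energy E(x) = 0.5 * ||x - MU||^2 (stands in for a learned energy function)."""
    return 0.5 * np.sum((x - MU) ** 2, axis=-1)

def energy_grad(x):
    """Gradient of the toy energy with respect to x."""
    return x - MU

def langevin_revise(x, n_steps=200, step_size=0.1, rng=None):
    """Refine mapped samples with Langevin dynamics:
    x <- x - (step_size^2 / 2) * grad E(x) + step_size * noise,
    which drives the samples toward low-energy (target-like) regions."""
    rng = np.random.default_rng(rng)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - 0.5 * step_size ** 2 * energy_grad(x) + step_size * noise
    return x

if __name__ == "__main__":
    # Stand-in for mapped results from a hypothetical encoder-decoder,
    # initialized far from the target statistics.
    mapped = np.zeros((4, 2))
    revised = langevin_revise(mapped, rng=0)
    print("energy before:", energy(mapped).mean())
    print("energy after: ", energy(revised).mean())
```

After revision, the average energy of the samples should drop, mirroring how the revised results are pulled toward the statistical properties encoded by the energy function.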