- Title
SCGN: novel generative model using the convergence of latent space by training.
- Authors
Kim, H.; Jung, S.H.
- Abstract
Generative models such as variational autoencoders (VAEs) and generative adversarial networks (GANs) have recently been applied to various fields. However, VAEs suffer from blurry outputs and GANs from mode collapse. Here, the authors propose a novel generative model, the self-converging generative network (SCGN), to address these issues. Self-converging refers to the convergence of latent vectors onto themselves as they are trained in pairs with the training data, which allows the SCGN to reconstruct all training data. In the authors' model, the latent vectors and the weights of the generator are trained alternately. Specifically, the latent vectors are trained to follow a normal distribution, using a loss function derived from the Kullback–Leibler divergence together with a pixel-wise loss, while the generator weights are adjusted so that the generator reproduces the training data, using a pixel-wise loss. As a result, the SCGN did not fall into the mode collapse that occurs in GANs, and it produced sharper images than VAEs because no sampling is used. Moreover, the SCGN successfully learned the manifold of the dataset in extensive experiments on CelebA.
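The alternating scheme in the abstract (per-sample latent vectors optimized against a KL-derived term plus a pixel-wise loss, then generator weights optimized against the pixel-wise loss alone) can be sketched in a few lines of PyTorch. Everything below is an assumption made for illustration, not the authors' implementation: the toy `Generator` architecture, the hyperparameters, and the simple L2 pull of the latents toward the origin standing in for the paper's KL-derived loss.

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Hypothetical toy generator; the paper's architecture is not given here."""
    def __init__(self, latent_dim=64, img_dim=28 * 28):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, img_dim), nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(z)

def train_scgn(data, latent_dim=64, epochs=10, lr=1e-3, kl_weight=1e-3):
    """Alternate between updating per-sample latents and generator weights."""
    n = data.size(0)
    # One learnable latent vector paired with each training example.
    z = nn.Parameter(torch.randn(n, latent_dim))
    gen = Generator(latent_dim, data.size(1))
    opt_z = torch.optim.Adam([z], lr=lr)
    opt_g = torch.optim.Adam(gen.parameters(), lr=lr)
    pixel_loss = nn.MSELoss()

    for _ in range(epochs):
        # (1) Latent step: pixel-wise loss plus a pull toward N(0, I).
        #     The L2 term here is an assumed stand-in for the KL-derived loss.
        opt_z.zero_grad()
        kl_term = 0.5 * (z ** 2).sum(dim=1).mean()
        (pixel_loss(gen(z), data) + kl_weight * kl_term).backward()
        opt_z.step()

        # (2) Generator step: pixel-wise reconstruction loss only.
        opt_g.zero_grad()
        pixel_loss(gen(z), data).backward()
        opt_g.step()

    return gen, z.detach()

if __name__ == "__main__":
    fake_data = torch.rand(128, 28 * 28)  # random stand-in data in [0, 1]
    gen, latents = train_scgn(fake_data, epochs=5)
```

Because the latents are trained jointly with the data rather than sampled, every training example has a dedicated latent vector that the generator learns to reconstruct, which is what the abstract credits for avoiding both mode collapse and VAE-style blur.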
- Subjects
GAUSSIAN distribution; COST functions; SPACE; LATENT variables; VECTOR autoregression model
- Publication
Electronics Letters (Wiley-Blackwell), 2020, Vol 56, Issue 17, p879
- ISSN
0013-5194
- Publication type
Article
- DOI
10.1049/el.2020.1333