
BIGGAN: LARGE SCALE GAN TRAINING FOR HIGH FIDELITY NATURAL IMAGE SYNTHESIS


Background

  1. GAN training is very sensitive to every aspect of its setup (hyperparameters and model architecture)

     

Motivation

  1. Combine the many existing techniques for stabilizing GAN training to build a very large, stable GAN

     

Model


Baseline: SAGAN [1]

 

Techniques

  1. Hinge loss (minimal sketches of this and several of the items below follow this list)
  2. Class-conditional BatchNorm [2]: the BN scale γ and shift β are produced by passing the condition through a fully connected layer (the FC layer is shared across the BN layers)
  3. Discriminator with projection [3]
  4. Weight decay on the generator's weights
  5. Orthogonal initialization
  6. Truncation trick
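
Below is a minimal PyTorch sketch of the first two items: the GAN hinge loss and class-conditional BatchNorm. It is illustrative only, not the official BigGAN code; module names and shapes are assumptions, and for simplicity each BN here owns its own FC projections rather than sharing them as described above.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def d_hinge_loss(d_real, d_fake):
    # Discriminator hinge loss: penalize real scores below +1 and fake scores above -1.
    return F.relu(1.0 - d_real).mean() + F.relu(1.0 + d_fake).mean()


def g_hinge_loss(d_fake):
    # Generator hinge loss: push the discriminator's score on fake samples upward.
    return -d_fake.mean()


class ClassConditionalBN(nn.Module):
    """BatchNorm whose affine parameters (gamma, beta) are predicted from a
    conditioning vector (e.g. a class embedding, optionally concatenated with
    a chunk of the noise vector) via fully connected layers."""

    def __init__(self, num_features, cond_dim):
        super().__init__()
        # Parameter-free BN; the affine transform comes from the condition.
        self.bn = nn.BatchNorm2d(num_features, affine=False)
        self.to_gamma = nn.Linear(cond_dim, num_features)
        self.to_beta = nn.Linear(cond_dim, num_features)

    def forward(self, x, cond):
        # x: (N, C, H, W) feature map, cond: (N, cond_dim) conditioning vector.
        gamma = self.to_gamma(cond).unsqueeze(-1).unsqueeze(-1)
        beta = self.to_beta(cond).unsqueeze(-1).unsqueeze(-1)
        return (1.0 + gamma) * self.bn(x) + beta
```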

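The remaining items can be sketched just as briefly. The following assumes a hypothetical `generator` module; the feature dimension, class count, and the truncation threshold of 0.5 are example values, not the paper's settings.

```python
import torch
import torch.nn as nn


class ProjectionDiscriminatorHead(nn.Module):
    """Projection discriminator output [3]: an unconditional score plus an
    inner product between a class embedding and the pooled features."""

    def __init__(self, feat_dim, num_classes):
        super().__init__()
        self.linear = nn.Linear(feat_dim, 1)
        self.embed = nn.Embedding(num_classes, feat_dim)

    def forward(self, features, labels):
        # features: (N, feat_dim) pooled discriminator features, labels: (N,) class ids.
        out = self.linear(features).squeeze(1)                    # unconditional term
        return out + (self.embed(labels) * features).sum(dim=1)   # projection term


def orthogonal_init(module):
    # Orthogonal initialization for conv / linear weights.
    if isinstance(module, (nn.Conv2d, nn.Linear)):
        nn.init.orthogonal_(module.weight)
        if module.bias is not None:
            nn.init.zeros_(module.bias)


def truncated_noise(batch_size, z_dim, threshold=0.5):
    # Truncation trick: resample any z component whose magnitude exceeds the
    # threshold; lower thresholds trade diversity for sample fidelity.
    z = torch.randn(batch_size, z_dim)
    mask = z.abs() > threshold
    while mask.any():
        z[mask] = torch.randn(int(mask.sum()))
        mask = z.abs() > threshold
    return z


# Usage with a hypothetical generator:
# generator.apply(orthogonal_init)
# z = truncated_noise(batch_size=8, z_dim=128)
# samples = generator(z, class_labels)
```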
Conclusion

A large collection of techniques is used to stabilize the training of a conditional GAN (cGAN) with a very large number of parameters.

 

References

[1] Zhang, Han, Ian Goodfellow, Dimitris Metaxas, and Augustus Odena. "Self-Attention Generative Adversarial Networks." ArXiv:1805.08318 [Cs, Stat], June 14, 2019. http://arxiv.org/abs/1805.08318.

[2] Vries, Harm de, Florian Strub, Jérémie Mary, Hugo Larochelle, Olivier Pietquin, and Aaron Courville. "Modulating Early Visual Processing by Language." ArXiv:1707.00683 [Cs], December 18, 2017. http://arxiv.org/abs/1707.00683.

[3] Miyato, Takeru, and Masanori Koyama. "CGANs with Projection Discriminator." ArXiv:1802.05637 [Cs, Stat], August 14, 2018. http://arxiv.org/abs/1802.05637.

Source: https://www.cnblogs.com/JunzhaoLiang/p/13886458.html
