A common adversarial loss function based on least squares

Released in: Least Squares Generative Adversarial Networks



Unsupervised learning with generative adversarial networks (GANs) has proven hugely successful. Regular GANs model the discriminator as a classifier trained with the sigmoid cross-entropy loss function. However, the authors found that this loss function may lead to the vanishing-gradients problem during the learning process. To overcome this problem, the paper proposes Least Squares Generative Adversarial Networks (LSGANs), which adopt the least squares loss function for the discriminator. The authors show that minimizing the objective function of LSGAN amounts to minimizing the Pearson χ² divergence. LSGANs offer two benefits over regular GANs. First, LSGANs are able to generate higher-quality images than regular GANs. Second, LSGANs are more stable during the learning process. The paper evaluates LSGANs on five scene datasets, and the experimental results show that the images generated by LSGANs are of better quality than those generated by regular GANs. The authors also conduct two comparison experiments between LSGANs and regular GANs to illustrate the stability of LSGANs.
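The least squares loss described above can be sketched in PyTorch (the listed ML Platform). This is a minimal illustration, not the authors' reference code; it assumes the paper's 0-1 coding scheme (target 0 for fake and 1 for real samples in the discriminator loss, target 1 for the generator loss), with the discriminator producing raw, un-squashed outputs:

```python
import torch

def d_loss_lsgan(d_real: torch.Tensor, d_fake: torch.Tensor) -> torch.Tensor:
    # Discriminator loss: push outputs on real samples toward 1
    # and outputs on generated samples toward 0 (least squares).
    return 0.5 * ((d_real - 1.0) ** 2).mean() + 0.5 * (d_fake ** 2).mean()

def g_loss_lsgan(d_fake: torch.Tensor) -> torch.Tensor:
    # Generator loss: push discriminator outputs on generated
    # samples toward 1 (the "real" target).
    return 0.5 * ((d_fake - 1.0) ** 2).mean()
```

Unlike the sigmoid cross-entropy loss, these quadratic penalties keep gradients non-vanishing even for fake samples the discriminator already classifies correctly but that lie far from the real data, which is the source of the stability benefit claimed in the paper.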


Year Released

2016

Key Links & Stats



Least Squares Generative Adversarial Networks

@article{DBLP:journals/corr/MaoLXLW16,
  author     = {Xudong Mao and Qing Li and Haoran Xie and Raymond Y. K. Lau and Zhen Wang},
  title      = {Multi-class Generative Adversarial Networks with the {L2} Loss Function},
  journal    = {CoRR},
  volume     = {abs/1611.04076},
  year       = {2016},
  eprinttype = {arXiv},
  eprint     = {1611.04076},
  timestamp  = {Wed, 13 Nov 2019 15:48:57 +0100},
  bibsource  = {dblp computer science bibliography}
}

ML Tasks

  1. General
  2. Image Generation

ML Platform

  1. PyTorch


CG Platform

  1. Not Applicable

Related organizations

City University of Hong Kong

The Education University of Hong Kong