WGAN

A common adversarial loss function based on the Wasserstein distance

Released in: Wasserstein GAN

Contributor:

Summary

The authors introduce a new algorithm named WGAN, an alternative to traditional GAN training. They show that WGAN improves the stability of learning, mitigates problems such as mode collapse, and provides meaningful learning curves useful for debugging and hyperparameter searches. Furthermore, the paper shows that the corresponding optimization problem is sound, and provides extensive theoretical work highlighting the deep connections to other distances between distributions.
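The training loop described in the paper can be sketched in PyTorch as follows. This is a minimal illustration, not the reference implementation: the network architectures, dimensions, and batch sizes are placeholder assumptions, and the stand-in "real" data is random. The WGAN-specific pieces — a critic with an unbounded (non-sigmoid) output, multiple critic updates per generator update, RMSprop, and weight clipping to enforce the Lipschitz constraint — follow the algorithm in the paper.

```python
import torch
import torch.nn as nn

# Hypothetical dimensions and hyperparameters, chosen for illustration only.
latent_dim, data_dim, clip_value = 8, 16, 0.01

# The critic ends in a plain linear layer: it outputs an unbounded score,
# not a probability, so there is no sigmoid and no log loss.
generator = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, data_dim))
critic = nn.Sequential(nn.Linear(data_dim, 32), nn.ReLU(), nn.Linear(32, 1))

# The paper uses RMSprop rather than momentum-based optimizers.
opt_g = torch.optim.RMSprop(generator.parameters(), lr=5e-5)
opt_c = torch.optim.RMSprop(critic.parameters(), lr=5e-5)

real = torch.randn(64, data_dim)  # stand-in for a batch of real data

for step in range(3):
    # Critic update: several iterations per generator step, to keep the
    # critic near-optimal so its loss approximates the Wasserstein distance.
    for _ in range(5):
        z = torch.randn(64, latent_dim)
        fake = generator(z).detach()
        # Maximize E[critic(real)] - E[critic(fake)], i.e. minimize the negation.
        loss_c = critic(fake).mean() - critic(real).mean()
        opt_c.zero_grad()
        loss_c.backward()
        opt_c.step()
        # Weight clipping keeps the critic's weights in a compact set,
        # enforcing an (approximate) Lipschitz constraint.
        for p in critic.parameters():
            p.data.clamp_(-clip_value, clip_value)

    # Generator update: minimize -E[critic(generator(z))].
    z = torch.randn(64, latent_dim)
    loss_g = -critic(generator(z)).mean()
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()
```

Because the critic loss tracks an estimate of the Wasserstein distance, plotting `loss_c` over training gives the meaningful learning curve the paper highlights, unlike the uninformative discriminator loss of a standard GAN.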

2017

Year Released

Key Links & Stats

PyTorch-GAN

WGAN

Wasserstein GAN

@InProceedings{pmlr-v70-arjovsky17a,
  title     = {{W}asserstein Generative Adversarial Networks},
  author    = {Martin Arjovsky and Soumith Chintala and L{\'e}on Bottou},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {214--223},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/arjovsky17a/arjovsky17a.pdf},
  url       = {https://proceedings.mlr.press/v70/arjovsky17a.html},
  abstract  = {We introduce a new algorithm named WGAN, an alternative to traditional GAN training. In this new model, we show that we can improve the stability of learning, get rid of problems like mode collapse, and provide meaningful learning curves useful for debugging and hyperparameter searches. Furthermore, we show that the corresponding optimization problem is sound, and provide extensive theoretical work highlighting the deep connections to different distances between distributions.}
}

ML Tasks

  1. General
  2. Image Generation

ML Platform

  1. PyTorch

Modalities

  1. General

Verticals

  1. General

CG Platform

  1. Not Applicable

Related organizations

Courant Institute of Mathematical Sciences

Facebook AI Research