Simultaneous Gradient Descent-Ascent for GANs Minimax Optimization using Sinkhorn Divergence

Risman Adnan, Muchlisin Adi Saputra, Junaidillah Fadlil, Muhamad Iqbal, Tjan Basaruddin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The Sinkhorn divergence, a smooth and symmetric normalized version of entropy-regularized optimal transport (EOT), is a promising tool for Generative Adversarial Networks (GANs). However, understanding the dynamics of gradient algorithms for Sinkhorn-based GANs remains a major challenge. In this work, we consider the GANs minimax optimization problem using Sinkhorn divergence, in which the smoothness and convexity properties of the objective function are critical factors for convergence and stability. We prove that GANs with a convex-concave Sinkhorn divergence can converge to a local Nash equilibrium using the first-order simultaneous stochastic gradient descent-ascent (SimSGDA) algorithm under certain approximations. We further present a nonasymptotic analysis of the convergence rate of SimSGDA using the structural similarity index measure (SSIM). Our experiments suggest a convergence rate proportional to the inverse number of SGDA iterations, tested on tiny-colored datasets (Cats and CelebA) and advanced neural architectures (DCGAN and ResNet). We demonstrate that SSIM is a potential tool for empirically measuring the convergence rate of the SimSGDA algorithm.
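The two ingredients named in the abstract can be sketched in a few lines of NumPy: the debiased Sinkhorn divergence computed via Sinkhorn's matrix-scaling iterations, and simultaneous gradient descent-ascent, where both players update from the same iterate. This is a minimal illustrative sketch, not the paper's implementation; the function names, the squared-Euclidean cost, and the toy convex-concave objective in the usage example are all assumptions for illustration.

```python
import numpy as np

def sinkhorn_cost(a, b, C, eps=0.1, n_iters=200):
    """Entropy-regularized OT cost OT_eps(a, b) via Sinkhorn iterations.
    a, b: probability vectors; C: pairwise cost matrix (illustrative sketch)."""
    K = np.exp(-C / eps)                  # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):              # alternating diagonal scalings
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]       # approximate transport plan
    return np.sum(P * C)

def sinkhorn_divergence(x, y, eps=0.1):
    """Symmetric, debiased Sinkhorn divergence between point clouds x, y:
    S(x, y) = OT(x, y) - 0.5*OT(x, x) - 0.5*OT(y, y)."""
    a = np.full(len(x), 1.0 / len(x))
    b = np.full(len(y), 1.0 / len(y))
    cost = lambda p, q: np.sum((p[:, None, :] - q[None, :, :]) ** 2, axis=-1)
    return (sinkhorn_cost(a, b, cost(x, y), eps)
            - 0.5 * sinkhorn_cost(a, a, cost(x, x), eps)
            - 0.5 * sinkhorn_cost(b, b, cost(y, y), eps))

def sim_gda(grad_x, grad_y, x0, y0, lr=0.1, steps=500):
    """Simultaneous GDA: evaluate both gradients at the current iterate,
    then descend in x (minimizer) and ascend in y (maximizer) together."""
    x, y = x0, y0
    for _ in range(steps):
        gx, gy = grad_x(x, y), grad_y(x, y)   # same iterate for both players
        x, y = x - lr * gx, y + lr * gy
    return x, y
```

As a usage example on a strongly convex-concave toy objective f(x, y) = x²/2 - y²/2 + xy (a stand-in assumption, not the GAN loss), `sim_gda(lambda x, y: x + y, lambda x, y: x - y, 1.0, 1.0)` contracts toward the Nash equilibrium at the origin; by construction, the debiased divergence of a point cloud against itself is zero.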

Original language: English
Title of host publication: 2020 2nd International Conference on Artificial Intelligence, Robotics and Control, AIRC 2020
Publisher: Association for Computing Machinery
Pages: 6-17
Number of pages: 12
ISBN (Electronic): 9781450389266
DOIs
Publication status: Published - 12 Dec 2020
Event: 2nd International Conference on Artificial Intelligence, Robotics and Control, AIRC 2020 - Cairo, Egypt
Duration: 12 Dec 2020 – 14 Dec 2020

Publication series

Name: ACM International Conference Proceeding Series

Conference

Conference: 2nd International Conference on Artificial Intelligence, Robotics and Control, AIRC 2020
Country/Territory: Egypt
City: Cairo
Period: 12/12/20 – 14/12/20

Keywords

  • Convex Optimization
  • Entropy Regularization
  • Generative Adversarial Networks
  • Generative Models
  • Gradient Descent-Ascent
  • Minimax Game
  • Optimal Transport
  • Sinkhorn Divergence
  • Stochastic Gradient Descent-Ascent
