Conference paper, 2020

Explore Aggressively, Update Conservatively: Stochastic Extragradient Methods with Variable Stepsize Scaling

Abstract

Owing to their stability and convergence speed, extragradient methods have become a staple for solving large-scale saddle-point problems in machine learning. The basic premise of these algorithms is the use of an extrapolation step before performing an update; thanks to this exploration step, extragradient methods overcome many of the non-convergence issues that plague gradient descent/ascent schemes. On the other hand, as we show in this paper, running vanilla extragradient with stochastic gradients may jeopardize its convergence, even in simple bilinear models. To overcome this failure, we investigate a double stepsize extragradient algorithm where the exploration step evolves at a more aggressive timescale compared to the update step. We show that this modification allows the method to converge even with stochastic gradients, and we derive sharp convergence rates under an error bound condition.
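To make the double stepsize idea concrete, here is a minimal NumPy sketch on a noisy bilinear saddle-point problem. This is not the paper's implementation: the schedules γ_t = 1/√t (exploration) and η_t = 1/t (update), the additive Gaussian noise model, and the function names are illustrative assumptions; the only point is that the exploration stepsize dominates the update stepsize.

```python
import numpy as np

def dseg_bilinear(A, steps=20000, noise=0.1, seed=0):
    """Double stepsize stochastic extragradient (illustrative sketch)
    on the bilinear saddle point min_x max_y x^T A y, whose unique
    solution is (x, y) = (0, 0)."""
    rng = np.random.default_rng(seed)
    n, m = A.shape
    x, y = np.ones(n), np.ones(m)

    def V(u, v):
        # Stochastic gradient field of the bilinear game; additive
        # Gaussian noise stands in for minibatch gradient noise.
        return (A @ v + noise * rng.standard_normal(n),
                -(A.T @ u) + noise * rng.standard_normal(m))

    for t in range(1, steps + 1):
        gamma = 1.0 / np.sqrt(t)  # aggressive exploration stepsize
        eta = 1.0 / t             # conservative update stepsize, eta << gamma
        # Exploration (extrapolation) step with the large stepsize.
        gx, gy = V(x, y)
        xe, ye = x - gamma * gx, y - gamma * gy
        # Update step: a fresh sample taken at the exploration point,
        # applied to the base point with the small stepsize.
        gx, gy = V(xe, ye)
        x, y = x - eta * gx, y - eta * gy
    return x, y

if __name__ == "__main__":
    A = np.array([[1.0, 0.5], [0.0, 1.0]])
    x, y = dseg_bilinear(A)
    print("distance to the saddle point:",
          np.sqrt(np.linalg.norm(x) ** 2 + np.linalg.norm(y) ** 2))
```

Setting eta = gamma in this sketch recovers vanilla stochastic extragradient, the single-stepsize scheme whose non-convergence on bilinear models motivates the paper.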
Main file: dseg_supp.pdf (8.28 MB). Origin: files produced by the author(s).

Dates and versions

hal-03002844, version 1 (13-11-2020)

Identifiers

  • HAL Id: hal-03002844, version 1

Cite

Yu-Guan Hsieh, Franck Iutzeler, Jérôme Malick, Panayotis Mertikopoulos. Explore Aggressively, Update Conservatively: Stochastic Extragradient Methods with Variable Stepsize Scaling. NeurIPS '20 - 34th International Conference on Neural Information Processing Systems, Dec 2020, Vancouver / Virtual, Canada. pp. 16223-16234. ⟨hal-03002844⟩