Registered Data

[02361] A unified framework for convergence analysis of stochastic gradient algorithms with momentum: a linear two-step approach

  • Session Time & Room : 2D (Aug.22, 15:30-17:10) @E506
  • Type : Contributed Talk
  • Abstract : From the viewpoint of weak approximation, stochastic gradient algorithms and stochastic differential equations are closely related. In this talk, we develop a systematic framework for the convergence of stochastic gradient descent with momentum by analyzing the stationary distribution of a linear two-step method applied to stochastic differential equations. We then prove the convergence of two stochastic linear two-step methods, which correspond to the stochastic heavy ball method and Nesterov's accelerated gradient method.
  • Classification : 65C30, 60H35
  • Format : Online Talk on Zoom
  • Author(s) :
    • Qian Guo (Shanghai Normal University)
    • Fangfang Ma (Shanghai Normal University)
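
The two momentum schemes named in the abstract can be sketched as discrete iterations. The snippet below is a minimal illustration only, not the authors' framework: the quadratic objective, step size, momentum coefficient, and noise level are all assumptions chosen for the example.

```python
import numpy as np

# Illustrative sketch (not the authors' code): stochastic heavy ball (SHB)
# and Nesterov's accelerated gradient (NAG) on a noisy quadratic
# f(x) = 0.5 * x^2, whose stochastic gradient is x plus Gaussian noise.
rng = np.random.default_rng(0)

def noisy_grad(x, sigma=0.1):
    # Unbiased stochastic gradient of f(x) = 0.5 * x^2.
    return x + sigma * rng.standard_normal()

def shb(steps=2000, lr=0.05, beta=0.9):
    # Heavy ball: velocity accumulates past gradients at the current point.
    x, v = 5.0, 0.0
    for _ in range(steps):
        v = beta * v - lr * noisy_grad(x)
        x = x + v
    return x

def nag(steps=2000, lr=0.05, beta=0.9):
    # Nesterov: the gradient is evaluated at a "lookahead" point x + beta*v.
    x, v = 5.0, 0.0
    for _ in range(steps):
        v = beta * v - lr * noisy_grad(x + beta * v)
        x = x + v
    return x

print(shb(), nag())
```

Both iterates settle into a stationary distribution around the minimizer x = 0 rather than converging exactly, which is the behavior the talk's SDE-based analysis characterizes.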