SDS Student Journal Seminar, February 12, 3:00 p.m., ENR2 S395

Accelerating Stochastic Gradient Descent using Predictive Variance Reduction

When

3 – 4 p.m., Feb. 12, 2024
The SDS Student Seminars are opportunities for students to read a paper and present the material to the SDS student community.
 
Abstract:

Stochastic gradient descent is popular for large-scale optimization but has slow asymptotic convergence due to the inherent variance of its gradient estimates. To remedy this problem, we introduce an explicit variance reduction method for stochastic gradient descent which we call stochastic variance reduced gradient (SVRG). For smooth and strongly convex functions, we prove that this method enjoys the same fast convergence rate as stochastic dual coordinate ascent (SDCA) and Stochastic Average Gradient (SAG). However, our analysis is significantly simpler and more intuitive. Moreover, unlike SDCA or SAG, our method does not require the storage of gradients, and thus is more easily applicable to complex problems such as some structured prediction problems and neural network learning.
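
For attendees who have not read the paper, the core idea is easy to state: each epoch fixes a snapshot iterate, computes its full gradient once, and then takes cheap stochastic steps whose per-example gradients are corrected using that snapshot. The sketch below is a minimal illustration of that idea on a ridge-regularized least-squares problem; the objective, step size, epoch length, and synthetic data are illustrative choices and are not taken from the talk or the paper.

import numpy as np

def svrg(X, y, lam=0.1, step=0.01, epochs=20, inner_steps=None, seed=0):
    """Minimize (1/2n)||Xw - y||^2 + (lam/2)||w||^2 with SVRG-style updates."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    m = inner_steps or 2 * n          # inner-loop length, commonly a small multiple of n
    w_snap = np.zeros(d)              # snapshot iterate (w-tilde in the paper)

    def grad_i(w, i):                 # gradient of the i-th component function
        return (X[i] @ w - y[i]) * X[i] + lam * w

    for _ in range(epochs):
        # Full gradient at the snapshot: the only stored gradient in SVRG,
        # in contrast to the per-example gradient tables kept by SAG/SDCA.
        full_grad = (X.T @ (X @ w_snap - y)) / n + lam * w_snap
        w = w_snap.copy()
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient: unbiased, with variance
            # that shrinks as both w and w_snap approach the optimum.
            g = grad_i(w, i) - grad_i(w_snap, i) + full_grad
            w -= step * g
        w_snap = w                    # one option: take the last inner iterate as the new snapshot
    return w_snap

# Toy usage on synthetic data (hypothetical sizes).
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 10))
    w_true = rng.standard_normal(10)
    y = X @ w_true + 0.01 * rng.standard_normal(200)
    print("estimation error:", np.linalg.norm(svrg(X, y) - w_true))

Note how this connects to the abstract's storage claim: only the snapshot and its single full gradient are kept between steps, which is why the method scales to models where storing one gradient per training example would be impractical.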