Presenter: Fabian Schaipp (Technical University of Munich)
Title: Stochastic Proximal Point and the Polyak step size
In the first part of the talk, we present an implementable stochastic proximal point (SPP) method for a class of weakly convex, composite optimization problems. The proposed algorithm incorporates a variance reduction mechanism, and the resulting SPP updates are solved via an inexact semismooth Newton framework. We establish convergence results that account for the inexactness of the SPP steps and that match existing convergence guarantees for (proximal) stochastic variance-reduced gradient methods. Numerical experiments on sparse logistic regression and robust regression problems show that the proposed algorithm compares favorably with other state-of-the-art methods and is more robust with respect to the step size selection.
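To make this update concrete, below is a minimal Python sketch for a simplified setting: the loss of a single sample is the (unregularized) logistic loss, so the SPP subproblem reduces to a one-dimensional root-finding problem that can be solved inexactly with Newton's method, and the variance-reduction correction follows the standard SVRG pattern. The function names (prox_logistic, spp_vr_step) and this reduced setting are illustrative assumptions, not the talk's actual implementation.

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def prox_logistic(v, a, y, alpha, tol=1e-10, max_iter=50):
    """Prox of x |-> alpha * log(1 + exp(-y * a @ x)) at v, with y in {-1, +1}.
    The optimality condition reduces to a scalar equation in s = a @ x*,
        s + alpha * ||a||^2 * phi'(s) = a @ v,
    which is solved (inexactly) with Newton's method."""
    na2 = a @ a
    av = a @ v
    s = av  # warm start at the unconstrained value a @ v
    for _ in range(max_iter):
        phi_p = -y * sigmoid(-y * s)               # phi'(s)
        r = s + alpha * na2 * phi_p - av           # residual of the scalar equation
        if abs(r) <= tol:
            break
        phi_pp = sigmoid(-y * s) * sigmoid(y * s)  # phi''(s) > 0
        s -= r / (1.0 + alpha * na2 * phi_pp)      # Newton step
    return v - alpha * (-y * sigmoid(-y * s)) * a

def spp_vr_step(x, i, A, b, alpha, x_ref, g_ref):
    """One variance-reduced SPP step (SVRG-style correction):
        x+ = prox_{alpha f_i}( x - alpha * (g_ref - grad f_i(x_ref)) ),
    where g_ref is the full gradient at the reference point x_ref."""
    a, y = A[i], b[i]
    gi_ref = -y * sigmoid(-y * (a @ x_ref)) * a
    return prox_logistic(x - alpha * (g_ref - gi_ref), a, y, alpha)
```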
The second part deals with the recently proposed stochastic Polyak step size (SPS) (Loizou et al., AISTATS 2021), which shows promising performance on several deep learning training tasks. While the method benefits from models that interpolate all data points, interpolation is unlikely to hold when regularization is used, which raises the question of how the method should be adapted. To address this, we first motivate SPS from the viewpoint of model-based stochastic proximal point methods, which further allows us to derive a proximal version for l2-regularized problems.
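For orientation, the sketch below states the SPS_max rule of Loizou et al. (AISTATS 2021) in code, together with one plausible proximal variant for an l2-regularized objective: take an SPS step on the loss part, then apply the closed-form prox of the regularizer. The exact update derived in the talk may differ; the constant 1e-12 guarding against division by zero is an implementation choice, and the function names are hypothetical.

```python
import numpy as np

def sps_step(x, grad_i, loss_i, loss_i_star=0.0, c=0.5, gamma_max=1.0):
    """SPS_max step (Loizou et al., AISTATS 2021):
        gamma = min( (f_i(x) - f_i^*) / (c * ||grad f_i(x)||^2), gamma_max ),
    followed by a plain SGD-style step with that step size."""
    g2 = grad_i @ grad_i
    gamma = min((loss_i - loss_i_star) / (c * g2 + 1e-12), gamma_max)
    return x - gamma * grad_i

def prox_sps_step_l2(x, grad_i, loss_i, lam,
                     loss_i_star=0.0, c=0.5, gamma_max=1.0):
    """Illustrative proximal variant for F(x) = f(x) + (lam/2) * ||x||^2:
    compute the SPS step on the loss part, then apply the closed-form
    prox of the l2 regularizer (a simple rescaling). This is a sketch
    motivated by the model-based viewpoint, not necessarily the exact
    update from the talk."""
    g2 = grad_i @ grad_i
    gamma = min((loss_i - loss_i_star) / (c * g2 + 1e-12), gamma_max)
    z = x - gamma * grad_i
    return z / (1.0 + gamma * lam)  # prox of gamma * (lam/2) * ||.||^2 at z
```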
Please email crampersad@flatironinstitute.org for the Zoom link.