CAM Colloquium: Katya Scheinberg, ORIE, Cornell University, "Overview of adaptive stochastic optimization methods"

Location

655 Rhodes Hall

Description

Abstract: Continuous optimization is a mature field that has recently undergone major expansion and change. One of the key new directions is the development of methods that do not require exact information about the objective function. Nevertheless, the majority of these methods, from stochastic gradient descent to "zeroth-order" methods, use some kind of approximate first-order information. We will introduce a general definition of a stochastic oracle and show how this definition applies in a variety of familiar settings. We will then discuss how these oracles can be used by stochastic variants of adaptive methods, which include stochastic step search, trust region, and cubically regularized Newton methods. Such methods adapt the step size parameter and use it to dictate the accuracy required of the oracles; the accuracy requirements on the oracles are thus themselves adaptive. The step size parameters in these methods can increase and decrease based on the perceived progress, but, unlike in the deterministic case, they are not bounded away from zero. This creates obstacles in the complexity analysis of such methods. We will show how one can develop bounds on expected complexity that also apply in high probability. We will discuss various stochastic settings where the framework readily applies, such as expectation minimization, black-box and simulation optimization, expectation minimization with corrupted samples, etc.
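To make the adaptive idea in the abstract concrete, below is a minimal sketch of a stochastic step-search iteration. It is not the speaker's algorithm, only an illustration of the general pattern: the step size grows after accepted steps, shrinks after rejected ones, and dictates the accuracy requested from the stochastic oracles. The oracle interfaces (`oracle_value`, `oracle_grad`) and all parameter choices are illustrative assumptions.

```python
import numpy as np

def stochastic_step_search(x0, oracle_value, oracle_grad, max_iter=1000,
                           alpha0=1.0, gamma=2.0, theta=0.1, alpha_max=10.0):
    """Sketch of a stochastic step-search method (illustrative only).

    oracle_value(x, eps) -- function-value estimate with requested accuracy eps
    oracle_grad(x)       -- stochastic gradient estimate

    The step size alpha increases after accepted steps and decreases after
    rejected ones; unlike the deterministic case, it is not bounded away
    from zero.
    """
    x, alpha = np.asarray(x0, dtype=float), alpha0
    for _ in range(max_iter):
        g = oracle_grad(x)                      # stochastic first-order oracle
        x_trial = x - alpha * g
        eps = theta * alpha * np.dot(g, g)      # requested accuracy scales with the step
        f_cur = oracle_value(x, eps)
        f_new = oracle_value(x_trial, eps)
        # Sufficient-decrease test based on (possibly noisy) estimates.
        if f_new <= f_cur - theta * alpha * np.dot(g, g):
            x, alpha = x_trial, min(gamma * alpha, alpha_max)  # accept: grow step
        else:
            alpha /= gamma                                      # reject: shrink step
    return x

# Example usage on a noisy quadratic (hypothetical oracles):
rng = np.random.default_rng(0)
f = lambda x: 0.5 * np.dot(x, x)
oracle_value = lambda x, eps: f(x) + eps * rng.standard_normal()
oracle_grad = lambda x: x + 0.1 * rng.standard_normal(x.shape)
x_final = stochastic_step_search(np.ones(5), oracle_value, oracle_grad)
```

The same accept/reject-and-resize logic appears, with different acceptance tests, in the trust region and cubic regularization variants mentioned in the abstract; the key shared feature is that the oracle accuracy is tied to the current step size parameter.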

Biography: Katya Scheinberg is a Professor and Director of Graduate Studies in the School of Operations Research and Information Engineering at Cornell University. Prior to joining Cornell, she was the Harvey E. Wagner Endowed Chair Professor in the Industrial and Systems Engineering Department at Lehigh University. She attended Moscow University for her undergraduate studies and received her PhD from Columbia University. She worked at the IBM T.J. Watson Research Center as a research staff member for over a decade before joining Lehigh in 2010.

Katya’s main research areas center on developing practical algorithms, and their theoretical analysis, for various problems in continuous optimization, such as convex optimization, derivative-free optimization, machine learning, and quadratic programming.

She is an INFORMS Fellow and a recipient of the Lagrange Prize from SIAM and MOS, the Farkas Prize from the INFORMS Optimization Society, and the Outstanding Simulation Publication Award from the INFORMS Simulation Society. Katya is currently the editor-in-chief of Mathematics of Operations Research and a co-editor of Mathematical Programming. She served as the Chair of the SIAM Activity Group on Optimization from 2020 until 2022.