Title

Beyond the scope of interior point polynomial time methods: simple methods for extremely large-scale convex programs

Speaker

Arkadi Nemirovski

Technion -- Israel Institute of Technology, Haifa

Abstract

In all known polynomial time algorithms for nonlinear convex optimization, the computational effort per iteration grows nonlinearly with the design dimension $n$ of the problem (typically as $n^3$). As a result, for extremely large-scale convex programs (tens or hundreds of thousands of variables), a single iteration of a polynomial time algorithm can become too expensive to be practical. In such cases one is forced to use ``computationally cheap'' optimization techniques, whose per-iteration complexity is linear in $n$. The bad news about ``cheap'' algorithms is that all known methods of this type possess a sublinear rate of convergence and thus cannot guarantee high-accuracy solutions. The good news is that, for problems with ``favourable geometry'', properly chosen cheap algorithms exhibit a rate of convergence that is dimension-independent and, in a certain precise sense, optimal in the large-scale case, which makes these methods well suited for finding moderate-accuracy solutions to truly large-scale problems. In the talk, we overview significant recent developments in the theory of cheap optimization algorithms and illustrate their potential in several applications (3D Medical Imaging, Structural Design, semidefinite relaxations of combinatorial problems) that give rise to very large ($n$ = 100,000 -- 3,000,000), highly nonlinear and nonsmooth convex programs.
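
A minimal illustrative sketch (not taken from the talk) of the kind of ``computationally cheap'' first-order scheme the abstract refers to: projected subgradient descent. Its optimization update costs $O(n)$ arithmetic per iteration (plus one subgradient and one projection call), and its classical accuracy guarantee, of order $LR/\sqrt{T}$ for the best iterate after $T$ steps, does not depend on the dimension $n$. The test problem, step-size rule, and all names below are assumptions made for illustration only.

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, L, R, T):
    """Sketch of a 'cheap' first-order method: projected subgradient descent
    for a nonsmooth convex objective over a bounded convex set of radius R.

    The update x <- project(x - step * g) costs O(n) arithmetic; the cost of
    the subgradient oracle depends on the problem. The classical guarantee on
    the best iterate is of order L*R/sqrt(T), independent of the dimension n.
    """
    x = x0.copy()
    for t in range(1, T + 1):
        g = subgrad(x)                   # one subgradient of f at x
        step = R / (L * np.sqrt(t))      # standard diminishing step size
        x = project(x - step * g)        # O(n) update plus projection
    # (in practice one returns the best or averaged iterate seen so far)
    return x

# Hypothetical usage: minimize f(x) = ||A x - b||_1 over the Euclidean
# ball of radius R; each oracle call touches the data A once.
rng = np.random.default_rng(0)
n, m, R = 1000, 200, 10.0
A, b = rng.standard_normal((m, n)), rng.standard_normal(m)

def subgrad(x):
    return A.T @ np.sign(A @ x - b)      # a subgradient of ||A x - b||_1

def project(x):
    nrm = np.linalg.norm(x)
    return x if nrm <= R else x * (R / nrm)

L = np.sqrt(m) * np.linalg.norm(A, 2)    # crude Lipschitz bound on f
x_hat = projected_subgradient(subgrad, project, np.zeros(n), L, R, T=500)
```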