TITLE: Lower Complexity Bounds for Large-Scale Smooth Convex Optimization
SPEAKER: Cristobal Guzman
ABSTRACT:
We prove lower bounds on the black-box oracle complexity of large-scale
smooth convex minimization problems. These lower bounds hold for the unit
balls of normed spaces satisfying a technical smoothing condition, and for
an arbitrary smoothness parameter of the objective with respect to this
norm. As a consequence, we obtain a unified framework for the complexity of
convex optimization on \ell^p-balls, for 2<=p<=\infty. In particular, we
prove that the T-step Conditional Gradient algorithm, applied to minimizing
smooth convex functions over the n-dimensional box with T<=n, is nearly
optimal.
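As an illustration of the algorithm discussed above (not part of the talk), here is a minimal sketch of the Conditional Gradient (Frank-Wolfe) method over the n-dimensional box [-1,1]^n, where the linear minimization oracle reduces to a sign pattern; the least-squares objective and step-size rule below are standard illustrative choices, not taken from the abstract.

```python
import numpy as np

def conditional_gradient(grad, n, T):
    """T steps of Frank-Wolfe over the box [-1, 1]^n (a sketch)."""
    x = np.zeros(n)
    for t in range(T):
        g = grad(x)
        s = -np.sign(g)            # argmin over the box of <g, s>
        gamma = 2.0 / (t + 2)      # standard diminishing step size
        x = x + gamma * (s - x)    # convex combination stays in the box
    return x

# Illustrative smooth convex objective: f(x) = 0.5 * ||A x - b||^2.
rng = np.random.default_rng(0)
n = 20
A = rng.standard_normal((30, n))
b = rng.standard_normal(30)
f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2
grad = lambda x: A.T @ (A @ x - b)

xT = conditional_gradient(grad, n, T=200)
```

Each iteration costs one gradient evaluation plus one linear minimization over the feasible set, which is what makes the method attractive in the large-scale regime the abstract considers.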
On the other hand, we prove lower bounds for the complexity of convex
optimization over \ell^p-balls, for 1<=p<2, by combining a random subspace
method with the p=\infty lower bounds. In particular, we establish the
complexity of problem classes that contain the sparse recovery problem.
This is joint work with Arkadi Nemirovski.