Final doctoral examination and defense of dissertation of Cristobal Guzman
Title: Information, Complexity and Structure in Convex Optimization
Time: Monday, March 30, 9:00am
Location: Academic Conference Room - Groseclose Building - Room 204
Advisors: Prof. Arkadi Nemirovski and Prof. Sebastian Pokutta
Committee:
Prof. Arkadi Nemirovski, School of Industrial and Systems Engineering
Prof. Sebastian Pokutta, School of Industrial and Systems Engineering
Prof. Shabbir Ahmed, School of Industrial and Systems Engineering
Prof. Santosh Vempala, School of Computer Science
Prof. Alexandre d'Aspremont, Département d'Informatique, École Normale Supérieure
Reader: Prof. Alexandre d'Aspremont, Département d'Informatique, École Normale Supérieure
SUMMARY: This thesis focuses on the limits of performance of large-scale
convex optimization algorithms. The classical theory of oracle complexity,
first proposed by Nemirovski and Yudin in 1983, successfully established
the worst-case behavior of methods based on local oracles (a generalization
of first-order oracles for smooth functions) for nonsmooth convex
minimization, both in the large-scale and low-scale regimes, as well as the
complexity of approximately solving linear systems of equations (equivalent
to convex quadratic minimization) over Euclidean balls, under a
matrix-vector multiplication oracle. Our work extends the applicability of
lower bounds in two directions:

Worst-Case Complexity of Large-Scale Smooth Convex Optimization: We
generalize lower bounds on the complexity of first-order methods for convex
optimization to classes of convex functions with Hölder continuous
gradients. Our technique relies on the existence of a smoothing kernel,
which defines a smooth approximation of any convex function via infimal
convolution. As a consequence, we derive lower bounds for ℓ_p/ℓ_q-setups,
where 1 ≤ p, q ≤ ∞, and extend them to the matrix analogue: smooth (with
respect to the Schatten q-norm) convex minimization over matrices with
bounded Schatten p-norm. The major consequences of this result are the
near-optimality of the Conditional Gradient method over box-type domains
(p = q = ∞) and the near-optimality of Nesterov's accelerated method over
the cross-polytope (p = q = 1).
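
As an informal aside for readers unfamiliar with smoothing by infimal
convolution: the sketch below uses the Euclidean (Moreau-type) kernel, which
is only an illustrative assumption here; the thesis itself constructs
kernels adapted to general ℓ_p geometries.

% Illustrative sketch only (assumed Euclidean/Moreau kernel); the thesis
% builds smoothing kernels suited to general \ell_p geometries.
\[
  f_{\mu}(x) \;=\; \inf_{y}\Bigl\{\, f(y) + \tfrac{1}{2\mu}\,\lVert x - y \rVert_2^2 \,\Bigr\}
\]
% If f is convex and L-Lipschitz, then f_\mu is convex, has a (1/\mu)-Lipschitz
% gradient, and satisfies f(x) - \mu L^2/2 \le f_\mu(x) \le f(x); choosing
% \mu = \epsilon / L^2 turns an (\epsilon/2)-accurate minimizer of f_\mu into
% an \epsilon-accurate minimizer of f.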
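
Likewise, for context on why the Conditional Gradient (Frank-Wolfe) method
is natural over box-type domains, here is a minimal Python sketch; the
quadratic objective, step-size rule, and all names in it are illustrative
assumptions rather than anything taken from the thesis.

import numpy as np

def conditional_gradient(grad, x0, radius=1.0, iters=200):
    """Frank-Wolfe with the classical 2/(k+2) step size over the box
    [-radius, radius]^n (the p = q = infinity setting)."""
    x = x0.copy()
    for k in range(iters):
        g = grad(x)
        # Linear minimization oracle over the box: a minimizer of <g, s>
        # is s_i = -radius * sign(g_i), computed coordinate-wise.
        s = -radius * np.sign(g)
        gamma = 2.0 / (k + 2)           # open-loop step size
        x = (1 - gamma) * x + gamma * s
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 50
    A = rng.standard_normal((n, n))
    Q = A.T @ A / n                      # positive semidefinite quadratic
    b = rng.standard_normal(n)
    grad = lambda x: Q @ x + b           # gradient of 0.5 x'Qx + b'x
    x_star = conditional_gradient(grad, x0=np.zeros(n))
    print("objective:", 0.5 * x_star @ Q @ x_star + b @ x_star)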
The thesis is available for public inspection in the School of
Mathematics lounge (Skiles 236), the ARC lounge (Klaus 2222), the ISyE
PhD student lounge, and at the URL
http://www.aco.gatech.edu/dissert/guzman.html