TITLE: Convergent subgradient methods for nonsmooth convex minimization
SPEAKER: Yuri Nesterov
ABSTRACT:
In this talk, we present new subgradient methods for solving nonsmooth
convex optimization problems. These methods are the first for which
the whole sequence of test points is endowed with worst-case performance
guarantees. The methods are derived from a relaxed estimating-sequences
condition, and ensure reconstruction of approximate primal-dual optimal
solutions.
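For intuition, the classical subgradient scheme the talk improves upon can be sketched as follows. This is a generic textbook subgradient method with iterate averaging, shown only as background; it is not the new estimating-sequences method of the talk, and the step-size rule and test function are illustrative assumptions.

```python
import math

def subgradient_method(subgrad, x0, steps=5000):
    # Generic subgradient scheme with iterate averaging (background
    # illustration only; not the method presented in the talk).
    x = x0
    x_avg = 0.0
    for k in range(1, steps + 1):
        g = subgrad(x)                 # any subgradient of f at x
        x -= g / math.sqrt(k)          # classical diminishing step 1/sqrt(k)
        x_avg += (x - x_avg) / k       # running average of the test points
    return x_avg

# Example: minimize the nonsmooth convex function f(x) = |x - 3|
subgrad = lambda x: 1.0 if x > 3.0 else (-1.0 if x < 3.0 else 0.0)
x_star = subgradient_method(subgrad, x0=0.0)
```

In such classical schemes, only the averaged (or best-so-far) point carries a convergence guarantee; the abstract's point is that the new methods give worst-case guarantees for the whole sequence of test points.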
Our methods are applicable as efficient real-time stabilization tools for
potential systems with infinite horizon. As an example, we consider a model
of privacy-respecting taxation, where the center has no information on the
utility functions of the agents. Nevertheless, by a proper taxation policy,
the agents can be forced to apply, on average, the socially optimal
strategies. Preliminary numerical experiments confirm the high efficiency
of the new methods.
This is joint work with V. Shikhman (CORE).