ISYE SEMINAR SERIES - Convergence Behavior of Greedy Algorithms for Some Convex Optimization Problems

Event Details
  • Date/Time:
    • Tuesday November 26, 2002
      10:00 am - 10:59 pm
  • Location: IC 211
  • Phone:
  • URL:
  • Email:
  • Fee(s):
    N/A
  • Extras:
Contact
Barbara Christopher
Industrial and Systems Engineering
404.385.3102
Summaries

Summary Sentence: ISYE SEMINAR SERIES - Convergence Behavior of Greedy Algorithms for Some Convex Optimization Problems

Full Summary: ISYE SEMINAR SERIES - Convergence Behavior of Greedy Algorithms for Some Convex Optimization Problems

Consider the problem of minimizing a convex objective function defined on a possibly infinite-dimensional linear vector space spanned by a possibly infinite set of library vectors.

In many engineering applications, we are interested in finding a vector with a near-optimal objective value that can be sparsely represented as a linear combination of the library vectors. In the literature, greedy algorithms are frequently used to obtain such a sparse representation: at each step, one additional library vector is added so as to approximately minimize the objective function.
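
As a rough illustration only (the finite library, the least-squares objective, and the one-dimensional line search below are my own assumptions, not details from the talk), one generic form of such a greedy step can be sketched as follows:

    # A minimal sketch, assuming a finite library stored as rows of a NumPy
    # array and a smooth convex objective f; the least-squares example and
    # the 1-D line search are illustrative choices, not details from the talk.
    import numpy as np
    from scipy.optimize import minimize_scalar

    def greedy_minimize(f, library, steps=10):
        """Greedily build x as a sparse combination of rows of `library`."""
        x = np.zeros(library.shape[1])
        for _ in range(steps):
            best = None
            for g in library:
                # Best coefficient for this candidate vector (1-D convex search).
                res = minimize_scalar(lambda a, g=g: f(x + a * g))
                if best is None or res.fun < best[0]:
                    best = (res.fun, g, res.x)
            _, g, a = best
            x = x + a * g   # one more library vector enters the representation
        return x

    # Illustrative use: least-squares fit of a target vector.
    rng = np.random.default_rng(0)
    library = rng.standard_normal((20, 10))
    target = rng.standard_normal(10)
    x_hat = greedy_minimize(lambda v: 0.5 * np.sum((v - target) ** 2), library)

After k steps the iterate is, by construction, a combination of at most k library vectors, which is the sparse representation the abstract refers to.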

This problem is typically ill-posed if we regard it as a sparse parameter estimation problem. Consequently, straightforward applications of techniques from traditional numerical convergence analysis lead to unreasonable assumptions that cannot be satisfied in practice. In this talk, I will present a new convergence analysis for these methods that applies in much greater generality, but predicts a slower worst-case convergence rate than the typical (linear or faster) rate in optimization.
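
For orientation (this comparison is my addition, stated for a generic optimality gap, not a result quoted from the talk): a linear rate means the gap decays geometrically, whereas the worst-case guarantees typically proved for greedy schemes of this kind are sublinear, for example

    f(x_k) - f(x*) <= C q^k  with 0 < q < 1    (linear rate)
    f(x_k) - f(x*) <= C / k                    (sublinear, O(1/k)-type worst-case rate)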

In particular, I will focus on the following variants of the problem, each with a slightly different greedy algorithm:

1. Convergence rate when the optimization is constrained to the convex hull of the library vectors (a code sketch of this variant follows the list).

2. Convergence analysis when the optimization is performed on the whole linear vector space.

3. The near optimal sparse approximation property of a greedy algorithm: conditions and approximation rate.
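
For the first variant above, here is a minimal sketch (my own assumption of a standard Frank-Wolfe-style greedy step, not necessarily the speaker's exact algorithm) that keeps the iterate inside the convex hull of the library:

    # A minimal sketch, assuming a Frank-Wolfe-style greedy step over the
    # convex hull of a finite library and a gradient oracle grad_f supplied
    # by the caller; a standard construction, shown only for illustration.
    import numpy as np

    def greedy_over_hull(grad_f, library, steps=100):
        x = library[0].copy()                        # start at a hull vertex
        for k in range(steps):
            # Library vector most aligned with the negative gradient.
            g = library[np.argmin(library @ grad_f(x))]
            gamma = 2.0 / (k + 2)                    # standard open-loop step size
            x = (1 - gamma) * x + gamma * g          # convex combination: stays in the hull
        return x

    # Illustrative use: approximate projection of a target onto the hull.
    rng = np.random.default_rng(1)
    library = rng.standard_normal((20, 10))
    target = rng.standard_normal(10)
    x_hat = greedy_over_hull(lambda v: v - target, library)

With the step size gamma_k = 2/(k+2), the iterate after k steps is a convex combination of at most k+1 library vectors, and for smooth convex objectives this type of scheme attains the sublinear O(1/k) worst-case rate mentioned above.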

Additional Information

In Campus Calendar
No
Groups

School of Industrial and Systems Engineering (ISYE)

Invited Audience
No audiences were selected.
Categories
Seminar/Lecture/Colloquium
Keywords
No keywords were submitted.
Status
  • Created By: Barbara Christopher
  • Workflow Status: Published
  • Created On: Oct 8, 2010 - 7:42am
  • Last Updated: Oct 7, 2016 - 9:53pm