ISyE Department Seminar Series - David Gamarnik


Event Details
  • Date/Time:
    • Wednesday, March 13, 2019
      1:30 pm - 2:30 pm
  • Location: ISyE Main Room 228
  • URL: ISyE Building Complex
  • Fee(s):
    N/A
Contact
No contact information submitted.
Summaries

Summary Sentence: Algorithms and Algorithmic Obstacles in High-Dimensional Regression

Full Summary: ABSTRACT: Many optimization problems arising in the study of high-dimensional inference exhibit an apparent gap between the optimal values that can be established by non-constructive means and the best values achievable by fast (polynomial-time) algorithms. It has recently become apparent that a potential, and in some cases provable, obstruction to designing algorithms that bridge this gap is a phase transition in the geometry of near-optimal solutions, in particular the presence of a certain Overlap Gap Property (OGP). In this talk we discuss this property in the context of the sparse high-dimensional linear regression problem. We show that, on the one hand, in the sampling regime where the known fast methods for this problem are effective, including LASSO, Basis Pursuit, and Orthogonal Matching Pursuit, the space of solutions exhibits monotonicity with respect to proximity to the ground-truth regression vector, and no local optima exist apart from the ground truth. On the other hand, once the number of samples is asymptotically in the regime where the known methods fail, we show that this monotonicity is lost and the model exhibits an OGP: every solution achieving a small mean squared error is either fairly close to the ground truth or very far from it, with no middle ground. While for most other problems the presence of the OGP indicates likely hardness, surprisingly, for the regression problem this is not quite the case. In particular, in the special case of noiseless or very-low-noise regression, powerful algorithms based on the so-called Lattice Basis Reduction technique arising in the field of cryptography can be used to recover the underlying regression vector tractably. Joint work with Ilias Zadik (MIT).
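To make the "easy" sampling regime concrete, the sketch below (not from the talk; all names and parameters are illustrative) runs a minimal Orthogonal Matching Pursuit, one of the fast methods the abstract names, on a synthetic noiseless sparse regression instance where the number of samples is large enough relative to the sparsity that greedy recovery of the ground-truth vector typically succeeds:

```python
import numpy as np

def omp(X, y, k):
    """Orthogonal Matching Pursuit: greedily pick k columns of X,
    refitting by least squares on the selected support each step."""
    n, p = X.shape
    residual = y.copy()
    support = []
    beta = np.zeros(p)
    for _ in range(k):
        # column most correlated with the current residual
        j = int(np.argmax(np.abs(X.T @ residual)))
        if j not in support:
            support.append(j)
        # least-squares refit restricted to the chosen support
        coef, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        beta[:] = 0.0
        beta[support] = coef
        residual = y - X @ beta
    return beta

# synthetic noiseless instance: n samples, p features, k-sparse truth
rng = np.random.default_rng(0)
n, p, k = 100, 200, 5
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[rng.choice(p, size=k, replace=False)] = rng.standard_normal(k)
y = X @ beta_true

beta_hat = omp(X, y, k)
print("max recovery error:", np.max(np.abs(beta_hat - beta_true)))
```

With n well above the information-theoretic threshold for this sparsity level, the greedy method recovers the ground truth essentially exactly; the talk's point is that as n shrinks into the regime where such methods fail, the solution landscape itself changes (the OGP appears), not merely the algorithms' performance.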

Additional Information

In Campus Calendar
Yes
Groups

School of Industrial and Systems Engineering (ISYE)

Invited Audience
Faculty/Staff, Postdoc, Public, Graduate students, Undergraduate students
Categories
Seminar/Lecture/Colloquium
Keywords
No keywords were submitted.
Status
  • Created By: sbryantturner3
  • Workflow Status: Published
  • Created On: Feb 12, 2019 - 11:22am
  • Last Updated: Mar 6, 2019 - 9:10am