Ph.D. Dissertation Defense - Andrew McRae


Event Details
  • Date/Time:
    • Friday April 15, 2022
      12:00 pm - 2:00 pm
  • Location: Room 230, CODA
  • Fee(s): N/A
Contact
No contact information submitted.
Summaries

Summary Sentence: Structured Statistical Estimation via Optimization

Full Summary: No summary paragraph submitted.

Title: Structured Statistical Estimation via Optimization

Committee:

Dr. Mark Davenport, ECE, Chair, Advisor

Dr. Justin Romberg, ECE

Dr. Vladimir Koltchinskii, Math

Dr. Vidya Muthukumar, ECE

Dr. Arkadi Nemirovski, ISyE

Abstract: In this thesis, I show how we can exploit low-dimensional structure in high-dimensional statistics and machine learning problems via optimization. I show several settings where, with an appropriate choice of optimization algorithm, we can perform useful estimation with a complexity that scales not with the original problem dimension but with a much smaller intrinsic dimension. First, in the low-rank matrix completion and denoising problems, we exploit low-rank structure to recover a large matrix from noisy observations of some or all of its entries. I prove state-of-the-art error bounds for this problem in the case of Poisson noise and show that these bounds are minimax-optimal. Second, I study the problem of recovering a sparse vector from nonlinear measurements. I present a lifted matrix framework for the sparse phase retrieval and sparse PCA problems that includes a novel atomic norm regularizer, and I prove that solving certain convex optimization problems in this framework yields estimators with near-optimal performance. Although we do not know how to compute these estimators efficiently and exactly, we derive a principled heuristic algorithm for sparse phase retrieval that matches existing state-of-the-art algorithms. Third, I show how we can exploit low-dimensional manifold structure in supervised learning: in a reproducing kernel Hilbert space framework, I show that smooth functions on a manifold can be estimated with a complexity scaling with the manifold dimension rather than the larger embedding space dimension. Finally, I study the interaction between high ambient dimension and lower intrinsic dimension in the harmless interpolation phenomenon, in which learned functions generalize well despite interpolating noisy training data. I present a general framework for this phenomenon in linear and reproducing kernel Hilbert space settings, proving that it occurs in many situations not covered by previous work.
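To give a flavor of the first contribution, here is a minimal sketch of exploiting low-rank structure for matrix denoising. This is an illustration only, not the thesis's method: it uses additive Gaussian noise and simple SVD truncation, whereas the dissertation treats Poisson-noise observations and proves minimax-optimal error bounds. The matrix sizes, noise level, and rank below are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a rank-3 "ground truth" matrix (a toy stand-in for the thesis setting).
n, r = 50, 3
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))

# Observe every entry under additive Gaussian noise. (The thesis studies
# Poisson noise; Gaussian noise keeps this illustration simple.)
Y = M + 0.5 * rng.standard_normal((n, n))

# Denoise by truncating the SVD of Y to its top-r singular values: the
# simplest way to exploit low-rank structure. The estimate's complexity is
# governed by the intrinsic dimension r, not the ambient dimension n.
U, s, Vt = np.linalg.svd(Y, full_matrices=False)
M_hat = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

# Relative Frobenius error before and after truncation.
err_raw = np.linalg.norm(Y - M, "fro") / np.linalg.norm(M, "fro")
err_hat = np.linalg.norm(M_hat - M, "fro") / np.linalg.norm(M, "fro")
print(err_hat < err_raw)
```

Truncation discards the noise energy outside the top-r singular subspace, so the denoised estimate's error is substantially smaller than that of the raw observations.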

Additional Information

In Campus Calendar
No
Groups

ECE Ph.D. Dissertation Defenses

Invited Audience
Public
Categories
Other/Miscellaneous
Keywords
Ph.D. Defense, graduate students
Status
  • Created By: Daniela Staiculescu
  • Workflow Status: Published
  • Created On: Apr 6, 2022 - 4:47pm
  • Last Updated: Apr 6, 2022 - 4:47pm