PhD Thesis Defense: Alejandro Carderera – Faster Conditional Gradient algorithms for Machine Learning


Event Details
  • Date/Time:
    • Monday November 29, 2021
      11:30 am - 12:30 pm
  • Location: Atlanta and Virtual
  • Fee(s): N/A
Contact

Stephanie Niebuhr
Academic Advisor

Summaries

Summary Sentence: PhD Thesis Defense: Alejandro Carderera – Faster Conditional Gradient algorithms for Machine Learning

Full Summary: No summary paragraph submitted.

Media
  • Alejandro Carderera – PhD Machine Learning Student (image/jpeg)

Join virtually here

Advisor

Prof. Sebastian Pokutta, Institute of Mathematics, Technische Universität Berlin and Department for AI in Society, Science, and Technology, Zuse Institute Berlin (Advisor)


Committee

1. Prof. Alexandre d'Aspremont, Département d'Informatique, CNRS and École Normale Supérieure

2. Prof. Jelena Diakonikolas, Department of Computer Science, University of Wisconsin–Madison

3. Prof. Swati Gupta, School of Industrial and Systems Engineering, Georgia Institute of Technology

4. Prof. Guanghui Lan, School of Industrial and Systems Engineering, Georgia Institute of Technology


Abstract

In this thesis, we focus on Frank-Wolfe (a.k.a. Conditional Gradient) algorithms, a family of iterative methods for constrained convex optimization that work under the assumption that projections onto the feasible region are prohibitively expensive, while linear optimization problems over the feasible region can be solved efficiently. We present several algorithms that improve, either locally or globally, upon existing convergence guarantees. In Chapters 2–4 we focus on the case where the objective function is smooth and strongly convex and the feasible region is a polytope, and in Chapter 5 on the case where the objective is generalized self-concordant and the feasible region is a compact convex set.
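For readers unfamiliar with the setting, the sketch below illustrates the vanilla Frank-Wolfe iteration that the thesis builds on. This is a minimal Python illustration, not code from the thesis: the helper names (grad, lmo, simplex_lmo) are placeholders, and the 2/(t+2) step size and the probability-simplex example are standard textbook choices.

import numpy as np

def frank_wolfe(grad, lmo, x0, max_iters=1000, tol=1e-8):
    """Vanilla Frank-Wolfe (Conditional Gradient) method.

    grad: callable returning the gradient of the objective at x.
    lmo:  linear minimization oracle; returns argmin_{v in C} <g, v>.
    x0:   feasible starting point in C.
    """
    x = x0.astype(float)
    for t in range(max_iters):
        g = grad(x)
        v = lmo(g)                 # solve the linear subproblem over C
        gap = g @ (x - v)          # Frank-Wolfe gap, upper-bounds f(x) - f(x*)
        if gap <= tol:             # gap certifies near-optimality
            break
        gamma = 2.0 / (t + 2)      # classic open-loop step size
        x = x + gamma * (v - x)    # convex combination, so x stays feasible
    return x

# Example (hypothetical): minimize ||x - b||^2 over the probability simplex,
# whose LMO simply returns the vertex e_i with the most negative gradient entry.
b = np.array([0.3, 1.2, -0.5])
grad = lambda x: 2.0 * (x - b)

def simplex_lmo(g):
    v = np.zeros_like(g)
    v[np.argmin(g)] = 1.0
    return v

x_star = frank_wolfe(grad, simplex_lmo, np.ones(3) / 3)

Because each update moves toward a vertex returned by the linear minimization oracle, every iterate is a convex combination of feasible points, so feasibility is maintained without ever computing a projection; this is exactly the regime the abstract describes.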

Additional Information

In Campus Calendar
No
Groups

ML@GT

Invited Audience
Faculty/Staff, Public, Undergraduate students
Categories
No categories were selected.
Keywords
No keywords were submitted.
Status
  • Created By: Joshua Preston
  • Workflow Status: Published
  • Created On: Nov 29, 2021 - 9:03am
  • Last Updated: Nov 29, 2021 - 9:19am