ML Ph.D. Proposal - Alejandro Carderera


Event Details
  • Date/Time:
    • Tuesday May 11, 2021
      10:00 am - 12:00 pm
  • Location: Virtual - https://bluejeans.com/870141711
  • Fee(s):
    N/A
Contact
No contact information submitted.
Summaries

Summary Sentence: Alejandro Carderera will present his machine learning Ph.D. proposal.

Full Summary: No summary paragraph submitted.

You are kindly invited to attend my Ph.D. proposal presentation. Please see the details below.

 

Title: Local Acceleration of the Frank-Wolfe Algorithm

Date: May 11, 2021

Time: 10:00 AM EDT

Location: https://bluejeans.com/870141711

 

Alejandro Carderera

Machine Learning Ph.D. Student

School of Industrial and Systems Engineering
Georgia Institute of Technology

https://alejandro-carderera.github.io/

 

Committee

Sebastian Pokutta (Advisor)

Guanghui (George) Lan

Swati Gupta

 

Abstract

Conditional Gradient (CG) algorithms are an important class of constrained optimization algorithms that eschew the need for projections, relying instead on linear programming oracles to ensure feasibility. In this proposal, we present two algorithms from this family, both of which show improved local convergence rates in the vicinity of the optimum when minimizing smooth and strongly convex functions. The first, dubbed Locally Accelerated Conditional Gradients, uses first-order information about the function and achieves a local linear convergence rate over polytopes that improves on that of existing CG variants. The second, the Second-Order Conditional Gradient Sliding algorithm (inspired by the Conditional Gradient Sliding algorithm), uses second-order information to obtain local quadratic convergence over polytopes under a set of assumptions.
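For readers unfamiliar with this algorithm family, the following is a minimal sketch of the classic (unaccelerated) Frank-Wolfe step that the proposal's variants build on, not the accelerated algorithms presented in the talk. The quadratic objective, the probability-simplex feasible region, and the function names are illustrative choices, not taken from the proposal.

```python
import numpy as np

def frank_wolfe(grad, lp_oracle, x0, steps=2000):
    """Vanilla Frank-Wolfe: projection-free constrained minimization.

    Each iterate is a convex combination of vertices returned by the
    LP oracle, so feasibility is maintained without any projection.
    """
    x = x0
    for t in range(steps):
        g = grad(x)
        v = lp_oracle(g)            # argmin over the feasible set of <g, v>
        gamma = 2.0 / (t + 2.0)     # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * v
    return x

# Illustrative problem: minimize ||x - b||^2 over the probability simplex.
b = np.array([0.1, 0.7, 0.2])
grad = lambda x: 2.0 * (x - b)
# LP oracle over the simplex: the vertex (standard basis vector)
# with the smallest gradient coordinate.
lp_oracle = lambda g: np.eye(len(g))[np.argmin(g)]
x = frank_wolfe(grad, lp_oracle, np.ones(3) / 3.0)
```

Since `b` lies in the simplex, the iterate approaches `b` at the well-known O(1/t) rate; the local acceleration studied in the proposal targets exactly this slow convergence near the optimum.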

Additional Information

In Campus Calendar
No
Groups

ML@GT

Invited Audience
Faculty/Staff, Postdoc, Public, Graduate students, Undergraduate students
Categories
Other/Miscellaneous
Keywords
No keywords were submitted.
Status
  • Created By: ablinder6
  • Workflow Status: Published
  • Created On: May 4, 2021 - 1:51pm
  • Last Updated: May 4, 2021 - 1:51pm