You are kindly invited to attend my Ph.D. proposal presentation. Please see the details below.
Date: May 11, 2021
Time: 10:00 AM ET
Location: https://bluejeans.com/870141711
Alejandro Carderera
Machine Learning Ph.D. Student
School of Industrial and Systems Engineering
Georgia Institute of Technology
https://alejandro-carderera.github.io/
Committee:
Sebastian Pokutta (Advisor)
Guanghui (George) Lan
Swati Gupta
Abstract:
Conditional Gradient (CG) algorithms are an important class of constrained optimization algorithms that eschew the need for projections, relying instead on linear programming oracles to ensure feasibility. In this proposal, we present two algorithms from this family, both of which achieve improved local convergence rates in the vicinity of the optimum when minimizing smooth and strongly convex functions. The first, dubbed Locally Accelerated Conditional Gradients, uses first-order information about the function and attains a local linear convergence rate over polytopes that improves on that of existing CG variants. The second, the Second-Order Conditional Gradient Sliding algorithm (inspired by the Conditional Gradient Sliding algorithm), uses second-order information to obtain local quadratic convergence over polytopes under a set of assumptions.
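For readers unfamiliar with this family, the sketch below illustrates the classical projection-free CG (Frank-Wolfe) step that the proposed algorithms build on; it is a minimal illustration, not the proposed methods. The probability-simplex oracle, the 2/(t+2) step size, and the quadratic example are assumptions chosen for the sketch.

import numpy as np

def lmo_simplex(grad):
    # Linear minimization oracle over the probability simplex:
    # argmin over vertices v of <grad, v> is the unit basis vector
    # at the coordinate with the smallest gradient entry.
    v = np.zeros_like(grad)
    v[np.argmin(grad)] = 1.0
    return v

def conditional_gradient(grad_f, x0, steps=100):
    # Classical Frank-Wolfe with the standard 2/(t+2) step size.
    # Feasibility is maintained by taking convex combinations of
    # feasible points, so no projection is ever needed.
    x = x0.copy()
    for t in range(steps):
        v = lmo_simplex(grad_f(x))          # linear programming oracle call
        gamma = 2.0 / (t + 2.0)             # step size, agnostic to f
        x = (1.0 - gamma) * x + gamma * v   # convex combination stays feasible
    return x

# Illustrative use: minimize the smooth, strongly convex
# f(x) = ||x - b||^2 / 2 over the simplex (b is made-up data).
b = np.array([0.1, 0.7, 0.2])
x_star = conditional_gradient(lambda x: x - b, np.ones(3) / 3.0)

The proposed algorithms replace this plain step with locally accelerated first-order updates and with second-order (Hessian-based) subproblems, respectively, to sharpen convergence near the optimum.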