CANCELED - ISyE Seminar - Courtney Paquette

Event Details
  • Date/Time:
    • Tuesday February 22, 2022
      11:00 am - 12:00 pm
  • Location: Groseclose 402
  • Phone:
  • URL: ISyE Building
  • Email:
  • Fee(s):
    N/A
  • Extras:
Contact
No contact information submitted.
Summaries

Summary Sentence: Stochastic Algorithms in the Large: Exact Dynamics, Average-case Analysis, and Stepsize Criticality

Full Summary:

Title:

Stochastic Algorithms in the Large: Exact Dynamics, Average-case Analysis, and Stepsize Criticality

 

Abstract: 

In this talk, I will present a framework, inspired by random matrix theory, for analyzing the dynamics of stochastic algorithms (e.g., stochastic gradient descent (SGD) and momentum) when both the number of samples and dimensions are large. Using this new framework, we show that the dynamics of stochastic algorithms on a least squares problem with random data become deterministic in the large sample and dimensional limit. Furthermore, the limiting dynamics are governed by a Volterra integral equation. This model predicts that SGD undergoes a phase transition at an explicitly given critical stepsize that ultimately affects its convergence rate, which we also verify experimentally. Finally, when input data is isotropic, we provide explicit expressions for the dynamics and average-case convergence rates. These rates show significant improvement over the worst-case complexities.
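
As a rough numerical companion to the abstract (not code from the talk or its associated papers), the sketch below runs single-sample SGD on a randomly generated isotropic least-squares problem with large sample size and dimension and sweeps several stepsizes. The dimensions, stepsizes, and iteration counts are arbitrary illustrative choices; in this toy setup, small stepsizes converge smoothly while stepsizes past roughly 2 stall or blow up, loosely mirroring the phase transition at a critical stepsize described above.

# Minimal sketch (illustrative only; not from the talk): single-sample SGD
# on a random least-squares problem with large n and d, swept over stepsizes.
import numpy as np

rng = np.random.default_rng(0)
n, d = 2000, 1000                                # large sample size and dimension
A = rng.standard_normal((n, d)) / np.sqrt(d)     # isotropic random data, rows of norm ~1
x_star = rng.standard_normal(d) / np.sqrt(d)     # planted solution
b = A @ x_star                                   # noiseless targets

def sgd_loss_curve(stepsize, iters=20000):
    """Run single-sample SGD on 0.5*mean((Ax - b)^2), recording the full loss."""
    x = np.zeros(d)
    losses = []
    for t in range(iters):
        i = rng.integers(n)                      # pick one row uniformly at random
        grad = (A[i] @ x - b[i]) * A[i]          # stochastic gradient for that row
        x -= stepsize * grad
        if t % 500 == 0:
            losses.append(0.5 * np.mean((A @ x - b) ** 2))
    return losses

# Small stepsizes converge; past roughly stepsize 2 (in this toy setup) SGD
# stalls or blows up, loosely echoing the critical-stepsize phase transition.
for gamma in [0.1, 0.5, 1.0, 2.0, 2.5]:
    curve = sgd_loss_curve(gamma)
    print(f"stepsize {gamma:>4}: final loss {curve[-1]:.3e}")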

 

Bio: 

Courtney Paquette is an assistant professor at McGill University and a Canada CIFAR AI Chair. Paquette’s research broadly focuses on designing and analyzing algorithms for large-scale optimization problems, motivated by applications in data science. She received her PhD from the mathematics department at the University of Washington (2017), held postdoctoral positions at Lehigh University (2017-2018) and the University of Waterloo (NSF postdoctoral fellowship, 2018-2019), and was a research scientist at Google Research, Brain Montreal (2019-2020).

Additional Information

In Campus Calendar
Yes
Groups

School of Industrial and Systems Engineering (ISYE)

Invited Audience
Faculty/Staff, Postdoc, Public, Graduate students, Undergraduate students
Categories
Conference/Symposium
Keywords
No keywords were submitted.
Status
  • Created By: Julie Smith
  • Workflow Status: Published
  • Created On: Jan 4, 2022 - 8:05am
  • Last Updated: Feb 18, 2022 - 4:55pm