PhD Defense by Kaizhao Sun


Event Details
  • Date/Time:
    • Thursday April 14, 2022
      8:00 pm - 10:00 pm
  • Location: REMOTE
  • URL: REMOTE
  • Fee(s): N/A
Contact
No contact information submitted.
Summaries

Summary Sentence: Decomposition Algorithms based on the Nonconvex Augmented Lagrangian Framework

Full Summary: No summary paragraph submitted.

Thesis Title: Decomposition Algorithms based on the Nonconvex Augmented Lagrangian Framework

 

Advisor: 

Dr. Xu Andy Sun, Operations Research and Statistics, Sloan School of Management, MIT

 

Thesis Committee:

Dr. Santanu S. Dey, School of Industrial and Systems Engineering, Georgia Tech 

Dr. Renato D.C. Monteiro, School of Industrial and Systems Engineering, Georgia Tech 

Dr. Wotao Yin, Decision Intelligence Lab, DAMO Academy, Alibaba Group US

Dr. Enlu Zhou, School of Industrial and Systems Engineering, Georgia Tech 

 

Date and Time: Thursday, April 14, 2022, 8:00 PM (EDT)

 

Meeting Link: https://mit.zoom.us/j/94439139936

Meeting ID: 944 3913 9936 

 

Abstract:

 

In this thesis, we develop new decomposition algorithms for several important classes of constrained nonconvex problems. The proposed algorithms take advantage of decomposable structures embedded in the original problem through the augmented Lagrangian framework. Chapters 2-4 are concerned with such structures in constraints, while Chapter 5 investigates a special separable structure in the objective.  
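For readers unfamiliar with the framework, the augmented Lagrangian of a generic equality-constrained problem $\min_x f(x)$ subject to $c(x)=0$ (notation here is generic, chosen for illustration rather than taken from the thesis) augments the ordinary Lagrangian with a quadratic penalty:

```latex
\mathcal{L}_{\rho}(x,\lambda)
  \;=\; f(x) \;+\; \lambda^{\top} c(x)
  \;+\; \tfrac{\rho}{2}\,\lVert c(x)\rVert^{2}
```

Minimizing $\mathcal{L}_{\rho}$ in $x$ while updating the multiplier $\lambda$ is the basic ALM iteration on which the decomposition schemes in the later chapters build.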

 

Chapter 2 is motivated by the observation that, for a broad class of nonlinear constrained optimization problems defined over a network, distributed algorithms based on the alternating direction method of multipliers (ADMM) fail to converge due to certain formulation limitations. To overcome this difficulty, we propose a two-level framework in which the inner level uses a structured three-block ADMM to facilitate parallel computation and the outer level applies the classic augmented Lagrangian method (ALM) to ensure convergence. We establish global convergence with iteration complexity estimates, as well as local convergence results, for this new scheme and demonstrate its performance on various nonconvex applications.
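As a rough sketch of the two-level structure (generic notation, assumed here rather than reproduced from the thesis), the outer ALM loop wraps the inner ADMM solve as follows:

```latex
% Inner level: a structured three-block ADMM approximately
% minimizes the augmented Lagrangian in x for fixed \lambda^{k}:
x^{k+1} \;\approx\; \operatorname*{arg\,min}_{x}\;
  \mathcal{L}_{\rho_{k}}\bigl(x,\lambda^{k}\bigr)

% Outer level: classic ALM multiplier update, with the penalty
% parameter nondecreasing across outer iterations:
\lambda^{k+1} \;=\; \lambda^{k} + \rho_{k}\, c\bigl(x^{k+1}\bigr),
  \qquad \rho_{k+1} \;\ge\; \rho_{k}
```

The inner ADMM supplies the parallelism; the outer ALM update is what restores the convergence guarantees that plain distributed ADMM lacks for this problem class.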

 

In Chapter 3, we adapt the two-level framework to the AC optimal power flow (OPF) problem, a basic building block of electric power systems. Tailored to AC OPF, the two-level framework yields a new distributed algorithm with global convergence guarantees and iteration complexity estimates. We present extensive numerical experiments on some of the largest open-source power networks (up to 30,000 buses) to demonstrate the speed, robustness, and scalability of the proposed algorithm.

 

In Chapter 4, we investigate a new class of dual updates, termed scaled dual descent (SDD), within the augmented Lagrangian framework. We propose SDD-ADMM, which combines SDD with ADMM, to solve nonlinear equality-constrained multi-block problems. SDD-ADMM improves on prior state-of-the-art ALM and ADMM results in several nontrivial respects, including a new treatment of nonlinear constraints, fewer restrictions on problem data, and better iteration complexity bounds. Moreover, SDD-ADMM admits flexible Gauss-Seidel and Jacobi updates on blocks of variables, making the method particularly suitable for distributed computation.
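The two block-update orders mentioned above can be sketched as follows for blocks $x_1,\dots,x_N$ of an augmented Lagrangian $\mathcal{L}$ (generic notation; the thesis's precise SDD updates are not reproduced here):

```latex
% Gauss-Seidel: blocks updated sequentially, each subproblem
% using the freshest values of the preceding blocks
x_i^{k+1} \;=\; \operatorname*{arg\,min}_{x_i}\;
  \mathcal{L}\bigl(x_1^{k+1},\dots,x_{i-1}^{k+1},\,x_i,\,
                   x_{i+1}^{k},\dots,x_N^{k}\bigr)

% Jacobi: every block updated from the previous iterate, so the
% N subproblems are independent and can be solved in parallel
x_i^{k+1} \;=\; \operatorname*{arg\,min}_{x_i}\;
  \mathcal{L}\bigl(x_1^{k},\dots,x_{i-1}^{k},\,x_i,\,
                   x_{i+1}^{k},\dots,x_N^{k}\bigr)
```

The Jacobi order is the one that maps naturally onto distributed hardware, which is why supporting both orders matters for the distributed setting the chapter targets.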

 

In Chapter 5, we first study a smoothing technique for a generic difference-of-convex (DC) function, in which both convex components are replaced by their respective Moreau envelopes. The resulting smooth approximation, termed the difference of Moreau envelopes (DME), is shown to be Lipschitz differentiable, to capture stationary, local, and global solutions of the original DC function, and to enjoy growth conditions for broad classes of DC functions. The DME smoothing also provides a powerful tool for algorithmic development: first-order updates on the DME deliver a stationary solution of the original DC function, and the smoothing can be combined with ALM to solve constrained DC programs. A feature that distinguishes DME-based algorithms from existing DC algorithms is that we invoke proximal oracles on the negative component rather than its subgradient oracles; consequently, the updates on the two convex components can be implemented in parallel.
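Concretely, for a DC function $\phi = g - h$ with $g, h$ convex, the Moreau envelope with parameter $\mu > 0$ and the resulting DME read as follows (the envelope is a standard definition; the symbols are chosen here for illustration):

```latex
% Moreau envelope of a convex function f: an inf-convolution
% with a quadratic, which is smooth even when f is not
e_{\mu} f(x) \;=\; \min_{y}\;
  \Bigl\{\, f(y) + \tfrac{1}{2\mu}\,\lVert x - y \rVert^{2} \Bigr\}

% DME smoothing of \phi = g - h: replace both convex
% components by their envelopes
\phi_{\mu}(x) \;=\; e_{\mu} g(x) \;-\; e_{\mu} h(x)
```

The minimizer defining $e_{\mu} h(x)$ is exactly the proximal point $\operatorname{prox}_{\mu h}(x)$, which is why DME-based updates need only a proximal oracle on the negative component, and why the two envelope evaluations can run in parallel.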

 

Best regards, 

Kaizhao Sun

 

Additional Information

In Campus Calendar
No
Groups

Graduate Studies

Invited Audience
Faculty/Staff, Public, Undergraduate students
Categories
Other/Miscellaneous
Keywords
Phd Defense
Status
  • Created By: Tatianna Richardson
  • Workflow Status: Published
  • Created On: Mar 30, 2022 - 10:03am
  • Last Updated: Mar 30, 2022 - 10:03am