Ph.D. Proposal Oral Exam - Yeo Joon Youn


Event Details
  • Date/Time:
    • Thursday September 8, 2022
      9:00 am - 11:00 am
  • Location: Coda C1108 Brookhaven
Summaries

Summary Sentence: Theoretical Analysis of Communication Efficiency, Statistical Heterogeneity, and Privacy in Federated Optimization

Title: Theoretical Analysis of Communication Efficiency, Statistical Heterogeneity, and Privacy in Federated Optimization

Committee: 

Dr. Abernethy, Advisor

Dr. Muthukumar, Co-Advisor  

Dr. Romberg, Chair

Dr. Tumanov

Abstract: The objective of the proposed research is to design federated optimization algorithms that are communication-efficient, robust to statistical heterogeneity, and privacy-preserving. Federated optimization is a form of distributed training on very large datasets that leverages many devices, each holding local data. In Federated Learning (FL), clients collaboratively minimize a global objective function by communicating with a central server, without sharing any locally stored data. Communication efficiency is of primary interest in FL, where many edge computing devices and limited network bandwidth create a heavy communication burden. Furthermore, FL must handle the practical setting of heterogeneous local data distributions (statistical heterogeneity), which is more challenging than the traditional distributed optimization framework that assumes identically distributed data across the network. In addition, privacy-sensitive data on local devices necessitates privacy-preserving training. We therefore aim to build federated optimization algorithms with theoretical guarantees that address each of these issues. The preliminary research includes a Federated optimization algorithm with Acceleration and Quantization (FedAQ), which addresses the severe communication bottleneck in FL systems. Its improved theoretical guarantees are achieved by combining an accelerated form of federated averaging, which reduces the number of training and synchronization steps, with an efficient quantization scheme that significantly reduces communication complexity. Moreover, we propose a generalized form of the global objective function in FL that makes federated optimization algorithms robust in heterogeneous local data settings. Finally, we consider a new quantization scheme with an inherent differential privacy guarantee: it requires no additional noise and enables the federated optimization algorithm to improve the utility-privacy trade-off and communication efficiency at the same time.
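
To make the communication-efficient setup concrete, the following is a minimal sketch of one quantized federated averaging round. It assumes a QSGD-style unbiased stochastic quantizer, a toy least-squares objective, and hypothetical function names; none of these details are taken from the proposal itself. Clients run a few local gradient steps, compress their model deltas, and the server averages the compressed deltas.

import numpy as np

rng = np.random.default_rng(0)

def quantize(v, levels=4):
    # QSGD-style unbiased stochastic quantizer (illustrative, not FedAQ's
    # exact scheme): each coordinate is randomly rounded to one of
    # `levels` magnitude buckets so that E[quantize(v)] = v.
    norm = np.linalg.norm(v)
    if norm == 0.0:
        return v
    scaled = np.abs(v) / norm * levels        # coordinates mapped to [0, levels]
    lower = np.floor(scaled)
    q = lower + (rng.random(v.shape) < scaled - lower)  # stochastic rounding
    return np.sign(v) * q * norm / levels

def local_sgd(w, X, y, lr=0.1, steps=5):
    # A few local gradient steps on a toy least-squares objective.
    for _ in range(steps):
        w = w - lr * X.T @ (X @ w - y) / len(y)
    return w

def quantized_fedavg_round(w_global, clients):
    # One communication round: each client trains locally and sends a
    # quantized model delta; the server averages the compressed deltas.
    deltas = [quantize(local_sgd(w_global.copy(), X, y) - w_global)
              for X, y in clients]
    return w_global + np.mean(deltas, axis=0)

# Toy run with 4 clients whose local data are deliberately heterogeneous.
d = 10
w_true = rng.normal(size=d)
clients = []
for k in range(4):
    X = rng.normal(loc=k, scale=1.0, size=(50, d))  # shifted features per client
    clients.append((X, X @ w_true + 0.1 * rng.normal(size=50)))

w = np.zeros(d)
for _ in range(20):
    w = quantized_fedavg_round(w, clients)

The acceleration and the privacy-preserving quantizer described in the abstract would slot into this same round structure, replacing the plain local SGD step and the quantizer above.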

Additional Information

In Campus Calendar
No
Groups

ECE Ph.D. Proposal Oral Exams

Invited Audience
Public
Categories
Other/Miscellaneous
Keywords
Ph.D. proposal, graduate students
Status
  • Created By: Daniela Staiculescu
  • Workflow Status: Published
  • Created On: Sep 7, 2022 - 3:54pm
  • Last Updated: Sep 7, 2022 - 3:54pm