ARC Colloquium: Chandra Chekuri (Univ. of Illinois)


Event Details
  • Date/Time:
    • Monday April 11, 2022
      11:00 am - 12:00 pm
  • Location: Klaus 1116 East
  • Fee(s): N/A
Summaries

Summary Sentence: Densest Subgraph: Supermodularity, Iterative Peeling, and Flow - Klaus 1116 East at 11am


Algorithms & Randomness Center (ARC)

Chandra Chekuri (Univ. of Illinois)

Monday, April 11, 2022

Klaus 1116 East - 11:00 am

 

Title:  Densest Subgraph: Supermodularity, Iterative Peeling, and Flow

Abstract: The densest subgraph problem (DSG), in its simplest form, is the following: given an undirected graph G = (V, E), find a subset S of vertices that maximizes the ratio |E(S)|/|S|, where E(S) is the set of edges with both endpoints in S. DSG and several of its variants are well studied in theory and practice and have many applications in data mining and network analysis. We study fast algorithms and structural aspects of DSG through the lens of supermodularity. To this end, we consider the densest supermodular subset problem (DSS): given a non-negative supermodular function f over a ground set V, maximize f(S)/|S|.
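The objective |E(S)|/|S| is simple enough to state directly in code. A minimal sketch of the density function, assuming vertices are integers and edges are pairs (the representation is illustrative, not from the talk):

```python
def density(S, edges):
    """Density |E(S)|/|S| of a vertex subset S: the number of edges
    with both endpoints in S, divided by the number of vertices in S."""
    S = set(S)
    e_S = sum(1 for u, v in edges if u in S and v in S)
    return e_S / len(S)

# Example: on a triangle, the whole vertex set has density 3/3 = 1.0,
# while any two vertices have density 1/2 = 0.5.
triangle = [(0, 1), (1, 2), (0, 2)]
```

DSS replaces the edge count |E(S)| with an arbitrary non-negative supermodular function f(S); the graph objective above is the special case f(S) = |E(S)|.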

For DSG we describe a simple flow-based algorithm that outputs a (1−ε)-approximation in deterministic O(m·polylog(n)/ε) time, where m is the number of edges.

Greedy peeling algorithms have been very popular for DSG and several of its variants due to their efficiency, empirical performance, and worst-case approximation guarantees. We describe a simple peeling algorithm for DSS and analyze its approximation guarantee in a fashion that unifies several existing results. Boob et al. developed an iterative peeling algorithm for DSG that appears to work very well in practice, and they conjectured that it converges to optimality. We answer their conjecture affirmatively, and in fact prove that a natural generalization of their algorithm converges for any supermodular function f. The key to our proof is an LP formulation derived via the Lovász extension of a supermodular function, inspired by the LP relaxation of Charikar, which has played an important role in several developments.
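For readers unfamiliar with peeling: the classic greedy peeling algorithm for DSG (Charikar's well-known 1/2-approximation, not the iterative variant of Boob et al. discussed in the talk) repeatedly removes a minimum-degree vertex and keeps the densest intermediate subgraph. A self-contained sketch, with an O(n) minimum scan per step for simplicity:

```python
def peel_densest(n, edges):
    """Greedy peeling for densest subgraph: repeatedly delete a vertex of
    minimum degree, tracking the best density |E(S)|/|S| seen along the way.
    Returns (best_density, size_of_best_subset)."""
    adj = [set() for _ in range(n)]
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    alive = set(range(n))
    m_alive = len(edges)
    best, best_size = 0.0, 0
    while alive:
        d = m_alive / len(alive)
        if d > best:
            best, best_size = d, len(alive)
        # Pick a minimum-degree vertex among the remaining ones.
        v = min(alive, key=lambda x: len(adj[x]))
        for u in adj[v]:
            adj[u].discard(v)
        m_alive -= len(adj[v])  # all of v's remaining edges disappear
        adj[v].clear()
        alive.remove(v)
    return best, best_size
```

For example, on K4 plus a pendant vertex, peeling removes the pendant first and finds the K4 with density 6/4 = 1.5, beating the full graph's 7/5 = 1.4.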

This is joint work with Kent Quanrud and Manuel Torres.

----------------------------------

Speaker's Webpage

Videos of recent talks are available at: https://smartech.gatech.edu/handle/1853/46836 and http://arc.gatech.edu/node/121

To subscribe to the seminar email list: arc-colloq@Klauscc.gatech.edu

Additional Information

In Campus Calendar
No
Groups

ARC

Invited Audience
Faculty/Staff, Postdoc, Graduate students, Undergraduate students
Categories
Seminar/Lecture/Colloquium
Keywords
No keywords were submitted.
Status
  • Created By: Francella Tonge
  • Workflow Status: Published
  • Created On: Jan 18, 2022 - 2:53pm
  • Last Updated: Mar 24, 2022 - 11:14am