ARC Colloquium: Maxim Raginsky, University of Illinois at Urbana-Champaign


Event Details
  • Date/Time:
    • Monday, April 29, 2013, 3:00 pm
  • Location: Klaus 1116

Title: Logarithmic Sobolev inequalities and strong data processing theorems for discrete channels

Abstract:

The problem of quantifying the amount of information loss due to a random transformation (or a noisy channel) arises in a variety of contexts, such as machine learning, stochastic simulation, error-correcting codes, and computation in circuits with noisy gates, to name just a few. This talk will focus on discrete channels, where both the input and output sets are finite. The noisiness of a discrete channel can be measured by comparing suitable functionals of the input and output distributions. For instance, if we fix a reference input distribution, then the worst-case ratio of output relative entropy (Kullback-Leibler divergence) to input relative entropy over all other input distributions is bounded by one, by the data processing theorem. However, for a fixed reference input distribution, this quantity may be strictly smaller than one, giving so-called strong data processing inequalities (SDPIs). I will show that the problem of determining both the best constant in an SDPI and any input distributions that achieve it can be addressed using logarithmic Sobolev inequalities, which relate input relative entropy to certain measures of input-output correlation. I will also show that the SDPI for Kullback-Leibler divergence arises as a limiting case of a family of SDPIs for Rényi divergence, and discuss the relationship to hypercontractivity of Markov operators.
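
For readers who want the definitions behind the abstract, here is a minimal sketch in standard notation; the symbols (channel K, reference input π, alternative input μ) are mine and do not appear in the announcement. Viewing the discrete channel as a row-stochastic matrix K, with μK the output distribution induced by an input μ, the data processing theorem states

\[
D(\mu K \,\|\, \pi K) \;\le\; D(\mu \,\|\, \pi) \qquad \text{for every input distribution } \mu,
\]

and the best constant for a fixed reference input π,

\[
\eta(\pi, K) \;=\; \sup_{\mu \neq \pi} \frac{D(\mu K \,\|\, \pi K)}{D(\mu \,\|\, \pi)} \;\le\; 1,
\]

may be strictly smaller than one; an SDPI is precisely the statement that η(π, K) < 1. The Rényi divergence of order α mentioned at the end of the abstract,

\[
D_\alpha(\mu \,\|\, \pi) \;=\; \frac{1}{\alpha - 1} \log \sum_x \mu(x)^\alpha \, \pi(x)^{1-\alpha},
\]

recovers the Kullback-Leibler divergence D(μ‖π) in the limit α → 1.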

Bio: Maxim Raginsky received the B.S. and M.S. degrees in 2000 and the Ph.D. degree in 2002 from Northwestern University, Evanston, IL, all in electrical engineering. He has held research positions with Northwestern, the University of Illinois at Urbana-Champaign (where he was a Beckman Foundation Fellow from 2004 to 2007), and Duke University. In 2012, he returned to UIUC, where he is currently an Assistant Professor with the Department of Electrical and Computer Engineering and the Coordinated Science Laboratory. In 2013, Prof. Raginsky received a Faculty Early Career Development (CAREER) Award from the National Science Foundation. His research interests lie at the intersection of information theory, machine learning, and control.

Additional Information

In Campus Calendar
No
Groups

College of Computing, School of Computer Science, ARC

Status
  • Created By: Elizabeth Ndongi
  • Workflow Status: Published
  • Created On: Mar 18, 2013 - 7:54am
  • Last Updated: Oct 7, 2016 - 10:02pm