TITLE: Nonparametric Graph Estimation
SPEAKER: Han Liu, Johns Hopkins University
ABSTRACT:
Graphical models have proven to be an extremely useful abstraction in statistical machine learning. The starting point is the graph of a distribution. While the graph is usually assumed to be given, we are interested in estimating the graph from data. In this talk we present several new nonparametric and semiparametric methods for graph estimation. One approach is named "the nonparanormal," which uses copula methods to transform the variables by monotonic functions, relaxing the fully parametric assumptions made by the Gaussian graphical model. Another approach is to restrict the family of allowed graphs to forest graphs, enabling the use of fully nonparametric density estimation. Finally, we introduce a new framework named "graph-valued regression," in which the graph is estimated conditioned on a recursive partition of a set of covariates. The resulting methods are easy to understand and compute, theoretically well supported, and effective for modeling and exploring high-dimensional data. Joint work with Fang Han, John Lafferty, Larry Wasserman, and Tuo Zhao.
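
For intuition, the following is a minimal Python sketch of the nonparanormal idea described in the abstract: each variable is pushed through a rank-based monotone Gaussianizing transform, and a standard sparse Gaussian graphical model estimator is then applied to the transformed data. The truncation constant and the use of scikit-learn's GraphicalLasso are illustrative assumptions, not the speaker's exact estimator.

# Sketch: nonparanormal-style graph estimation (illustrative, not the talk's exact method).
import numpy as np
from scipy.stats import norm, rankdata
from sklearn.covariance import GraphicalLasso

def nonparanormal_transform(X, delta=None):
    """Rank-based Gaussianization of each column of X (n samples x d variables)."""
    n, d = X.shape
    if delta is None:
        # A commonly used truncation level; treated here as an assumption.
        delta = 1.0 / (4.0 * n ** 0.25 * np.sqrt(np.pi * np.log(n)))
    Z = np.empty_like(X, dtype=float)
    for j in range(d):
        # Empirical CDF via ranks, truncated away from 0 and 1, then the Gaussian quantile.
        ecdf = rankdata(X[:, j]) / (n + 1.0)
        Z[:, j] = norm.ppf(np.clip(ecdf, delta, 1.0 - delta))
    return Z

# Usage: transform the data, then read the graph off the sparse precision matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5)) ** 3          # non-Gaussian marginals
Z = nonparanormal_transform(X)
model = GraphicalLasso(alpha=0.1).fit(Z)
graph = np.abs(model.precision_) > 1e-6         # edges correspond to nonzero entries
print(graph.astype(int))

Because the transform is monotone in each coordinate, the conditional independence structure of the original variables is preserved, which is why the graph can be read off the transformed data.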
BIO:
Han Liu graduated from the joint Ph.D. program in Machine Learning and Statistics at Carnegie Mellon University in 2011. Since then he has been an assistant professor of Biostatistics and Computer Science at Johns Hopkins University. His research lies at the intersection of modern statistics and computer science. In particular, he is interested in large-scale nonparametric methods, which conduct inference directly in infinite-dimensional spaces and are more flexible in capturing the subtleties of modern applications. He is the PI of an NSF grant and has won several academic awards, including the J. Tinsley Oden faculty fellowship from UT Austin, the best overall paper runner-up award at the 26th International Conference on Machine Learning, and a Ph.D. fellowship from Google.