Dr. Xiaoming Huo
Associate Professor at Georgia Institute of Technology
School of Industrial and Systems Engineering
For more information, please contact Dr. George Biros at gbiros@cc.gatech.edu
"Nonlinear Models Motivated by Manifold-Based Learning"
Abstract:
In statistical modeling, a well-adopted framework is the following: consider y = f(X), where the response y is univariate, the predictor X can be multivariate, and the function f is what we try to identify. Such a formulation covers nearly all statistical models we encounter: e.g., linear (regression) models and parametric models, where f is linear or belongs to a specific parametric family; and nonlinear models such as kernel methods, splines, and local polynomials, where f has a particular form. We consider a variation of the above within the penalized estimation framework: finding the f that minimizes a functional of the form G(f) + R(f), where G(f) is a goodness-of-fit measure and R(f) measures the regularity of the functional solution f. It is known that under certain circumstances this problem has a closed-form solution: e.g., when G(f) is the residual sum of squares and R(f) is the integrated square of the second derivative of f, the minimizer is the finite-dimensional cubic smoothing spline. It is also known that this formulation does not generalize directly to high-dimensional X. On the other hand, data with high-dimensional X are frequently encountered in practice. We introduce an alternative to R(f), which leads to a closed-form solution and an easy numerical implementation. Experiments with both synthetic and real data demonstrate the superiority of our method. The resulting model is nonlinear, is not covered by any existing family of models, and requires very few assumptions on the underlying f. Potential theoretical results will be discussed. The newly introduced R(f) is motivated by the work on Hessian eigenmaps -- a manifold-based dimension reduction algorithm.
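To make the classical penalized framework concrete (this is a sketch of the textbook roughness-penalized fit that the abstract uses as its starting point, not the speaker's new method), here is a minimal discrete analogue in NumPy: G(f) is the residual sum of squares, R(f) penalizes squared second differences in place of the integrated squared second derivative, and the minimizer has the closed form f = (I + lam * D^T D)^{-1} y. The function names and the smoothing parameter value are illustrative choices.

```python
import numpy as np

def second_difference_matrix(n):
    """(n-2) x n matrix whose rows compute f[i-1] - 2*f[i] + f[i+1],
    a discrete analogue of the second derivative."""
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    return D

def penalized_smoother(y, lam):
    """Closed-form minimizer of ||y - f||^2 + lam * ||D f||^2,
    i.e. f_hat = (I + lam * D^T D)^{-1} y."""
    n = len(y)
    D = second_difference_matrix(n)
    A = np.eye(n) + lam * D.T @ D
    return np.linalg.solve(A, y)

# Usage: recover a smooth curve from noisy samples of a sine function.
rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 100)
y = np.sin(x) + 0.3 * rng.standard_normal(100)
f_hat = penalized_smoother(y, lam=10.0)

# Compare errors against the true curve; the penalized fit is smoother.
print("noisy MSE:   ", np.mean((y - np.sin(x)) ** 2))
print("smoothed MSE:", np.mean((f_hat - np.sin(x)) ** 2))
```

The point of the closed form is that larger lam trades fidelity G(f) for regularity R(f); the abstract's contribution is a different choice of R(f) that keeps this closed-form, easily computed structure while extending to high-dimensional X.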
Bio:
Xiaoming Huo received the B.S. degree in mathematics from the University of Science and Technology of China in 1993, and the M.S. degree in electrical engineering and the Ph.D. degree in statistics from Stanford University, Stanford, CA, in 1997 and 1999, respectively. Since August 2006, he has been an Associate Professor with the School of Industrial and Systems Engineering, Georgia Institute of Technology, Atlanta. He represented China in the 30th International Mathematical Olympiad (IMO), held in Braunschweig, Germany, in 1989, where he received a gold medal.
His research interests include statistics and multiscale methodology. He has made numerous contributions on topics such as sparse representation, wavelets, and statistical problems in detectability. His papers have appeared in top journals, and several are highly cited. See his publication list for details.
Please join us for a reception preceding the seminar outside Klaus 1324, beginning at 1:30pm.
To receive future announcements, please sign up for the cse-seminar email list: https://mailman.cc.gatech.edu/mailman/listinfo/cse-seminar