PhD Proposal by Bo Xie


Event Details
  • Date/Time:
    • Friday, December 2, 2016
      10:30 am - 12:30 pm
  • Location: KACB 1315
  • Fee(s): N/A
Summaries

Summary Sentence: Algorithms and Analysis for Non-convex Problems in Machine Learning


Title: Algorithms and Analysis for Non-convex Problems in Machine Learning

 

Bo Xie

School of Computational Science and Engineering

College of Computing

Georgia Institute of Technology

 

Date: Friday, December 02, 2016

Time: 10:30 AM to 12:30 PM EST

Location: KACB 1315

 

Committee

-------------

 

Dr. Le Song (Advisor), School of Computational Science and Engineering

Dr. Santosh Vempala, School of Computer Science

Dr. Hongyuan Zha, School of Computational Science and Engineering

 

 

Abstract

-------------

Non-convex problems are ubiquitous in machine learning, ranging from hidden variable models in graphical model estimation, to matrix factorization in recommender systems, to the hugely successful deep learning models of recent years. Compared with convex problems, which admit provably efficient algorithms, non-convex problems are in general much harder to solve, and most existing algorithms come with no theoretical guarantees on their performance.

 

In this thesis, we propose efficient algorithms and provide novel analysis for several important non-convex problems in machine learning. For hidden variable models, we leverage their special tensor structure and design a nonlinear spectral algorithm that sidesteps the intractability of non-convex optimization. To scale up such nonlinear spectral decompositions, we also propose a doubly stochastic gradient algorithm as well as a distributed algorithm that run efficiently on millions of data points. Moreover, we provide a formal analysis of the optimization landscape of one-hidden-layer neural networks, showing that every critical point is a global minimum when the neural weights are diverse enough.
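
To make the nonlinear spectral approach above concrete, here is a minimal sketch in Python/NumPy of the standard tensor power iteration on an orthogonally decomposable symmetric third-order tensor, the kind of moment tensor that arises in hidden variable models. This is an illustrative toy under stated assumptions; the tensor construction and all function and variable names are hypothetical, not the thesis's actual implementation.

    import numpy as np

    def tensor_apply(T, u):
        # Contract the symmetric 3rd-order tensor T with u twice: T(I, u, u).
        return np.einsum('ijk,j,k->i', T, u, u)

    def tensor_power_method(T, n_iter=100, tol=1e-8, seed=0):
        # Recover one (eigenvalue, eigenvector) pair of an orthogonally
        # decomposable symmetric 3rd-order tensor by power iteration.
        rng = np.random.default_rng(seed)
        u = rng.standard_normal(T.shape[0])
        u /= np.linalg.norm(u)
        for _ in range(n_iter):
            v = tensor_apply(T, u)
            v /= np.linalg.norm(v)
            if np.linalg.norm(v - u) < tol:
                u = v
                break
            u = v
        lam = np.einsum('ijk,i,j,k->', T, u, u, u)  # eigenvalue T(u, u, u)
        return lam, u

    # Toy example: T = sum_r w_r * a_r (x) a_r (x) a_r with orthonormal a_r.
    d, k = 5, 3
    A, _ = np.linalg.qr(np.random.default_rng(1).standard_normal((d, k)))
    w = np.array([3.0, 2.0, 1.0])
    T = np.einsum('r,ir,jr,kr->ijk', w, A, A, A)
    lam, u = tensor_power_method(T)  # converges to one (w_r, a_r) pair

For a tensor of this form, the iteration converges from a generic start to one component a_r with eigenvalue w_r; deflating (subtracting w_r times the rank-one term a_r (x) a_r (x) a_r) and repeating recovers the rest. Materializing T explicitly costs O(d^3) memory, which is why scalable variants that work from data points directly become necessary at large scale.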

Additional Information

In Campus Calendar
No
Groups

Graduate Studies

Invited Audience
Public
Categories
Other/Miscellaneous
Keywords
PhD proposal