ARC Colloquium: Sham Kakade, Microsoft Research, New England

Event Details
  • Date/Time:
    • Monday October 29, 2012 - Tuesday October 30, 2012
      1:00 pm - 12:59 pm
  • Location: Klaus 1116
  • Fee(s):
    N/A
Contact

ndongi@cc.gatech.edu

Summaries

Summary Sentence: Tensor Decompositions for Learning Latent Variable Models

Full Summary: No summary paragraph submitted.

Title: Tensor Decompositions for Learning Latent Variable Models

Abstract:
In many applications, we face the challenge of modeling the interactions between multiple observations. A popular and successful approach in machine learning and AI is to hypothesize the existence of certain latent (or hidden) causes which help to explain the correlations in the observed data. The (unsupervised) learning problem is to accurately estimate a model with only samples of the observed data. For example, in document modeling, we may wish to characterize the correlational structure of the "bag of words" in documents. Here, a standard model is to posit that documents are about a few topics (the hidden variables) and that each active topic determines the occurrence of words in the document. The learning problem is, using only the observed words in the documents (and not the hidden topics), to estimate the topic probability vectors (i.e., discover the strength by which words tend to appear under different topics). In practice, a broad class of latent variable models is most often fit with either local search heuristics (such as the EM algorithm) or sampling-based approaches.
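The generative story above (a hidden topic per document determining its observed words) can be sketched in a few lines. All dimensions, names, and the single-topic-per-document simplification here are illustrative assumptions, not details from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the abstract).
n_topics, vocab_size, doc_len, n_docs = 3, 10, 50, 5

topic_probs = rng.dirichlet(np.ones(n_topics))                        # P(topic)
word_given_topic = rng.dirichlet(np.ones(vocab_size), size=n_topics)  # row t: P(word | topic t)

docs = []
for _ in range(n_docs):
    t = rng.choice(n_topics, p=topic_probs)        # hidden topic: never observed by the learner
    words = rng.choice(vocab_size, size=doc_len, p=word_given_topic[t])
    docs.append(np.bincount(words, minlength=vocab_size))  # observed bag-of-words counts
```

The learner sees only `docs`; the estimation problem is to recover the rows of `word_given_topic` (the topic probability vectors) without ever seeing `t`.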



This talk will discuss how generalizations of standard linear algebra tools (e.g. spectral methods) to tensors provide provable and efficient estimation methods for various latent variable models (under appropriate assumptions), including mixtures of Gaussians models, hidden Markov models, topic models, latent Dirichlet allocation, latent parse tree models (PCFGs and dependency parsers), and models for communities in social networks.  The talk will also briefly discuss how matrix and tensor decomposition methods can be used for the structure learning problem of determining both the existence of certain hidden causes and the underlying graphical structure between these hidden causes and the observed variables.
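A minimal sketch of the core primitive behind such spectral methods is the tensor power iteration applied to a symmetric tensor with an orthogonal decomposition. The tensor below is synthetic; in the applications named above it would instead be estimated from low-order moments of the data (after a whitening step assumed away here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic odeco tensor T = sum_i lambda_i * v_i (x) v_i (x) v_i,
# with orthonormal v_i (the idealized form moment tensors take after whitening).
d, k = 8, 3
lam = np.array([3.0, 2.0, 1.0])
V, _ = np.linalg.qr(rng.standard_normal((d, k)))   # orthonormal columns v_1..v_k
T = np.einsum('i,ai,bi,ci->abc', lam, V, V, V)

def tensor_power_iteration(T, n_iters=100, seed=1):
    """Recover one component: repeat theta <- T(I, theta, theta) / norm."""
    rng = np.random.default_rng(seed)
    theta = rng.standard_normal(T.shape[0])
    theta /= np.linalg.norm(theta)
    for _ in range(n_iters):
        theta = np.einsum('abc,b,c->a', T, theta, theta)  # contract two modes
        theta /= np.linalg.norm(theta)
    lam_hat = np.einsum('abc,a,b,c->', T, theta, theta, theta)
    return lam_hat, theta

lam_hat, theta = tensor_power_iteration(T)
# theta converges (quadratically) to one of the v_i, and lam_hat to its lambda_i;
# deflating T and repeating recovers the remaining components.
```

In the latent variable models listed in the abstract, the recovered components correspond (under the appropriate assumptions) to the model parameters, e.g. topic probability vectors or mixture means.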

Additional Information

In Campus Calendar
No
Groups

School of Computer Science, ARC

Invited Audience
No audiences were selected.
Categories
Seminar/Lecture/Colloquium
Keywords
No keywords were submitted.
Status
  • Created By: Elizabeth Ndongi
  • Workflow Status: Published
  • Created On: Aug 31, 2012 - 7:38am
  • Last Updated: Oct 7, 2016 - 9:59pm