CSE Seminar: Yoram Singer


Event Details
  • Date/Time:
    • Friday, September 3, 2010
      2:00 pm - 3:00 pm
  • Location: Klaus 1447
  • Fee(s): N/A
Contact

For more information, contact Guy Lebanon.

Summaries

Summary Sentence: "Composite Objective Optimization and Learning for Massive Datasets"

Full Summary: No summary paragraph submitted.

CSE Seminar

Speaker: Yoram Singer

Date: Friday, Sept. 3, 2010

Time: 2-3 p.m.

Location: KACB 1447

 

Title

Composite Objective Optimization and Learning for Massive Datasets

 

Abstract

Composite objective optimization is concerned with minimizing a two-term objective function consisting of an empirical loss function and a regularization function. Applications with massive datasets often employ a regularization term that is non-differentiable or structured, such as L1 or mixed-norm regularization. Such regularizers promote sparse solutions and special structure in the problem's parameters, a desirable goal for extremely high-dimensional datasets. In this talk, we discuss several recently developed methods for performing composite objective minimization in the online learning and stochastic optimization settings. We start by describing extensions of the well-known forward-backward splitting method to stochastic objectives. We then generalize this paradigm to the family of mirror-descent algorithms. Our approach builds on recent work connecting proximal minimization to online and stochastic optimization. The algorithmic part of the talk focuses on a new approach, called AdaGrad, in which the proximal function is adapted throughout the course of the algorithm in a data-dependent manner. This temporal adaptation metaphorically allows us to find needles in haystacks, as the algorithm is able to single out very predictive yet rarely observed features. We conclude with several experiments on large-scale datasets that demonstrate the merits of composite objective optimization and underscore the superior performance of various instantiations of AdaGrad.
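
To make the approach described above concrete, the following is a minimal sketch in Python/NumPy of a diagonal AdaGrad update combined with a closed-form soft-thresholding (proximal) step for an L1 composite term. This is an illustration under stated assumptions, not the speaker's implementation: the function adagrad_l1, all constants, and the toy least-squares problem are made up for the example.

import numpy as np

def adagrad_l1(grad_fn, x0, lam=0.01, eta=1.0, eps=1e-8, steps=1000):
    # Illustrative sketch: diagonal AdaGrad on the smooth loss term,
    # with the non-differentiable lam*||x||_1 term handled by a
    # closed-form proximal (soft-thresholding) step, so exact zeros
    # (sparsity) can appear in the iterates.
    x = x0.astype(float).copy()
    g_sq = np.zeros_like(x)              # running sum of squared gradients
    for _ in range(steps):
        g = grad_fn(x)                   # gradient of the loss term only
        g_sq += g * g
        h = np.sqrt(g_sq) + eps          # per-coordinate adaptive scaling
        z = x - eta * g / h              # adaptive gradient step
        # proximal step for the L1 term under the same diagonal metric
        x = np.sign(z) * np.maximum(np.abs(z) - eta * lam / h, 0.0)
    return x

# Toy usage: sparse least squares, min_x 0.5/n * ||A x - b||^2 + lam * ||x||_1
rng = np.random.default_rng(0)
A = rng.normal(size=(200, 50))
x_true = np.zeros(50)
x_true[:3] = [2.0, -1.5, 0.5]            # only 3 informative coordinates
b = A @ x_true + 0.01 * rng.normal(size=200)
x_hat = adagrad_l1(lambda x: A.T @ (A @ x - b) / len(b), np.zeros(50), lam=0.05)
print("recovered support:", np.flatnonzero(np.abs(x_hat) > 1e-3))

Because the L1 term enters through the proximal step rather than through subgradients, coordinates can be set to exactly zero, which is the sparsity-inducing behavior the abstract refers to; the per-coordinate scaling h is what lets rarely observed but predictive features retain larger effective step sizes.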

 

Bio

Yoram Singer is a senior research scientist at Google. From 1999 through 2007 he was an associate professor at the Hebrew University of Jerusalem, Israel. He was a member of the technical staff at AT&T Research from 1995 through 1999. He served as an associate editor of the Machine Learning journal and is now on the editorial boards of the Journal of Machine Learning Research and IEEE Signal Processing Magazine. He was co-chair of COLT'04 and NIPS'07. He is an AAAI Fellow and has won several awards for his research papers, most recently the ten-year retrospective award for the most influential paper of ICML 2000.

Additional Information

In Campus Calendar
No
Groups

College of Computing, School of Computational Science and Engineering

Invited Audience
No audiences were selected.
Categories
Seminar/Lecture/Colloquium
Keywords
No keywords were submitted.
Status
  • Created By: Mike Terrazas
  • Workflow Status: Published
  • Created On: Aug 24, 2010 - 9:10am
  • Last Updated: Oct 7, 2016 - 9:52pm