ISyE Seminar - Elad Hazan


Event Details
  • Date/Time:
    • Wednesday November 29, 2017 - Thursday November 30, 2017
      3:00 pm - 3:59 pm
  • Location: Groseclose 402
  • Phone:
  • URL:
  • Email:
  • Fee(s):
    N/A
  • Extras:
Contact
No contact information submitted.
Summaries

Summary Sentence: ISyE Seminar - Elad Hazan

Full Summary: No summary paragraph submitted.

TITLE:   Efficient Second-order Optimization for Machine Learning

ABSTRACT: 

Stochastic gradient-based methods are the state of the art in large-scale machine learning optimization because of their extremely low per-iteration computational cost. Second-order methods, which use the second derivative of the optimization objective, are known to enable faster convergence, but they have been far less explored because of the high cost of computing second-order information. We will present second-order stochastic methods for (convex and non-convex) optimization problems arising in machine learning that match the per-iteration cost of gradient descent, yet enjoy the faster convergence properties of second-order optimization.
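The key observation behind such methods is that a Hessian-vector product costs about the same as a gradient, so an approximate Newton direction can be estimated without ever forming the Hessian. Below is a minimal sketch of this idea in the style of the LiSSA estimator, not the speaker's exact algorithm: the inverse-Hessian-vector product is approximated by a truncated Neumann series built from stochastic Hessian-vector products. The problem (l2-regularized logistic regression), constants, and helper names are illustrative assumptions.

```python
# Sketch (illustrative, not the speaker's exact method): LiSSA-style stochastic
# Newton steps for l2-regularized logistic regression. Uses the Neumann series
#   A^{-1} g ~= sum_{i=0}^{depth} (I - A)^i g,   valid when 0 < A <= I,
# with A = scale * H, evaluated using only stochastic Hessian-vector products.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (X @ w_true + 0.1 * rng.normal(size=n) > 0).astype(float)
lam = 0.1  # l2 regularization keeps the Hessian positive definite

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(w):
    p = sigmoid(X @ w)
    ce = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    return ce + 0.5 * lam * w @ w

def grad(w, idx):
    # Stochastic gradient on mini-batch idx: O(|idx| * d) work.
    Xb, yb = X[idx], y[idx]
    return Xb.T @ (sigmoid(Xb @ w) - yb) / len(idx) + lam * w

def hvp(w, v, idx):
    # Stochastic Hessian-vector product: same O(|idx| * d) cost as a gradient,
    # since H = X^T diag(s(1-s)) X / |idx| + lam * I is never materialized.
    Xb = X[idx]
    s = sigmoid(Xb @ w)
    return Xb.T @ ((s * (1 - s)) * (Xb @ v)) / len(idx) + lam * v

def newton_direction(w, g, depth=20, batch=32, scale=0.1):
    # Recursion u <- g + (I - scale*H) u approximates (scale*H)^{-1} g;
    # rescaling by `scale` then gives an estimate of H^{-1} g.
    u = g.copy()
    for _ in range(depth):
        idx = rng.choice(n, batch, replace=False)
        u = g + u - scale * hvp(w, u, idx)
    return scale * u

w = np.zeros(d)
for t in range(300):
    idx = rng.choice(n, 32, replace=False)
    w -= 0.5 * newton_direction(w, grad(w, idx))

final_grad_norm = np.linalg.norm(grad(w, np.arange(n)))
print(loss(w), final_grad_norm)
```

Each iteration performs only a constant number of gradient-sized computations (one stochastic gradient plus `depth` Hessian-vector products on small batches), which is how these methods match the per-iteration cost of first-order optimization up to constants.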

Joint work with Naman Agarwal and Brian Bullins (ICML '16), and with Agarwal, Bullins, Allen-Zhu, and Ma (STOC '17).

BIO: 

Elad Hazan is a professor of computer science at Princeton University. He joined in 2015 from the Technion, where he had been an associate professor of operations research. His research focuses on the design and analysis of algorithms for fundamental problems in machine learning and optimization. His contributions include the co-development of the AdaGrad algorithm for training learning machines and the first sublinear-time algorithms for convex optimization. He is a two-time recipient of the IBM Goldberg Best Paper Award (in 2012, for contributions to sublinear-time algorithms for machine learning, and in 2008, for decision making under uncertainty), and has also received a European Research Council grant, a Marie Curie fellowship, two Google Research Awards, and the Bell Labs Prize. He serves on the steering committee of the Association for Computational Learning and was program chair for COLT 2015.

Additional Information

In Campus Calendar
No
Groups

School of Industrial and Systems Engineering (ISYE)

Invited Audience
Faculty/Staff, Public, Undergraduate students
Categories
No categories were selected.
Keywords
No keywords were submitted.
Status
  • Created By: nhendricks6
  • Workflow Status: Published
  • Created On: Aug 24, 2017 - 8:47am
  • Last Updated: Aug 24, 2017 - 2:06pm