ISyE Seminar - Jorge Nocedal


Event Details
  • Date/Time:
    • Wednesday October 17, 2018
      1:30 pm - 2:30 pm
  • Location: Groseclose 402, ISyE Building Complex
  • Fee(s): N/A

Zero-Order and Second-Order Optimization Methods with Applications in Machine Learning

Jorge Nocedal

Northwestern University

 

We begin by proposing an optimization algorithm that employs only function values and is able to solve noisy problems in thousands of variables. We then consider the problem of training deep neural networks and note that although most high-dimensional nonconvex optimization problems cannot be solved to optimality, deep neural networks have a benign geometry that allows stochastic optimization methods to find acceptable solutions. There are, nevertheless, many open questions concerning the optimization process, including trade-offs between parallelism and the predictive ability of solutions. We discuss classical and new optimization methods in the light of these observations.
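To illustrate the zero-order setting the abstract refers to — optimizing using only function values, with no access to gradients — here is a minimal, generic sketch of derivative-free gradient descent based on central finite differences. This is not the algorithm proposed in the talk; the function `zero_order_step`, the step size, and the test objective are all illustrative assumptions.

```python
import numpy as np

def zero_order_step(f, x, h=1e-3, lr=0.1):
    """One descent step using a gradient estimate built purely from
    function values (central finite differences) -- an illustrative
    zero-order method, not the method presented in the seminar."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        # Approximate the i-th partial derivative from two function values.
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return x - lr * g

# Minimize a simple quadratic f(x) = ||x||^2 without ever computing a derivative.
f = lambda x: float(np.dot(x, x))
x = np.array([1.0, -2.0])
for _ in range(100):
    x = zero_order_step(f, x)
```

In high dimensions or under noisy function evaluations, naive finite differencing like this becomes expensive and fragile, which is precisely the regime the talk's proposed method addresses.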

 

-------

Bio
Jorge Nocedal is the Walter P. Murphy Professor in the Department of Industrial Engineering and Management Sciences at Northwestern University. His research is in optimization, both deterministic and stochastic, with an emphasis on very large-scale problems. His current work is driven by applications in machine learning. He is a SIAM Fellow and was awarded the 2012 George B. Dantzig Prize and the 2017 John von Neumann Theory Prize for contributions to the theory and algorithms of nonlinear optimization.

Additional Information

In Campus Calendar
Yes
Groups

School of Industrial and Systems Engineering (ISYE)

Invited Audience
Faculty/Staff, Public, Graduate students, Undergraduate students
Categories
Seminar/Lecture/Colloquium
Keywords
optimization, nonconvex optimization, parallelism, predictive ability
Status
  • Created By: nhendricks6
  • Workflow Status: Published
  • Created On: Aug 24, 2018 - 12:25pm
  • Last Updated: Oct 4, 2018 - 8:33am