ISyE Seminar - Mengdi Wang


Event Details
  • Date/Time:
    • Wednesday September 28, 2016 - Thursday September 29, 2016
      3:00 pm - 2:59 pm
  • Location: Advisory Boardroom Groseclose 402
  • Fee(s):
    N/A
Contact

Alex Shapiro/Andy Sun

Summaries

Summary Sentence: ISyE Seminar - Mengdi Wang

Full Summary: No summary paragraph submitted.

Title: Stochastic Nested Composition Optimization and Beyond

 

Abstract:

Classical stochastic optimization models typically involve an expected-value objective function. They do not, however, apply to minimizing a composition of two or more expected-value functions, i.e., the stochastic nested composition optimization problem.
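In symbols, the nested problem described above can be written as follows (the notation here is an editorial illustration, not taken from the talk):

```latex
\min_{x \in \mathcal{X}} \; F(x) \;=\; f\big(\mathbb{E}_{\xi}[\,g(x,\xi)\,]\big)
```

The difficulty is that the expectation sits inside the nonlinear outer function $f$, so a single sample of $\xi$ does not yield an unbiased estimate of $\nabla F(x)$, which is what classical stochastic gradient methods require.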

Stochastic composition optimization finds wide application in estimation, risk-averse optimization, dimension reduction, and reinforcement learning. We propose a class of stochastic compositional first-order methods. We prove that the algorithms converge almost surely to an optimal solution for convex problems (or to a stationary point for nonconvex problems), as long as such a solution exists.

The convergence analysis involves the interplay of two martingales running on different timescales. We obtain convergence-rate results under various assumptions and show that the algorithms achieve the optimal sample-error complexity in several important special cases. These results provide the best-known rate benchmarks for stochastic composition optimization. We demonstrate applications of the methods to statistical estimation and reinforcement learning. In addition, we introduce some recent developments in nonconvex statistical optimization.
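The two-timescale idea in the abstract can be sketched in code. The snippet below is an illustrative stochastic compositional gradient iteration, not the speaker's exact algorithm: an inner tracker `y` averages samples of the inner function on a fast timescale, while the iterate `x` moves on a slow timescale using a chain-rule step through the tracker. The step-size exponents and the toy problem are assumptions chosen for illustration.

```python
import random

def scgd(x0, grad_f, g_sample, grad_g_sample, iters=20000):
    """Two-timescale stochastic compositional gradient sketch for
    min_x f(E[g(x, xi)]): y tracks the inner expectation E[g(x, xi)]
    on a faster timescale than the iterate x."""
    x = x0
    y = g_sample(x)  # running estimate of E[g(x, xi)]
    for k in range(1, iters + 1):
        alpha = k ** -0.75  # slower (outer) step size; illustrative choice
        beta = k ** -0.5    # faster (inner) step size; illustrative choice
        y = (1 - beta) * y + beta * g_sample(x)       # update inner tracker
        x = x - alpha * grad_g_sample(x) * grad_f(y)  # chain-rule step via tracker
    return x

# Toy instance: g(x, xi) = x + xi with E[xi] = 0 and f(y) = y**2,
# so F(x) = f(E[g(x, xi)]) = x**2, minimized at x* = 0.
random.seed(0)
x_star = scgd(
    x0=5.0,
    grad_f=lambda y: 2.0 * y,
    g_sample=lambda x: x + random.gauss(0.0, 1.0),
    grad_g_sample=lambda x: 1.0,
)
print(round(x_star, 3))  # close to the true minimizer 0
```

Note the role of the tracker: because the expectation sits inside the nonlinear outer function, replacing `y` with a single fresh sample of `g` would bias the gradient estimate; averaging on a faster timescale is what drives that bias to zero.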

 

Bio:

Mengdi Wang’s research focuses on stochastic and data-driven optimization in machine learning, data analysis, and intelligent systems. She received her PhD from the Massachusetts Institute of Technology in 2013 and became an assistant professor at Princeton in 2014. In 2016 she received the Young Researcher Prize in Continuous Optimization of the Mathematical Optimization Society (awarded once every three years).

Additional Information

In Campus Calendar
No
Groups

School of Industrial and Systems Engineering (ISYE)

Invited Audience
Undergraduate students, Faculty/Staff, Graduate students
Categories
Seminar/Lecture/Colloquium
Keywords
No keywords were submitted.
Status
  • Created By: Anita Race
  • Workflow Status: Draft
  • Created On: Sep 1, 2016 - 6:15am
  • Last Updated: Apr 13, 2017 - 5:14pm