ML@GT Virtual Seminar: Robert Nowak, University of Wisconsin-Madison


Event Details
  • Date/Time:
    • Wednesday, October 7, 2020
      12:15 pm - 1:15 pm
  • Location: Virtual - Bluejeans
  • Fee(s):
    N/A
Contact

Allie McFadden

Communications Officer

allie.mcfadden@cc.gatech.edu

Summaries

Summary Sentence: Nowak will give a virtual seminar as a part of the ML@GT Seminar Series.

Full Summary: No summary paragraph submitted.

Active Learning: From Linear Classifiers to Overparameterized Neural Networks


Robert Nowak will give a virtual seminar on October 7, 2020. Registration and talk details are below.

Register: https://primetime.bluejeans.com/a2m/register/vzjxvxjh

Abstract:

The field of Machine Learning (ML) has advanced considerably in recent years, but mostly in well-defined domains using huge amounts of human-labeled training data. Machines can recognize objects in images and translate text, but they must be trained with more images and text than a person could see in nearly a lifetime. The computational complexity of training has been offset by recent technological advances, but the cost of training data is measured in terms of the human effort required to label it. People are not getting faster or cheaper, so generating labeled training datasets has become a major bottleneck in ML pipelines.

Active ML aims to address this issue by designing learning algorithms that automatically and adaptively select the most informative examples for labeling, so that human time is not wasted labeling irrelevant, redundant, or trivial examples. This talk explores the development of active ML theory and methods over the past decade, including a new approach applicable to kernel methods and neural networks, which views the learning problem through the lens of representer theorems. This perspective highlights the effect that adding a given training example has on the representation. The new approach is shown to possess a variety of desirable mathematical properties that allow active learning algorithms to learn good classifiers from few labeled examples.
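
To illustrate the general idea of "adaptively selecting the most informative examples," below is a minimal Python sketch of pool-based active learning with uncertainty sampling. This is only a generic illustration, not the representer-theorem-based method the talk describes; the dataset, model, and query rule are all assumptions chosen for brevity.

# Minimal sketch (assumed setup): pool-based active learning with
# uncertainty sampling on synthetic data. Not the method from the talk.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Seed the labeled set with a few examples from each class.
seed_pos = rng.choice(np.where(y == 1)[0], size=5, replace=False)
seed_neg = rng.choice(np.where(y == 0)[0], size=5, replace=False)
labeled = [int(i) for i in np.concatenate([seed_pos, seed_neg])]
unlabeled = [i for i in range(len(X)) if i not in labeled]

model = LogisticRegression(max_iter=1000)
for _ in range(20):                       # 20 label queries
    model.fit(X[labeled], y[labeled])
    # Query the pool point whose predicted probability is closest to 0.5,
    # i.e., the example the current classifier is most uncertain about.
    probs = model.predict_proba(X[unlabeled])[:, 1]
    query = unlabeled[int(np.argmin(np.abs(probs - 0.5)))]
    labeled.append(query)                 # in practice, a human labels y[query]
    unlabeled.remove(query)

model.fit(X[labeled], y[labeled])
print(f"accuracy with {len(labeled)} labels: {model.score(X, y):.3f}")

The loop structure (fit, score the unlabeled pool, query one label, repeat) is common to most active learning methods; approaches differ mainly in how the informativeness of a candidate example is measured.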

About Robert:

Nowak holds the Nosbusch Professorship in Engineering at the University of Wisconsin-Madison, where his research focuses on signal processing, machine learning, optimization, and statistics.

Additional Information

In Campus Calendar
Yes
Groups

College of Computing, Computational Science and Engineering, GVU Center, Machine Learning, ML@GT, OMS, School of Computational Science and Engineering, School of Computer Science, School of Interactive Computing

Invited Audience
Faculty/Staff, Postdoc, Public, Graduate students, Undergraduate students
Categories
Seminar/Lecture/Colloquium
Keywords
No keywords were submitted.
Status
  • Created By: ablinder6
  • Workflow Status: Published
  • Created On: Aug 13, 2020 - 11:25am
  • Last Updated: Sep 29, 2020 - 9:36am