Ph.D. Proposal Oral Exam - Chen Zhou

Event Details
  • Date/Time: Wednesday, February 22, 2023, 1:00 pm - 3:00 pm
  • Location: Room 5126, Centergy and https://gatech.zoom.us/j/94104392996?
  • Fee(s): N/A

Title: Learning Representations to Tackle Human Uncertainty

Committee:

Dr. AlRegib, Advisor
Dr. Heck, Chair
Dr. Dyer

Abstract: The objective of the proposed research is to tackle human uncertainty from two perspectives, intentional uncertainty and accidental uncertainty, by learning model representations. Humans exhibit uncertainty that can be either intentional or accidental. Intentional human uncertainty manifests as multi-modality in human behavior; this multi-modality arises when individuals manipulate the predictive outcomes of decision cues. Capturing intentional human uncertainty is essential in safety-critical applications, where we aim to model the diverse modes of human behavior. Specifically, we develop a generative model that produces a distribution of intended goals, via variational inference in the latent manifold, to better capture intentionally uncertain human behavior at test time. Besides intentional uncertainty, humans show accidental uncertainty when their perception is ambiguous. Perceptual ambiguity can lead different people to interpret the same data differently; this divergence manifests as disagreement between human annotators during data labeling and has undue effects on a model’s uncertainty and generalizability. We aim to mitigate these undue effects of accidental uncertainty. Specifically, we develop a label dilution scheme that exploits natural scene statistics to alleviate the resulting performance degradation while avoiding large-scale additional human annotation. In summary, this dissertation proposal studies and tackles human uncertainty from both the intentional and the accidental perspective.
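For readers unfamiliar with the modeling approach sketched above, the following is a minimal illustrative example, not the proposed method itself, of how variational inference over a latent variable can yield a distribution of intended goals and thereby capture multi-modal behavior. It assumes a conditional VAE in PyTorch; every module name, dimension, and hyperparameter below is a hypothetical placeholder.

# Hedged sketch: a conditional VAE that samples multiple plausible goals
# for an observed behavior history. The architecture, dimensions, and
# variable names are illustrative assumptions, not the proposed model.
import torch
import torch.nn as nn

class GoalCVAE(nn.Module):
    def __init__(self, obs_dim=32, goal_dim=2, latent_dim=8):
        super().__init__()
        # Encoder q(z | observation, goal): outputs mean and log-variance.
        self.encoder = nn.Sequential(
            nn.Linear(obs_dim + goal_dim, 64), nn.ReLU(),
            nn.Linear(64, 2 * latent_dim),
        )
        # Decoder p(goal | observation, z): maps a latent sample to a goal.
        self.decoder = nn.Sequential(
            nn.Linear(obs_dim + latent_dim, 64), nn.ReLU(),
            nn.Linear(64, goal_dim),
        )
        self.latent_dim = latent_dim

    def forward(self, obs, goal):
        mu, logvar = self.encoder(torch.cat([obs, goal], dim=-1)).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization
        recon = self.decoder(torch.cat([obs, z], dim=-1))
        # KL term regularizes the latent manifold toward the prior N(0, I).
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1)
        return recon, kl

    @torch.no_grad()
    def sample_goals(self, obs, n_samples=20):
        # At test time, draw latent codes from the prior to obtain a
        # distribution over intended goals (capturing multi-modality).
        z = torch.randn(n_samples, obs.shape[0], self.latent_dim)
        obs_rep = obs.unsqueeze(0).expand(n_samples, -1, -1)
        return self.decoder(torch.cat([obs_rep, z], dim=-1))

# Usage: train with reconstruction + KL loss; sample diverse goals at test time.
model = GoalCVAE()
obs = torch.randn(4, 32)          # batch of observed behavior features
goals = model.sample_goals(obs)   # shape: (20, 4, 2) candidate goals

Sampling several latent codes for the same observation yields multiple plausible goals, which is the sense in which such a model represents intentional uncertainty.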

Additional Information

In Campus Calendar
No
Groups

ECE Ph.D. Proposal Oral Exams

Invited Audience
Public
Categories
Other/Miscellaneous
Keywords
Ph.D. proposal, graduate students
Status
  • Created By: Daniela Staiculescu
  • Workflow Status: Published
  • Created On: Feb 21, 2023 - 9:05am
  • Last Updated: Feb 21, 2023 - 9:05am