Ph.D. Dissertation Defense - Mohit Prabhushankar


Event Details
  • Date/Time:
    • Friday, December 3, 2021
      3:00 pm - 5:00 pm
  • Location: https://gatech.bluejeans.com/5145525300
  • Fee(s): N/A
Contact
No contact information submitted.
Summaries

Summary Sentence: Contrastive Reasoning in Neural Networks

Full Summary: No summary paragraph submitted.

Title: Contrastive Reasoning in Neural Networks

Committee:

Dr. Ghassan AlRegib, ECE, Chair, Advisor

Dr. Mark Davenport, ECE

Dr. Mark Riedl, CoC

Dr. Larry Heck, ECE

Dr. Eva Dyer, BME

Abstract: The objective of this dissertation is to rethink the inductive nature of reasoning in neural networks by providing contextual explanations for a network's decisions and addressing the network's robustness capabilities. Neural networks represent data as projections on trained weights in a high-dimensional manifold. The trained weights act as a knowledge base consisting of causal class dependencies. Inference built on features that identify dependencies within this manifold is termed inductive feed-forward inference. This is a classical cause-to-effect inference model that is widely used because of its simple mathematical functionality and ease of operation. Nevertheless, feed-forward models do not generalize well to untrained situations. To alleviate this generalization challenge, we use an effect-to-cause inference model that falls under the abductive reasoning framework. Here, the features represent the change from existing weight dependencies given a certain effect. In this dissertation, we term this change contrast and the ensuing inference mechanism contrastive inference. The dissertation tackles contrastive reasoning and inference in three phases: 1) we formalize the structure of contrastive reasons as answers to questions of the form 'why P, rather than Q?' and provide a simple mechanism to extract contrastive reasons from any pre-trained neural network, 2) we analyze our contrastive reasons by creating a taxonomy of evaluation, and 3) we use contrastive reasons to make robust and generalizable decisions. We provide representational and explanatory insights using our contrastive reasoning scheme for diverse visual applications, including: 1) visual cognition-based human attention, 2) probabilistic modeling of causal and context features, 3) the active learning paradigm of machine learning, and 4) meta-representations within neural networks.
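
The "simple mechanism to extract contrastive reasons from any pre-trained neural network" mentioned in the abstract is not spelled out in this announcement. As a rough illustration of the idea only, the Python sketch below builds a Grad-CAM-style heatmap in which the backpropagated quantity is the loss of a contrast class Q rather than the score of the predicted class P; the torchvision ResNet-18, the layer4 target layer, and the contrastive_map helper are all illustrative assumptions, not the dissertation's actual procedure.

# Illustrative sketch only: a Grad-CAM-style contrastive heatmap answering
# "why P, rather than Q?" for a pre-trained classifier. The model, target
# layer, and helper names below are assumptions for illustration.
import torch
import torch.nn.functional as F
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()
target_layer = model.layer4  # last convolutional block (assumed choice)

activations, gradients = {}, {}

def _cache(module, inputs, output):
    # Store the feature maps and hook their gradient for the backward pass.
    activations["feat"] = output
    output.register_hook(lambda grad: gradients.update(grad=grad))

target_layer.register_forward_hook(_cache)

def contrastive_map(image, contrast_class):
    """Heatmap for 'why the predicted class P, rather than contrast_class Q?'."""
    logits = model(image)                     # forward pass caches activations
    predicted = logits.argmax(dim=1)          # class P actually predicted
    # Backpropagate the loss of labeling the input as Q; this gradient plays
    # the role of the "contrast" between the prediction and the alternative.
    loss = F.cross_entropy(logits, torch.tensor([contrast_class]))
    model.zero_grad()
    loss.backward()
    # Weight each feature map by its pooled contrastive gradient, then ReLU.
    weights = gradients["grad"].mean(dim=(2, 3), keepdim=True)
    cam = F.relu((weights * activations["feat"]).sum(dim=1))
    return predicted.item(), cam / (cam.max() + 1e-8)

# Hypothetical usage on a preprocessed 1x3x224x224 image tensor:
# pred, heatmap = contrastive_map(image, contrast_class=207)

Backpropagating the loss of Q, instead of the score of P, is what makes the resulting map contrastive: it highlights the evidence that separates the prediction from one specific alternative rather than the evidence for the prediction in general.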

Additional Information

In Campus Calendar
No
Groups

ECE Ph.D. Dissertation Defenses

Invited Audience
Public
Categories
Other/Miscellaneous
Keywords
Ph.D. Defense, graduate students
Status
  • Created By: Daniela Staiculescu
  • Workflow Status: Published
  • Created On: Nov 17, 2021 - 4:10pm
  • Last Updated: Nov 17, 2021 - 4:14pm