Title: Learning Representations for Missing Information in Neural Networks
Committee:
Dr. AlRegib, Advisor
Dr. Romberg, Chair
Dr. Davenport
Abstract:
The objective of the proposed research is to create novel representations for information that neural networks have not learned from their training data. Recent studies reveal the vulnerability of deep neural networks to unexpected inputs that differ from the training data. One factor that contributes to this vulnerability is the limitation of activation-based representations. An activation-based representation is obtained in the form of activations constructed by weights that encode knowledge learned from the training data. Because such a representation is inherently focused on learned information, unexpected inputs cannot be handled properly by relying on it alone. We propose to complement the activation-based representation with information obtained from the behavior of deep networks when they face unexpected inputs. In particular, we investigate gradient-based representations to characterize missing information that deep networks have not learned. The proposed research generalizes the geometric interpretation of gradients to practical scenarios of deep learning. In addition, we validate the effectiveness of the gradient-based representation in characterizing missing information for deep networks.
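
As a rough illustration of the idea, and not the proposal's exact formulation, the sketch below backpropagates a loss against a confounding uniform target through a small network and collects per-parameter gradient norms as a representation of what the weights do not explain about the input. The uniform target, the KL loss, and the toy network are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

def gradient_representation(model: nn.Module, x: torch.Tensor) -> torch.Tensor:
    """Backpropagate a confounding (uniform) target for input x and return
    the norm of the gradient of each parameter tensor. Parameters whose
    learned knowledge poorly explains x tend to receive larger gradients,
    so the resulting vector can act as a signature of missing information.
    (Illustrative sketch only; the choice of loss and pooling is assumed.)"""
    model.zero_grad()
    logits = model(x)
    # Uniform target: the loss measures how far the prediction is from
    # a "knows nothing" distribution, probing what the weights lack.
    uniform = torch.full_like(logits, 1.0 / logits.size(-1))
    loss = F.kl_div(F.log_softmax(logits, dim=-1), uniform, reduction="batchmean")
    loss.backward()
    return torch.stack([p.grad.norm() for p in model.parameters() if p.grad is not None])

if __name__ == "__main__":
    # Toy classifier standing in for a trained deep network (assumption).
    net = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU(), nn.Linear(128, 10))
    x = torch.randn(8, 1, 28, 28)
    rep = gradient_representation(net, x)
    print(rep)  # one gradient norm per parameter tensor

In contrast to the activations produced on the forward pass, this vector is computed from the backward pass, so it reflects how much the network would have to change to accommodate the input rather than what it already encodes.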