Title: Overcoming Process Variations and Noise in Analog Neural Networks
Committee:
Dr. Anderson, Advisor
Dr. Lanterman, Chair
Dr. Raychowdhury
Abstract: The objective of the proposed research is to develop training algorithms and network architectures that limit the impact of device variations on the performance of neural network classifiers implemented in analog hardware. Traditional machine learning algorithms and neural networks run on powerful digital computational architectures such as GPUs, TPUs, and FPGAs, achieving high performance and completing previously intractable tasks. Unfortunately, the power required to train these networks and to generate predictions with them is too high for energy-constrained systems such as implants and edge devices. Implementing neural networks in analog hardware is one possible solution to this challenge, but analog devices suffer from process, voltage, and temperature (PVT) variations, among other non-idealities, that limit their precision. This work hypothesizes that some neural network architectures suppress noise better than others and seeks to discover how to prepare a neural network model bound for hardware implementation so that it is resilient to these non-idealities.
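The abstract does not specify a training method, but one standard approach to this kind of resilience (a sketch of the general idea, not the proposal's actual algorithm) is noise-aware training: inject simulated device variation into the weights on every forward pass, so the learned solution does not depend on any single precise weight value. The network size, dataset, and 10% multiplicative Gaussian noise model below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class dataset: points inside vs. outside a circle of radius 0.7.
X = rng.uniform(-1, 1, size=(400, 2))
y = (np.linalg.norm(X, axis=1) > 0.7).astype(float).reshape(-1, 1)

# Tiny 2-16-1 network (illustrative size).
W1 = rng.normal(0, 0.5, size=(2, 16))
b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, size=(16, 1))
b2 = np.zeros(1)

NOISE = 0.10  # assumed std of multiplicative weight noise (device variation)

def forward(X, W1, b1, W2, b2, noise=0.0):
    # Multiplicative Gaussian noise crudely models per-device
    # conductance variation in an analog weight array.
    n1 = W1 * (1 + noise * rng.normal(size=W1.shape))
    n2 = W2 * (1 + noise * rng.normal(size=W2.shape))
    h = np.tanh(X @ n1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ n2 + b2)))  # sigmoid output
    return h, p, n1, n2

lr = 0.5
for step in range(2000):
    h, p, n1, n2 = forward(X, W1, b1, W2, b2, noise=NOISE)
    # Binary cross-entropy gradient, backpropagated through the
    # *noisy* weights so training sees the same perturbations
    # the deployed hardware would.
    d_out = (p - y) / len(X)
    gW2 = h.T @ d_out
    gb2 = d_out.sum(0)
    d_h = (d_out @ n2.T) * (1 - h ** 2)
    gW1 = X.T @ d_h
    gb1 = d_h.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Evaluate under fresh noise draws, mimicking different fabricated devices.
accs = []
for _ in range(20):
    _, p, _, _ = forward(X, W1, b1, W2, b2, noise=NOISE)
    accs.append(float(((p > 0.5) == y).mean()))
print(f"mean accuracy under weight noise: {np.mean(accs):.3f}")
```

Evaluating with independent noise samples, rather than the noise seen during training, is the key step: it approximates deploying the same trained weights onto many physical chips, each with its own variation.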