Date: Friday, Oct. 15, 2021
Time: 3:00pm
Bluejeans link: https://bluejeans.com/4658604304
Speaker: Jun Qi
Affiliation: Georgia Tech, School of Electrical and Computer Engineering
Seminar Title: Quantum Tensor Networks for Machine Learning and Signal Processing
Bio: Jun Qi is currently a Ph.D. candidate in the School of ECE at Georgia Tech. He has worked at the Deep Learning Technology Center at Microsoft Research (2017), Tencent AI Lab (2019), and Mitsubishi Electric Research Laboratory (2020).
Abstract: State-of-the-art machine learning (ML), particularly based on deep neural networks (DNNs), has enabled a wide spectrum of successful applications, ranging from the everyday deployment of speech recognition and computer vision to the frontier of scientific research in synthetic biology. Despite rapid theoretical and empirical progress in DNN-based regression and classification, DNN training algorithms are computationally expensive for many new scientific applications, which require computational resources beyond the limits of classical hardware. The imminent advent of quantum computing devices opens up new possibilities for exploiting quantum machine learning (QML) to improve the computational efficiency of ML algorithms in new domains. Rapid developments in quantum hardware have motivated advances in QML that can run on noisy intermediate-scale quantum (NISQ) devices. We therefore employ hybrid quantum-classical models that rely on the optimization of parametric quantum circuits, which are resilient to quantum noise errors and admit many practical QML implementations on NISQ devices. In particular, we propose a novel end-to-end training pipeline consisting of a quantum tensor network (QTN) for the generation of quantum embeddings and a parametric quantum circuit (PQC) for model training. We conduct experiments on image classification and speech recognition tasks to highlight the advantages of QTN-PQC.
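
For readers unfamiliar with hybrid quantum-classical models, the following is a minimal, illustrative sketch of a parametric quantum circuit trained with a classical optimizer, in the general spirit of the pipeline described above. The framework (PennyLane), circuit layout, feature values, and hyperparameters are assumptions chosen for illustration only; they are not the speaker's QTN-PQC implementation.

    # Minimal hybrid quantum-classical sketch (illustrative only, not the speaker's code).
    # Assumes the PennyLane library; circuit structure and parameters are hypothetical.
    import pennylane as qml
    from pennylane import numpy as np

    n_qubits = 4
    dev = qml.device("default.qubit", wires=n_qubits)

    @qml.qnode(dev)
    def pqc(features, weights):
        # Encode classical features as rotation angles (a simple quantum embedding).
        for i in range(n_qubits):
            qml.RY(features[i], wires=i)
        # Trainable variational layer: parameterized rotations plus entangling gates.
        for i in range(n_qubits):
            qml.RZ(weights[i], wires=i)
        for i in range(n_qubits - 1):
            qml.CNOT(wires=[i, i + 1])
        return qml.expval(qml.PauliZ(0))

    # Classical gradient-based optimization of the circuit parameters on a toy objective.
    features = np.array([0.1, 0.5, 0.9, 1.3], requires_grad=False)
    weights = np.random.uniform(0, np.pi, n_qubits, requires_grad=True)
    opt = qml.GradientDescentOptimizer(stepsize=0.1)

    def cost(w):
        return (pqc(features, w) - 1.0) ** 2  # push the measured expectation toward +1

    for step in range(50):
        weights = opt.step(cost, weights)

In the talk's setting, the hand-written feature encoding above would be replaced by the quantum embedding produced by the QTN, and the variational layer would be the PQC trained end to end.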