Date: Tuesday, March 14, 2023
Time: 11:00 a.m. - 12:00 p.m.
Location: TSRB 134
Speaker: Sevgi Zubeyde Gurbuz
Speaker's Title: Assistant Professor, University of Alabama
Seminar Title: Deciphering the Unseen: Radar-Enabled In-Home Health Monitoring
Abstract: As technology advances and an increasing number of devices enter our homes and workplaces, humans have become an integral component of cyber-physical systems (CPS). One of the grand challenges of cyber-physical human systems (CPHS) is how to design autonomous systems in which human-system collaboration is optimized through improved understanding of human behavior. A new frontier within this landscape is afforded by the advent of low-cost, low-power millimeter (mm)-wave RF transceivers, which enable the deployment of RF sensors almost anywhere as part of the Internet of Things (IoT), smart environments, personal devices, and even wearables. RF sensors not only provide sensing capability when other sensors may be ineffective due to environmental factors, but also provide unique spatio-kinematic measurements that are complementary to those of other sensing modalities. Moreover, in indoor environments where privacy is a driving consideration, RF sensors offer relatively non-intrusive perception capabilities. Consequently, there have been exciting recent advancements in the use of RF sensing for remote health monitoring in homes and assisted living facilities. Since the first research in radar-based human activity recognition over 15 years ago, when the technology was demonstrated only in controlled lab settings, radar has found its way into many new devices hitting the market. This includes the Google Soli sensor in cell phones for non-contact gesture recognition, as well as products under development by Amazon, Vayyar, and others for sleep monitoring, vital sign monitoring, and occupancy recognition. However, these applications only scratch the surface of the potential of radar-enabled CPHS for health monitoring. Future intelligent devices equipped with cognitive perception and learning will be able to decipher and respond to complex human behaviors much more effectively and robustly.
This talk introduces radar-based perception of human movements, especially physics-aware machine learning perspectives that enable improved performance with less data, which can help overcome current limitations and pave the way for future radar-enabled interactive environments.
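The "unique spatio-kinematic measurements" mentioned above typically refer to micro-Doppler signatures: frequency shifts induced in the radar return by the motion of limbs and other body parts. As a minimal, self-contained sketch (not taken from the talk; all parameters are illustrative assumptions), the following simulates the return from a single scatterer with a swinging-limb-like sinusoidal radial velocity and extracts its micro-Doppler signature with a short-time Fourier transform:

```python
import numpy as np
from scipy.signal import stft

# Illustrative parameters (assumed, not from the talk): a ~77 GHz radar
# observing one scatterer, e.g. a limb, oscillating at 2 Hz with a
# 0.5 m/s peak radial velocity.
fs = 1000.0          # slow-time (pulse repetition) sampling frequency, Hz
wavelength = 0.004   # ~77 GHz carrier -> 4 mm wavelength, m
v_max = 0.5          # peak radial velocity of the scatterer, m/s
f_motion = 2.0       # oscillation rate of the limb motion, Hz

t = np.arange(0, 2.0, 1.0 / fs)
# Radial range r(t): integral of v(t) = v_max * sin(2*pi*f_motion*t)
r = 1.0 - (v_max / (2 * np.pi * f_motion)) * np.cos(2 * np.pi * f_motion * t)
# Complex baseband return: two-way path gives a phase of -4*pi*r/wavelength,
# so the instantaneous Doppler frequency is -2*v(t)/wavelength.
signal = np.exp(-1j * 4 * np.pi * r / wavelength)

# Micro-Doppler signature: STFT over slow time (two-sided, since the
# baseband signal is complex and Doppler can be positive or negative).
freqs, times, Z = stft(signal, fs=fs, nperseg=64, return_onesided=False)

# Strongest Doppler component in each time bin; the largest should sit
# near the maximum Doppler shift 2*v_max/wavelength = 250 Hz.
peak_doppler = np.abs(freqs[np.argmax(np.abs(Z), axis=0)])
print(f"max observed Doppler: {peak_doppler.max():.1f} Hz")
```

The time-frequency image `|Z|` is the sinusoidal micro-Doppler trace that activity-recognition classifiers, including the physics-aware learning approaches the talk discusses, typically take as input.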
Biographical Sketch of the Speaker: