PhD Defense by Daehyung Park


Event Details
  • Date/Time: Wednesday, January 17, 2018, 1:00 pm - 2:59 pm
  • Location: 3115 (McIntire Conference Room), BME Whitaker Building
Summaries

Summary Sentence: A Multimodal Execution Monitoring System for Assistive Robots


Title: A Multimodal Execution Monitoring System for Assistive Robots

Daehyung Park
Robotics Ph.D. Student
School of Interactive Computing
College of Computing
Georgia Institute of Technology

Date: January 8th, 2018 (Monday)
Time: 1:00pm to 3:00pm (EST)
Location: 3115 (McIntire Conference Room), BME Whitaker Building

Committee:
---------------

Dr. Charles C. Kemp (Advisor), Biomedical Engineering, Georgia Institute of Technology & Emory University
Dr. Byron Boots, School of Interactive Computing, Georgia Institute of Technology
Dr. Sonia Chernova, School of Interactive Computing, Georgia Institute of Technology
Dr. James M. Rehg, School of Interactive Computing, Georgia Institute of Technology
Dr. Randy Trumbower, Harvard Medical School, Harvard University

Abstract:
-----------

Assistive robots have the potential to serve as caregivers, providing assistance with activities of daily living to people with disabilities. Monitoring when something has gone wrong could help assistive robots operate more safely and effectively around people. However, the complexity of interacting with people and objects in human environments makes such monitoring challenging. By monitoring multimodal sensory signals, an execution monitor could perform a variety of roles, such as detecting success, determining when to switch behaviors, and otherwise exhibiting more common sense.

The purpose of this dissertation is to introduce a multimodal execution monitor that improves the safety and success of assistive manipulation services. To accomplish this goal, we make three main contributions. First, we introduce a data-driven anomaly detector, one component of the monitor, that detects anomalous task executions online from multimodal sensory signals. Second, we introduce a data-driven anomaly classifier that recognizes the type and cause of common anomalies with an artificial neural network after fusing multimodal features. Lastly, as the main testbed for the monitoring system, we introduce a robot-assisted feeding system for people with disabilities, using a general-purpose mobile manipulator (a PR2 robot).
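The abstract names two data-driven components: an online anomaly detector and an anomaly classifier over fused multimodal features. As a rough, hypothetical illustration of that general pattern only (not the method developed in the dissertation), the Python sketch below thresholds the log-likelihood of fused haptic/visual/auditory/kinematic feature vectors to flag anomalies and routes flagged frames to a small neural-network classifier. All class names, models, and parameters here are illustrative assumptions.

    # Hypothetical sketch of a multimodal execution monitor; NOT the
    # dissertation's implementation. Uses scikit-learn stand-ins: a
    # Gaussian mixture as the likelihood model of normal executions and
    # an MLP as the anomaly-type classifier.
    import numpy as np
    from sklearn.mixture import GaussianMixture
    from sklearn.neural_network import MLPClassifier

    class MultimodalExecutionMonitor:
        def __init__(self, n_components=5, threshold_percentile=1.0):
            self.detector = GaussianMixture(n_components=n_components)
            self.classifier = MLPClassifier(hidden_layer_sizes=(64, 32),
                                            max_iter=500)
            self.threshold_percentile = threshold_percentile
            self.threshold = None

        def fit(self, normal_frames, anomaly_frames, anomaly_labels):
            # Fit the likelihood model on fused feature vectors from
            # non-anomalous executions, then set the detection threshold
            # at a low percentile of the training log-likelihoods.
            self.detector.fit(normal_frames)
            scores = self.detector.score_samples(normal_frames)
            self.threshold = np.percentile(scores, self.threshold_percentile)
            # Fit the classifier on labeled frames from known anomaly types.
            self.classifier.fit(anomaly_frames, anomaly_labels)

        def step(self, fused_frame):
            # Called online once per timestep with one fused multimodal
            # feature vector (haptic, visual, auditory, kinematic).
            x = np.asarray(fused_frame).reshape(1, -1)
            if self.detector.score_samples(x)[0] < self.threshold:
                return "anomaly", self.classifier.predict(x)[0]
            return "nominal", None

In this sketch, an execution is declared anomalous when the current frame's likelihood under the model of normal executions drops below the learned threshold; only then is the classifier consulted for the anomaly's type and cause.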

We evaluate the monitoring system with haptic, visual, auditory, and kinematic sensing during household tasks and human-robot interactive tasks, including feeding assistance. We show that multimodality improves the performance of monitoring methods, enabling detection and classification of a broader range of anomalies. Overall, our research demonstrates that the multimodal execution monitoring system helps an assistive manipulation system provide safe and successful assistance for people with disabilities.

 

A Skype conference call is available for remote attendees: https://join.skype.com/PREf6T9BjIA5 (attendance via the call is limited to 25 participants).

Additional Information

In Campus Calendar
No
Groups

Graduate Studies

Invited Audience
Faculty/Staff, Public, Graduate students, Undergraduate students
Categories
Other/Miscellaneous
Keywords
PhD Defense
Status
  • Created By: Tatianna Richardson
  • Workflow Status: Published
  • Created On: Jan 2, 2018 - 12:51pm
  • Last Updated: Jan 19, 2018 - 10:34am