PhD Defense by Alexander William Clegg


Event Details
  • Date/Time:
    • Friday, May 10, 2019
      12:00 pm - 1:59 pm
  • Location: TSRB 223
Summaries

Summary Sentence: Modeling Human and Robot Behavior During Dressing Tasks


Title: Modeling Human and Robot Behavior During Dressing Tasks

 

Alexander William Clegg

School of Computer Science

College of Computing

Georgia Institute of Technology

 

Date: Friday, May 10th, 2019

Time: 12:00 PM (noon) EDT

Location: TSRB 223

 

Committee:

---------------

Dr. Karen Liu (Advisor, School of Computer Science, Georgia Tech)

Dr. Greg Turk (Advisor, School of Computer Science, Georgia Tech)

Dr. Charlie Kemp (School of Biomedical Engineering, Georgia Tech)

Dr. Jarek Rossignac (School of Computer Science, Georgia Tech)

Dr. Sonia Chernova (School of Computer Science, Georgia Tech)

 

 

Abstract:

------------

Human dressing assistance tasks present a multitude of privacy, safety, and independence concerns for the daily lives of a vast number of individuals across the world, providing strong motivation for the application of assistive robotics to these tasks. Additionally, the challenge of manually generating animations in which virtual characters interact with animated or simulated garments has resulted in the noticeable absence of such scenes in existing video games and animated films, motivating the application of automated motion synthesis techniques to this domain. However, cloth dynamics are complex and predicting the results of planned interactions with a garment can be challenging, which makes manual controller design difficult and makes the use of feedback control strategies an attractive alternative. The focus of this thesis is the development of a set of techniques for behavior modeling and motion synthesis in the space of human dressing.

 

We first consider motion synthesis primarily in the space of self-dressing. We propose a kinematic motion synthesis technique that automatically computes the motion of a virtual character while successfully executing a dressing task with a simulated garment. Next, we explore the impact of haptic (touch) observation modes on the self-dressing task and present a deep reinforcement learning (DRL) approach to navigating simulated garments. We then present a unified DRL approach to self-dressing motion synthesis in which neural network controllers are trained via Trust Region Policy Optimization (TRPO). Finally, we investigate the extension of haptic-aware feedback control and DRL to robot-assisted dressing. We present a universal policy method for modeling human dressing behavior under variations in capability, including muscle weakness, dyskinesia, and limited range of motion. Using this method and behavior model, we demonstrate the discovery of successful strategies for a robot to assist humans with a variety of capability limitations.
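
As a rough illustration of the training setup summarized above, the sketch below pairs a toy gym-style dressing environment (whose observation concatenates joint state, simulated haptic readings, and a capability vector) with a plain REINFORCE-style policy-gradient update standing in for TRPO. It is a minimal, assumption-laden sketch, not the thesis implementation: the ToyDressingEnv name, the capability parameterization, and the linear Gaussian policy are all illustrative stand-ins for the cloth simulation and neural network controllers described in the abstract.

    # Minimal sketch (assumptions, not the thesis implementation): a toy gym-style
    # environment whose observation concatenates joint state, simulated "haptic"
    # contact readings, and a capability vector, plus a REINFORCE-style
    # policy-gradient update standing in for TRPO.
    import numpy as np

    class ToyDressingEnv:
        """Hypothetical stand-in for a cloth-simulation dressing environment."""
        def __init__(self, capability, obs_dim=8, act_dim=2, horizon=50):
            self.capability = np.asarray(capability)   # e.g. per-joint strength in [0, 1]
            self.obs_dim, self.act_dim, self.horizon = obs_dim, act_dim, horizon

        def reset(self):
            self.t = 0
            self.state = np.zeros(self.obs_dim)
            return self._observe()

        def _observe(self):
            haptic = np.tanh(self.state[: self.act_dim])       # fake contact readings
            return np.concatenate([self.state, haptic, self.capability])

        def step(self, action):
            # Weak joints attenuate the commanded action (crude capability model).
            effective = action * self.capability[: self.act_dim].mean()
            self.state[: self.act_dim] += 0.1 * effective
            self.t += 1
            reward = -np.linalg.norm(self.state[: self.act_dim] - 1.0)  # reach a target "dressed" pose
            return self._observe(), reward, self.t >= self.horizon, {}

    def rollout(env, theta, rng):
        obs, done, traj = env.reset(), False, []
        while not done:
            mean = obs @ theta                      # linear Gaussian policy
            action = mean + rng.normal(scale=0.1, size=mean.shape)
            next_obs, r, done, _ = env.step(action)
            traj.append((obs, action, mean, r))
            obs = next_obs
        return traj

    def reinforce_update(theta, traj, lr=1e-3):
        returns = np.cumsum([r for *_, r in traj][::-1])[::-1]   # reward-to-go
        grad = np.zeros_like(theta)
        for (obs, action, mean, _), ret in zip(traj, returns):
            grad += np.outer(obs, (action - mean) / 0.1 ** 2) * ret   # log-prob gradient
        return theta + lr * grad / len(traj)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        env = ToyDressingEnv(capability=[0.6, 0.9])   # e.g. a weakened joint
        obs_dim_total = env.obs_dim + env.act_dim + len(env.capability)
        theta = np.zeros((obs_dim_total, env.act_dim))
        for it in range(200):
            theta = reinforce_update(theta, rollout(env, theta, rng))

Conditioning the policy on the capability vector is what makes it "universal" in the sense used above: a single controller is trained across sampled capability values rather than one controller per impairment.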

 

Additional Information

In Campus Calendar
No
Groups

Graduate Studies

Invited Audience
Faculty/Staff, Public, Graduate students, Undergraduate students
Categories
Other/Miscellaneous
Keywords
PhD Defense
Status
  • Created By: Tatianna Richardson
  • Workflow Status: Published
  • Created On: Apr 24, 2019 - 11:42am
  • Last Updated: Apr 24, 2019 - 11:42am