PhD Proposal by Zhaoyang Lv


Event Details
  • Date/Time:
    • Wednesday, December 12, 2018 - Thursday, December 13, 2018
      10:00 am - 11:59 am
  • Location: GVU cafe, TSRB

Title: Visual Dense Three-Dimensional Motion Estimation in the Wild

 

Date: Wednesday, December 12, 2018

Time: 10:00 am to 11:30 am (EST)

Location: GVU cafe, TSRB

 

Zhaoyang Lv

School of Interactive Computing, College of Computing

Georgia Institute of Technology

https://www.cc.gatech.edu/~zlv30/ 

 

Committee: 

Dr. James M. Rehg (Advisor, School of Interactive Computing, Georgia Institute of Technology)

Dr. Frank Dellaert (Co-Advisor, School of Interactive Computing, Georgia Institute of Technology)

Dr. James Hays (School of Interactive Computing, Georgia Institute of Technology)

Dr. Zsolt Kira (School of Interactive Computing, Georgia Institute of Technology; Georgia Tech Research Institute)

Dr. Andreas Geiger (Autonomous Vision Group, Max Planck Institute for Intelligent Systems; University of Tuebingen)

 

Abstract: 

One of the most fundamental abilities of the human perception system is to seamlessly sense the changing 3D world from our ego-centric visual observations. Driven by modern applications in robotics, autonomous driving, and mixed reality, machine perception requires a precise, dense representation of 3D motion at low latency. In this thesis, I focus on the task of estimating absolute 3D motion in world coordinates in unconstrained environments, using ego-centric visual information only. The goal is a fast algorithm that produces a dense and accurate representation of the 3D motion.

 

To achieve this goal, I propose to investigate the problem from four perspectives with the following contributions:

1) Present a fast and accurate continuous optimization approach that solves for the 3D scene motion as the motion of planar segments fixed a priori.

2) Present a learning-based approach that recovers dense scene flow from ego-centric motion and optical flow, with the decomposition guided by a novel data-driven rigidity prediction (a minimal sketch of this composition follows the list of contributions).

3) Present a modern synthesis of the classic inverse compositional method for 3D rigid motion estimation using dense image alignment (a sketch of the classic inverse compositional idea also follows the list).

4) I will propose a novel object-centric scene flow representation incorporating top-down recognition. Specifically, the scene flow of each instance is composed of a 3D rigid motion basis and a local deformation field. The proposed approach will progressively predict the local deformation on top of each instance's rigid motion to align the instance across views.
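
The following sketch illustrates the composition referred to in contribution 2): it assembles a per-pixel 3D motion field from a depth pair, a camera ego-motion estimate, a predicted rigidity mask, and 2D optical flow. This is a minimal illustration under my own naming assumptions (e.g. compose_scene_flow, backproject), not the thesis implementation.

import numpy as np

def backproject(depth, K):
    """Back-project a depth map (H x W) into camera-frame 3D points (H x W x 3)."""
    H, W = depth.shape
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    return np.stack([(u - cx) / fx * depth, (v - cy) / fy * depth, depth], axis=-1)

def compose_scene_flow(depth0, depth1, flow, rigidity, R, t, K):
    """Compose a dense 3D motion field, expressed in the first camera's frame.

    flow:     2D optical flow from frame 0 to frame 1, shape H x W x 2 (x, y).
    rigidity: boolean mask, True where the scene is predicted static (rigid).
    (R, t):   camera ego-motion mapping camera-0 coordinates to camera-1 coordinates.
    """
    H, W = depth0.shape
    P0 = backproject(depth0, K)

    # Follow the optical flow to each pixel's correspondence in frame 1
    # (nearest-neighbor lookup of the second depth map, for simplicity).
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    u1 = np.clip(np.round(u + flow[..., 0]).astype(int), 0, W - 1)
    v1 = np.clip(np.round(v + flow[..., 1]).astype(int), 0, H - 1)
    P1_cam1 = backproject(depth1, K)[v1, u1]

    # Express the corresponding points back in the first camera's frame so the
    # camera ego-motion is factored out of the measured 3D displacement.
    P1_cam0 = (P1_cam1 - t) @ R        # applies R^T (p - t) to each point

    # Static pixels have zero 3D motion by definition; only pixels predicted
    # non-rigid keep their residual displacement as scene flow.
    return np.where(rigidity[..., None], 0.0, P1_cam0 - P0)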
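
Contribution 3) revisits the inverse compositional (IC) algorithm of Baker and Matthews. As a reminder of the idea it builds on, the sketch below implements IC alignment for the simplest warp, a pure 2D translation: the steepest-descent images and Gauss-Newton Hessian are precomputed once on the template, and each iteration only re-evaluates the residual and folds the inverted increment into the current estimate. The thesis targets 3D rigid (SE(3)) motion with dense image alignment; this 2D version only illustrates the principle.

import numpy as np
from scipy.ndimage import map_coordinates

def ic_translation_align(template, image, iters=50, tol=1e-4):
    """Inverse compositional alignment for a pure 2D translation warp.

    Estimates p = (px, py) such that image(x + p) is aligned to template(x).
    """
    H, W = template.shape
    ys, xs = np.mgrid[0:H, 0:W].astype(float)

    # Precompute everything that depends only on the template: image gradients,
    # steepest-descent images, and the Gauss-Newton Hessian. This is the key
    # saving of the inverse compositional formulation.
    gy, gx = np.gradient(template)
    J = np.stack([gx.ravel(), gy.ravel()], axis=1)   # N x 2
    H_gn = J.T @ J                                   # 2 x 2

    p = np.zeros(2)
    for _ in range(iters):
        # Warp the input image by the current translation and form the residual.
        warped = map_coordinates(image, [ys + p[1], xs + p[0]], order=1)
        r = (warped - template).ravel()

        # Solve the normal equations for the incremental warp on the template
        # side, then compose its inverse into the current estimate. For a
        # translation, inverting and composing reduces to a subtraction.
        dp = np.linalg.solve(H_gn, J.T @ r)
        p -= dp
        if np.linalg.norm(dp) < tol:
            break
    return p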

 

Additional Information

In Campus Calendar
No
Groups

Graduate Studies

Invited Audience
Public, Graduate students, Undergraduate students
Categories
Other/Miscellaneous
Keywords
PhD proposal
Status
  • Created By: Tatianna Richardson
  • Workflow Status: Published
  • Created On: Dec 10, 2018 - 9:06am
  • Last Updated: Dec 10, 2018 - 9:06am