PhD Defense by Christopher R. McBryde


Event Details
  • Date/Time:
    • Thursday May 3, 2018
      3:30 pm - 5:30 pm
  • Location: Montgomery Knight 317
  • Fee(s):
    N/A
Summaries

Summary Sentence: SPACECRAFT VISUAL NAVIGATION USING APPEARANCE MATCHING AND MULTI-SPECTRAL SENSOR FUSION


Doctoral Defense

by

Christopher R. McBryde

Advisor: Prof. E. Glenn Lightsey

SPACECRAFT VISUAL NAVIGATION USING APPEARANCE MATCHING AND MULTI-SPECTRAL SENSOR FUSION

May 3, 2018, 3:30 pm, Montgomery Knight 317

One capability necessary for a successful satellite mission is knowledge of the spacecraft's location and orientation in space, especially relative to a target. Relative navigation is an enabling technology for spacecraft formation flying, rendezvous and docking, and hazard avoidance. Cameras are particularly useful for this task because they are less expensive, smaller, and have lower power requirements than many other types of sensors. Object identification and relative pose estimation are therefore key topics of research for the future of spacecraft. Using cameras for these tasks presents several challenges. First, obtaining relative position and orientation data is a two-step process: an object must be identified before the image data can provide a meaningful relative pose. Second, the complete relative navigation process has historically involved two different algorithms, one for object identification and another for pose estimation, working in tandem. Finally, images in the visible spectrum are susceptible to variations in illumination that affect the perceived shape of the object, if it can be imaged at all.


The approach taken in this research is to apply terrestrial techniques to improve spacecraft navigation. First, appearance matching is used as a common framework for both object identification and pose estimation, and is made more robust using background randomization. To support this, a spacecraft imaging simulation environment is created both to generate the necessary training images and to verify the system's performance. Additionally, results from multiple sensors are fused to improve identification and pose estimation and to extend the operating range over more of the orbit. The result of this research is a demonstrated, robust method for object identification and pose estimation of a spacecraft target. A single framework accomplishes both tasks and can be further enhanced using multiple sensors. Appearance matching and sensor fusion will help enable the next generation of spacecraft visual navigation.
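As a rough illustration only (not the dissertation's actual implementation), the appearance-matching idea described above — projecting rendered training views into a low-dimensional eigenspace, identifying a query image and looking up its pose by nearest-neighbor matching, then fusing match scores across spectral bands — can be sketched in Python with NumPy. All function names, the PCA/nearest-neighbor formulation, and the weighted-sum fusion are illustrative assumptions:

```python
import numpy as np

def build_appearance_model(train_images, poses, n_components=8):
    """Build a PCA eigenspace from flattened training images.

    train_images: (N, H*W) array, one row per rendered view
    poses: (N,) array of pose labels (e.g., rotation angles)
    """
    mean = train_images.mean(axis=0)
    centered = train_images - mean
    # SVD rows of vt are the principal directions of appearance variation
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]            # (k, H*W) eigenspace basis
    coords = centered @ basis.T          # (N, k) points on the appearance manifold
    return mean, basis, coords, poses

def match_distances(image, model):
    """Project a query image into the eigenspace and return its
    distance to every training view (lower = better match)."""
    mean, basis, coords, _ = model
    q = (image - mean) @ basis.T
    return np.linalg.norm(coords - q, axis=1)

def estimate_pose(image, model):
    """Return the pose label of the nearest training view."""
    d = match_distances(image, model)
    i = int(np.argmin(d))
    return model[3][i], d[i]

def fuse_scores(d_visible, d_thermal, w_visible=0.5):
    """Combine per-pose match distances from two spectral bands with a
    simple weighted sum; the fused minimum picks the agreed-upon pose."""
    return w_visible * d_visible + (1.0 - w_visible) * d_thermal
```

A query image would be scored against models trained in each band, the distance vectors fused with `fuse_scores`, and the pose read off the fused minimum; background randomization would enter at training time by compositing the rendered target over varied backgrounds.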


Committee Members:

Dr. Glenn Lightsey, School of Aerospace Engineering (advisor)

Dr. Marcus Holzinger, School of Aerospace Engineering

Dr. Eric Johnson, School of Aerospace Engineering

Mr. Chad Frost, Ames Research Center

Dr. Andrew Johnson, Jet Propulsion Laboratory

Additional Information

In Campus Calendar
No
Groups

Graduate Studies

Invited Audience
Public, Graduate students, Undergraduate students
Categories
Other/Miscellaneous
Keywords
PhD Defense
Status
  • Created By: Tatianna Richardson
  • Workflow Status: Published
  • Created On: Apr 16, 2018 - 11:26am
  • Last Updated: Apr 16, 2018 - 11:26am