PhD Proposal by Keenan May


Event Details
  • Date/Time: Wednesday, November 20, 2019, 2:00 pm
  • Location: JS Coon Building, Room 150

Name: Keenan May
Ph.D. Dissertation Proposal Meeting
Date: Wednesday, November 20, 2019
Time: 2:00pm
Location: JS Coon Building, Room 150

Advisor:
Professor Bruce N. Walker, Ph.D. (Georgia Tech)

Thesis Committee Members:

Professor Richard Catrambone, Ph.D. (Georgia Tech)

Professor Maribeth Gandy, Ph.D. (Georgia Tech)

Professor Thackery I. Brown, Ph.D. (Georgia Tech)

Professor Jamie C. Gorman, Ph.D. (Georgia Tech)


Title: Impact of Action-Object Congruency on the Integration of Auditory and Visual Stimuli in Extended Reality


Abstract: Understanding how humans integrate stimuli from different modalities into crossmodal objects is important for the development of effective extended reality (XR) systems. Through the process of multisensory integration, information from different sensory modalities is used to determine whether stimuli should be bound together into a unified percept. In XR environments, basic congruency of time and space may be insufficient. As such, the proposed research will investigate a new type of crossmodal congruency, called action-object congruency. Research in ecological sound perception has identified various action and object features of sound-producing events that humans can discern. As a result of multisensory perceptual learning processes, this information could be used to inform the integration of auditory and visual stimuli, even when those stimuli are novel. In the proposed research, temporal and spatial ventriloquism illusions will be utilized to assess the impact of action-object congruency. Within a virtual environment, participants will view objects interacting and hear sounds presented either from a slightly different location or with a slight delay. Participants will localize the sounds via pointing, or judge whether the auditory and visual events were simultaneous. Action and object congruency of stimuli will be manipulated. It is expected that action-object congruent visual and auditory pairings will lead to greater localization bias and higher rates of perceived simultaneity.
