MS proposal by Sidney Scott-Sharoni


Event Details
  • Date/Time:
    • Tuesday August 23, 2022
      3:00 pm - 5:00 pm
  • Location: Microsoft Teams
  • Fee(s): N/A
Contact
No contact information submitted.
Summaries

Summary Sentence: Directability Through AI Customization: The Effect of Choice on Trust and Acceptance in Highly Automated Vehicles

Full Summary: No summary paragraph submitted.

Name: Sidney Scott-Sharoni 

Master’s Thesis Proposal Meeting
Date: Tuesday, August 23rd, 2022
Time: 03:00 PM
Location: Microsoft Teams

Advisor:
Bruce Walker, Ph.D. (Georgia Tech)

Thesis Committee Members:
Bruce Walker, Ph.D. (Georgia Tech)

Jamie Gorman, Ph.D. (Georgia Tech)

Richard Catrambone, Ph.D. (Georgia Tech) 


Title: Directability Through AI Customization: The Effect of Choice on Trust and Acceptance in Highly Automated Vehicles
Abstract: Self-driving cars have the potential to revolutionize mobility and transportation. However, research indicates that people feel apprehensive about using or relying on highly automated vehicles (American Automobile Association, 2019). One method of assuaging these fears involves providing explanations for the system’s behaviors through a Human-Machine Interface (HMI). However, determining how much information to display to generate optimal human-automation interaction can prove difficult because individuals differ in their preferences, experiences, and needs. An underexplored method that may account for these differences involves providing users with choices, or customization. The Coactive Design approach suggests that including directability, or the power to influence a system’s actions, may improve how users interact with a system (Johnson et al., 2014). The present study will investigate how customization affordances for either the behavior or the explainability of an automated vehicle affect trust in and acceptance of the AI. One hundred twenty participants will experience one highly automated simulator drive, during which they will engage in a visually demanding side activity and have their gaze behavior recorded as a behavioral measure of trust. Trust and acceptance will also be assessed with subjective measures, specifically the Trust in Automation Questionnaire (Jian et al., 2000) and the Technology Acceptance Model (Davis, 1989). A MANCOVA will assess the main and interaction effects of customization availability and customization type on trust and acceptance. Investigating the role of choice in the interaction between individuals and highly automated systems can provide insight into the psychological impact of having control over technology and how that control shapes effective use.
Additional Information

In Campus Calendar
No
Groups

Graduate Studies

Invited Audience
Faculty/Staff, Public, Undergraduate students
Categories
Other/Miscellaneous
Keywords
MS Proposal
Status
  • Created By: Tatianna Richardson
  • Workflow Status: Published
  • Created On: Aug 4, 2022 - 5:00pm
  • Last Updated: Aug 4, 2022 - 5:00pm