Title: Detecting and Mitigating Human Bias in Visual Analytics
Emily Wall
Ph.D. Candidate in Computer Science
School of Interactive Computing
Georgia Institute of Technology
Date: Tuesday, April 14th, 2020
Time: 12:00-2:00 PM (EDT)
BlueJeans: https://primetime.bluejeans.com/a2m/live-event/rfykeyvc
**Note: This defense is remote-only due to the Institute's guidelines on COVID-19**
Committee:
Dr. Alex Endert (Advisor), School of Interactive Computing, Georgia Institute of Technology
Dr. John Stasko, School of Interactive Computing, Georgia Institute of Technology
Dr. Polo Chau, School of Computational Science and Engineering, Georgia Institute of Technology
Dr. Brian Fisher, School of Interactive Arts and Technology, Simon Fraser University
Dr. Wenwen Dou, Department of Computer Science, University of North Carolina at Charlotte
Abstract:
Visual Analytics combines the complementary strengths of humans (perception and sensemaking capabilities) and machines (fast and accurate information processing). However, people are susceptible to inherent limitations and biases, including cognitive biases (e.g., anchoring bias), social biases born of cultural stereotypes and prejudices (e.g., gender bias), and perceptual biases (e.g., illusions). These biases can impact decision making in critical ways, leading to inaccurate or inefficient choices, or even propagating long-standing institutional and systemic biases.
Given our knowledge of these biases and the increased use of data visualization for decision making, the goal of this research is to detect and mitigate human biases in visual data analysis. In this dissertation, I describe (1) which types of bias are particularly relevant in the process of visual data analysis, (2) how user interactions with data can be used to approximate human biases, and (3) how visualization systems can be designed to increase user awareness of potentially unconscious or implicit biases. By creating systems that promote real-time awareness of bias, this research enables people to reflect on their behavior and decisions and ultimately engage in a less biased decision-making process.
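
To give a flavor of point (2), one simple (purely illustrative) way to approximate attention skew from interaction logs is to compare the attribute distribution of the data points a user has interacted with against the distribution of the full dataset; a large gap on a sensitive attribute could suggest biased attention. The sketch below is a hypothetical example in Python, not the specific metrics defended in the dissertation; the function name, data fields, and usage are assumptions for illustration only.

# Illustrative sketch only: compare the attribute distribution of interacted-with
# data points against the full dataset. A positive gap for a category means the
# user's interactions over-represent that category relative to the data.
from collections import Counter

def interaction_skew(all_items, interacted_ids, attribute):
    """Return, per category, (interacted share) - (overall share) for the given attribute."""
    def distribution(items):
        counts = Counter(item[attribute] for item in items)
        total = sum(counts.values()) or 1
        return {category: n / total for category, n in counts.items()}

    full = distribution(all_items)
    touched = distribution([item for item in all_items if item["id"] in interacted_ids])
    return {category: touched.get(category, 0.0) - share for category, share in full.items()}

# Hypothetical usage: flag skewed attention on a sensitive attribute.
people = [
    {"id": 1, "gender": "F"}, {"id": 2, "gender": "M"},
    {"id": 3, "gender": "F"}, {"id": 4, "gender": "M"},
]
print(interaction_skew(people, interacted_ids={2, 4}, attribute="gender"))
# {'F': -0.5, 'M': 0.5}  -> interactions concentrate entirely on one category

A visualization system could recompute such a signal after each interaction and surface it in the interface, which is the kind of real-time awareness the abstract describes.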