Title: The Algorithm Keeps The Score: Identity, Marginalization and Power in the Technology-Mediated Search for Care
Date: November 10, 2022
Time: 11:00am - 1:30pm EST
Location (in-person): Coda 1015 Vinings
Location (remote): Link here, with Zoom Meeting ID: 998 8541 2467, and Zoom Passcode: 698075
Sachin Pendse
PhD Student in Human-Centered Computing
School of Interactive Computing
Georgia Institute of Technology
Committee:
Dr. Munmun De Choudhury (co-advisor) – School of Interactive Computing, Georgia Institute of Technology
Dr. Neha Kumar (co-advisor) – School of Interactive Computing and Sam Nunn School of International Affairs, Georgia Institute of Technology
Dr. Michael Best – School of Interactive Computing and Sam Nunn School of International Affairs, Georgia Institute of Technology
Dr. Andrea Grimes Parker – School of Interactive Computing, Georgia Institute of Technology and Department of Behavioral Sciences and Health Education, Rollins School of Public Health, Emory University
Dr. Amit Sharma – Microsoft Research
Dr. Jessica Schleider – Department of Psychology, Stony Brook University
Summary:
Severe psychological distress and mental illness are widespread, affecting over a billion people globally, with extremely diverse presentations. One factor with a strong influence on these presentations is the identity of the individual in distress. Expressions of distress, and how they are perceived by the community, are key to the resources that an individual is able to access. As a result, identity has a strong influence on the pathway that an individual takes from experiencing distress to searching for care. Technology-mediated mental health support (TMMHS) systems, such as helplines or personalized search engine results, have become one such means to find and receive care.
In my doctoral research, I study how engagements with TMMHS systems influence how individuals in distress come to understand their experiences and engage with care, particularly given the impact of identity-based oppression. Identity, particularly marginalized identity, has a strong influence on both experiences of mental illness and engagements with algorithmic systems. In my work, I thus pay specific attention to how the personalized design of TMMHS systems might be helpful (such as by matching individuals in need to relevant resources) and how personalization might be harmful (by segregating people away from diverse forms of support or community connection). I use mixed methods in these analyses, examining biases in TMMHS systems both from a technical lens and through the lived experiences of those who use these systems.
In my completed work, I have used qualitative interviews and computational data (social media posts and online searches) to study the role of identity in how people engage with TMMHS systems. I have found that identity-bound expressions of distress have a strong influence on how people come to understand their illness experience. In my proposed work, I will study gender and racial biases in predictive algorithms for mental health, and conduct algorithm audits of identity-based differences in search engine results for mental health resources.
My work is informed by culturally valid frameworks centered around the role of oppression, conflict, and colonialism in how mental health care has been provided throughout history. Through my doctoral work, done in partnership with several mental health advocacy organizations, I contribute an understanding of where identity-based personalization may be helpful in TMMHS systems, and where it may inadvertently deny people care.