Title: Combining Natural Language and Direct Manipulation for Human-Data Interaction through Visualizations
---------------------------------
Arjun Srinivasan
Ph.D. Candidate in Computer Science
School of Interactive Computing
Georgia Institute of Technology
Date: Wednesday, July 29th, 2020
Time: 3:00-5:00 PM (ET)
Meeting URL: https://gatech.webex.com/gatech/j.php?MTID=m6dbfab8d4767f8e03e924557b05fb6bc
**Note: this defense is remote-only**
Committee:
---------------------------------
Dr. John Stasko (Advisor), School of Interactive Computing, Georgia Institute of Technology
Dr. Alex Endert, School of Interactive Computing, Georgia Institute of Technology
Dr. Keith Edwards, School of Interactive Computing, Georgia Institute of Technology
Dr. Bongshin Lee, Microsoft Research
Dr. Vidya Setlur, Tableau Research
Abstract:
---------------------------------
Visualization is an indispensable tool for human-data interaction, enabling people to better understand their data, identify patterns, and discover insights. Interaction plays a critical role in data visualization tools, as it allows users to express their data-related goals and questions to the system. Traditionally, interaction in visualization tools has been facilitated predominantly via a keyboard and mouse, following the window-icon-menu-pointer (WIMP) metaphor and the direct manipulation paradigm. However, recent advances in hardware and input recognition technology present the opportunity to reimagine interaction and explore new user experiences grounded in naturalistic human ways of interacting with people and objects in the real world.
This thesis explores the design of a novel class of multimodal data visualization interfaces that augment current interaction techniques in visualization systems with natural language. I start with an assessment of the role of natural language input in visualization tools, characterizing the goals it can help people accomplish. Subsequently, I describe the design and implementation of a series of multimodal visualization systems that combine natural language and direct manipulation. Through evaluations of these systems, I capture the strengths and challenges of multimodal visualization interfaces and highlight how they accommodate varying user interaction patterns and preferences during visual analysis. Finally, to assist future research and development, I also contribute a toolkit that helps designers and developers prototype natural language-based visualization systems.