Title: Novel gestures for wearables
Cheng Zhang
Computer Science Ph.D. Candidate
School of Interactive Computing
College of Computing
Georgia Institute of Technology
Date: March 12th, 2018 (Monday)
Time: 9:00am to 12:00pm (EDT)
Location: TSRB (room TBD)
Committee:
---------------
Dr. Gregory D. Abowd (Co-Advisor, School of Interactive Computing, Georgia Tech)
Dr. Omer Inan (Co-Advisor, School of Electrical and Computer Engineering, Georgia Tech)
Dr. Thad Starner (School of Interactive Computing, Georgia Tech)
Dr. Thomas Ploetz (School of Interactive Computing, Georgia Tech)
Dr. Chris Harrison (Human-Computer Interaction Institute, Carnegie Mellon University)
Abstract:
-----------
Wearable computing is an inevitable part of the next generation of computing. Compared with traditional computers (e.g., laptops, smartphones), wearable devices are much smaller, creating new challenges for the design of both hardware and software. Providing appropriate input capabilities for wearables is one such challenge. The input techniques that have proven efficient on traditional computing devices (e.g., keyboard, touchscreen) are no longer appropriate for wearables, for two main reasons. The first is the inherently small size of wearables: there is no room, for instance, to place a physical keyboard on a wearable device. Most commodity wearable devices, such as smartwatches and Google Glass, adopt touch-based input, which suffers from a small operating area and a limited input vocabulary. The second is the more dynamic environment to which wearables are exposed: they are expected to remain functional and efficient even while the user is in motion (e.g., walking), a setting traditional input devices were not designed for. Compared with input on a physical keyboard or touchscreen, gesture-based input gives the user much greater freedom of operation, which can improve the interaction experience.
In this thesis, I explore the design and implementation of novel gestures to address the input challenges on wearables: from using the built-in sensors of an off-the-shelf device to building customized hardware; from 2D on-body interaction to 3D input with greater freedom of operation; and from recognizing pre-defined, discrete hand or finger gestures with machine learning to providing continuous input tracking through a deeper understanding of the underlying physics.

I start by exploring the natural and novel input gestures that can be supported using only the built-in sensors of a smartwatch. I describe WatchOut and TapSkin, which allow user input on the watch case, the watch band, and the skin around the watch. Though using only built-in sensors is more practical, the richness of the gesture set and the recognition performance are constrained by the limited choice of sensors. To better address the input challenges of wearables, I designed and implemented another set of wearable input techniques using customized hardware (e.g., a thumb-mounted ring), which supports input gestures that are not available on commodity devices, such as entering numeric digits, writing Graffiti-style characters, selecting menu items, and issuing quick responses while protecting the user's privacy.

However, these complementary gestures can only partially improve the interaction experience. To fundamentally address the input challenge on wearables, new interaction paradigms are needed to replace touch-based on-device interaction. Such an interaction paradigm is usually composed of high-resolution, low-level input events; the press and release of the mouse buttons, for instance, are the low-level input events of the WIMP interface. This leads to the last part of my dissertation work: providing low-level input events for wearables, such as continuously tracking the position of different body parts of interest in 3D space. I also demonstrate how such low-level input events can be used to design interactions on wearables.
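To make the gesture-recognition approach above concrete, here is a minimal sketch of classifying discrete gestures from windows of smartwatch inertial (IMU) data. It is not code from WatchOut or TapSkin; the window length, the simple statistical features, the SVM classifier, and the gesture labels are all illustrative assumptions.

    import numpy as np
    from sklearn.svm import SVC

    WINDOW = 50  # samples per gesture window (~1 s at 50 Hz; an assumption)

    def extract_features(window):
        # window: (WINDOW, 6) array of accel x/y/z and gyro x/y/z readings
        return np.concatenate([
            window.mean(axis=0),         # per-axis mean
            window.std(axis=0),          # per-axis variability
            np.abs(window).max(axis=0),  # per-axis peak magnitude
        ])

    # Stand-in training data; in practice, windows would be segmented
    # from real IMU recordings of each labeled gesture.
    rng = np.random.default_rng(0)
    X = [rng.normal(scale=s, size=(WINDOW, 6))
         for s in (0.5, 2.0) for _ in range(20)]
    y = ["tap-band"] * 20 + ["tap-skin"] * 20  # hypothetical labels

    clf = SVC(kernel="rbf")
    clf.fit(np.stack([extract_features(w) for w in X]), y)

    def recognize(window):
        # Predict the gesture label for one incoming IMU window.
        return clf.predict(extract_features(window).reshape(1, -1))[0]

In a real system, the feature set and classifier would be chosen per gesture family and validated per user.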
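The idea of low-level input events can be illustrated in the same spirit. The sketch below is an assumed design rather than the dissertation's actual method: it converts a continuous stream of 3D fingertip positions into discrete move/press/release events, analogous to how a mouse reports button state; the press_depth threshold is hypothetical.

    from dataclasses import dataclass

    @dataclass
    class InputEvent:
        kind: str        # "move", "press", or "release"
        position: tuple  # fingertip position (x, y, z), e.g., in meters

    def to_events(positions, press_depth=0.02):
        # Emit a low-level event stream from continuous 3D tracking,
        # treating z below press_depth as a "touch" (assumed threshold).
        pressed = False
        for x, y, z in positions:
            yield InputEvent("move", (x, y, z))
            if z < press_depth and not pressed:
                pressed = True
                yield InputEvent("press", (x, y, z))
            elif z >= press_depth and pressed:
                pressed = False
                yield InputEvent("release", (x, y, z))

    # Example: one dip below the threshold yields a press/release pair.
    for ev in to_events([(0.1, 0.2, 0.05), (0.1, 0.2, 0.01), (0.1, 0.2, 0.05)]):
        print(ev.kind, ev.position)

Higher-level interactions (taps, drags, menu selection) can then be built on this event stream, just as WIMP interfaces are built on mouse events.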