PhD Proposal by Aman Parnami


Event Details
  • Date/Time: Thursday, May 26, 2016, 2:00 pm - 4:00 pm
  • Location: TSRB 223

Title: Enabling Motion-Based Gestural Interaction Design

 

Aman Parnami

Ph.D. Student

School of Interactive Computing

College of Computing

Georgia Institute of Technology

http://amanparnami.com

 

Date: Thursday, May 26th, 2016

Time: 2:00 PM - 4:00 PM EDT

Location: TSRB 223

 

Committee:

-------------------

Dr. Gregory D. Abowd, School of Interactive Computing, Georgia Tech

Dr. Betsy DiSalvo, School of Interactive Computing, Georgia Tech

Dr. Thad Starner, School of Interactive Computing, Georgia Tech

Dr. W. Keith Edwards, School of Interactive Computing, Georgia Tech

Dr. Björn Hartmann, Electrical Engineering & Computer Science, UC Berkeley

Dr. Yang Li, Google Research

 

Abstract:

------------------

Gestural input is inherently quick and expressive, since a single motion can indicate the operation, the operand, and additional parameters. For instance, on a smart watch, a quick turn of the wrist (gesture) toward the eyes (parameters) expresses the wearer's intent to check the time (operation); in response, the screen of the smart watch (operand) turns on to reveal the time. Motion-based gestures are particularly useful in mobile situations where visual attention is limited, or when brief interactions with wearable devices are needed. Smartphones and wearable devices can support motion-based gestures through now-ubiquitous inertial sensors (e.g., accelerometers and gyroscopes). Yet despite this support and their proposed advantages, motion-based gestures remain far less common than touchscreen input, particularly among interaction designers.
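
As a rough illustration of how inertial data can drive such a gesture (this sketch is not part of the proposal), the following Python fragment flags a wrist turn when gyroscope readings stay above a threshold for a short run of samples. The axis, threshold, and sample trace are all hypothetical; real devices expose their own sensor APIs and require calibration.

# Illustrative only: a minimal threshold-based detector for a "wrist turn"
# gesture from gyroscope samples (angular velocity in degrees per second).
def detect_wrist_turn(gyro_samples, threshold_dps=120.0, min_samples=5):
    """Return True if rotation about the wrist axis exceeds the
    threshold for at least min_samples consecutive readings."""
    run = 0
    for omega in gyro_samples:
        run = run + 1 if omega > threshold_dps else 0
        if run >= min_samples:
            return True
    return False

# Synthetic trace: rest, a quick rotation, then rest again.
trace = [2.0, 3.1, 150.0, 180.5, 175.2, 160.0, 140.3, 4.0, 1.2]
print(detect_wrist_turn(trace))  # True: the turn exceeds 120 deg/s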

 

Interaction designers face three primary challenges in gestural interaction design. First, they lack the pattern recognition expertise required to create gesture recognizers (illustrated by the sketch below). Second, no existing tools support an end-to-end design process for gestural interaction. Third, traditional tools limit prototyping to controlled environments, creating a divide between where gestures are developed and where they are used.
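
To make the first challenge concrete, even a basic gesture recognizer involves pattern-recognition machinery such as template matching with dynamic time warping (DTW). The Python sketch below labels a 1-D motion trace with the nearest recorded template; the gesture names and toy traces are invented for illustration and are not from the proposal.

# A sketch of DTW template matching, the kind of pattern-recognition
# code a designer would otherwise have to write by hand.
def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) DTW over 1-D sensor magnitudes."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

def classify(sample, templates):
    """Label a sample with the name of its nearest template."""
    return min(templates, key=lambda name: dtw_distance(sample, templates[name]))

templates = {
    "shake": [0, 5, -5, 5, -5, 0],
    "lift":  [0, 1, 2, 3, 4, 5],
}
print(classify([0, 4, -4, 6, -5, 1], templates))  # "shake"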

 

In this proposal, I describe a mobile, motion-based gestural interaction design tool that supports in situ, context-aware prototyping. The proposed tool will enable rapid creation of accurate, novel gestures and of new gestural interfaces for wearables. Furthermore, I present a plan for iteratively designing, developing, and evaluating this tool. With it, my hope is to empower interaction designers in their explorations of novel interfaces for emerging wearable and mobile platforms.
