Jon Sanford and Industrial Design students attend ASSETS 2012 in Colorado


Contact

Summer Ienuso

 summer.ienuso@gatech.edu

Summaries


Full Summary:

Tina Lee, Xiao Xiong, Elaine Liu, and Jon Sanford attended the 14th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2012), held in Boulder, Colorado, October 22-24.

Media
  • Xiong, Sanford, Liu, and Lee at ASSETS 2012 in Colorado
    (image/jpeg)
Related Files

ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2012)

Tina Lee, Xiao Xiong, Elaine Liu, and Jon Sanford attended the 14th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2012), held in Boulder, Colorado, October 22-24.

http://www.sigaccess.org/assets12/accepted_posters.html

EZ Ballot with Multimodal Inputs and Outputs

Tina, Xiao, and Elaine demonstrated and presented a multimodal ballot prototype called “EZ Ballot,” which provides multimodal inputs and outputs to accommodate diverse voters, including those with visual, cognitive, and dexterity impairments. The linear layout of the EZ Ballot structure fundamentally re-conceptualizes ballot design to provide the same simple and intuitive voting experience for all voters, regardless of ability or the input/output (I/O) device used. The three students are currently conducting a formative study to identify usability issues.

Gesture Interface Magnifiers for Low-Vision Users

Tina presented her previous work comparing different magnification and navigation methods on handheld magnifiers for low-vision users to determine the feasibility of a touch-screen gesture interface. The results show that gestures were faster and more strongly preferred than the indirect input methods of pushing a button or rotating a knob. The study suggests that gestures may afford an alternative, more natural magnification and navigation method for a new user-centric low-vision magnifier.


Additional Information

Groups

CATEA - Center for Assistive Technology and Environmental Access

Categories
Exhibitions
Keywords
2012, ASSETS, Lee, Sanford, SIGACCESS, Xiong, Yilin
Status
  • Created By: Summer Ienuso
  • Workflow Status: Published
  • Created On: Jan 28, 2013 - 9:33am
  • Last Updated: Oct 7, 2016 - 11:13pm