The Proposal
“An estimated 1,000,000 people in Canada and the United States have limited or no use of their arms - meaning they are unable to use touchscreen devices that could provide access to helpful apps and services. While solutions exist for desktop computers, they can cost up to $3,000 and do not work well on mobile devices.” (The Neil Squire Society, 2016)
Above is a user testing the LipSync Joystick.

The Problem
Our challenge was to create an assistive menu application that gives LipSync users access to common device features normally reached through hardware buttons or touch-based gestures. This improves access to features that are difficult to use for people with limited finger and hand dexterity.

The Opportunity
Our research revealed that 91% of people with disabilities own either a smartphone or a tablet (The Neil Squire Society). With assistive technology like the LipSync, they can operate these devices independently, fostering a sense of autonomy and accomplishment.
Instead of using touch, users navigate the device with a mouse cursor controlled by a joystick. This approach has its own limitations, however: typing is slow, and pinch-to-zoom and other multi-touch gestures are impossible with a single cursor. According to the Survey of User Needs (SUN), the most common mobile activities are texting, browsing the internet, email, social media, and maps/GPS. We identified several features and gestures that the overlay would need in order to support these activities with as few limitations as possible:

Radial display
Customizable quick bar
Focus states

The Process
During a three-week research period, we conducted user tests with members of the Neil Squire Society. We tested three users with varying levels of mobile device knowledge and experience.

Our second user, pictured listening to an explanation of our solution.