We chose to design for people who are blind or vision impaired, with the aim of creating a prototype that would empower them and that could also be helpful to other members of the public. We began with background research to understand our target users and to analyse the systems and solutions currently in place to assist them.
We found that, despite taking steps in the right direction with smart infrastructure, most modern cities fail to sufficiently meet the needs of those with vision impairment. Australian cities in particular have room to improve accessibility and meet the needs of the 575,000 blind or vision impaired people living in Australia.
We also conducted online interviews with people who are blind or vision impaired. They gave us background on the issues they face daily, particularly in regard to travelling and navigating outdoors.
Visual impairment and a restricted environment are reported to contribute to varying degrees of loneliness, depression, and social isolation. A lack of accessible infrastructure compounds this, with many parts of modern cities still being prohibitive and restrictive for vision impaired people.
Those with vision impairments often spend extra time preparing for their journeys and make more decisions and adjustments during a trip. Despite the tools available to assist them while navigating (such as guide dogs and long canes), there is still a lack of real-time wayfinding information to help guide them during a commute.
Although our research raised issues with public transport itself (particularly correctly identifying and using the right routes), the main issue was getting to and from public transport stops. Poor accessibility and changing conditions on ill-defined footpaths, along with a lack of haptic and audio signals at intersections and crossing points, were the main sources of frustration for vision impaired people trying to access public transport.
Mobility Related Accidents
A study found that 50% of blind people experience head-level accidents at least once a month. Most people with vision impairment lose confidence and change their walking habits as a result of these or similar accidents. Even with long canes and guide dogs, accidents still occur when walking outside and in public.
Defining the users
After examining the main sources of frustration that vision impaired people experience, we defined a persona based on our research and on the experiences and characteristics of our interviewees. The persona informed our initial prototypes and helped us choose which one to move forward with.
Framing the problem
Our research helped us develop a problem statement. It was based on the key needs of vision impaired people and focused in particular on the frustrations they experience when navigating outside in public spaces.
Vision impaired people require assistance to navigate modern cities independently. This includes assistance in orienting themselves, identifying hazards, and using existing public transport and public infrastructure.
Each of us ideated different concepts to address the problem, then narrowed down to one concept each that we could test as a low-fidelity prototype and compare results across. Although each concept responded to the brief and was informed by our research, we aimed for concepts that addressed different areas of outdoor navigation.
A smart watch designed to help the vision impaired navigate to destinations by guiding them to incremental checkpoints (such as a bus stop, train gate or platform). Users are guided through a series of haptic vibrations that increase in rate and intensity as the user approaches a checkpoint.
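The escalating haptic cue this concept relies on can be illustrated with a minimal sketch. The thresholds below (a 50 m guidance range, 0.5–4 pulses per second) are assumptions for illustration, not values from our prototype:

```python
# Sketch (assumed parameters): map distance to the next checkpoint onto a
# haptic pulse rate and intensity. Closer checkpoint -> faster, stronger pulses.

def haptic_pattern(distance_m, max_range_m=50.0):
    """Return (pulses_per_second, intensity in 0-1) for a given distance."""
    # Clamp distance into the guidance range.
    d = min(max(distance_m, 0.0), max_range_m)
    proximity = 1.0 - d / max_range_m            # 0 when far, 1 at the checkpoint
    pulses_per_second = 0.5 + 3.5 * proximity    # 0.5 Hz far away, 4 Hz on arrival
    intensity = 0.2 + 0.8 * proximity            # never fully silent while guiding
    return pulses_per_second, intensity
```

A real watch app would feed this function with distance updates from GPS or beacon positioning and drive the vibration motor accordingly.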
A dynamic, updating braille map which makes location-based wayfinding applications accessible to the blind. This braille map would be produced by mechanically pushing out pins on a 2D plane to represent the user’s surroundings.
A necklace with an attached device that uses ultrasonic sensors to detect obstructions in front of the wearer between torso and head height. Vibration motors embedded on each side of the necklace vibrate depending on which side the obstruction is on and how far away it is.
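The sensor-to-motor mapping for this concept can be sketched as follows. The 200 cm maximum range and the linear distance-to-intensity mapping are illustrative assumptions, not measured specifications:

```python
# Sketch (assumed range): convert left/right ultrasonic distance readings into
# vibration intensities for the corresponding motors. A nearer obstruction on
# one side produces a stronger buzz on that side; None means no echo returned.

def motor_levels(left_dist_cm, right_dist_cm, max_range_cm=200.0):
    """Return (left_intensity, right_intensity), each in 0-1."""
    def level(d):
        if d is None or d >= max_range_cm:
            return 0.0                     # nothing detected on this side
        return 1.0 - d / max_range_cm      # closer obstruction -> stronger vibration
    return level(left_dist_cm), level(right_dist_cm)
```

On hardware, the returned intensities would be scaled to PWM duty cycles for the motor drivers.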
A dynamic pavement solution that uses vibrations to alert users to obstructions and dangers ahead. Messages are updated by sensors placed around the pavement that give constant, accurate feedback to users.
Simple low-fidelity prototypes were created for each concept except the Braille Map (due to technology and time limitations). Our plan was to recruit participants to engage with each prototype in a think-aloud exercise, interview them afterwards to further explore their reactions, and have them fill out a short survey rating each concept.
Initially we tried to recruit participants with blindness or vision impairment; however, our earlier interviewees were busy during this period and other leads we followed up did not respond. We therefore recruited available participants from different backgrounds and had them wear blacked-out goggles during the tests to simulate temporary blindness.
We filmed the think-aloud exercises and post-exercise interviews so that we could rewatch them and note participants' completion times and reactions. The survey asked participants to rate each concept on helpfulness, perceived utility, and ease of use, giving us a more balanced overall rating.
We used Wizard of Oz testing to validate the key interaction of navigating vision impaired users through haptic feedback from a watch. We put blacked-out goggles and a Moto 360 watch on our participants and tasked them with finding an X written on paper. The vibrations were controlled remotely via a smartphone app: the tester sent an increasing number of vibrations as the participant approached the X.
The test environment was a hallway with strings tied at torso height on alternating sides. Participants navigated the hallway, avoiding the strings, while two distinct vibration sounds were played behind them to indicate whether they should move left or right. The purpose was to see how each participant reacted to audio cues guiding them without any other aids.