Beacon

Empowering blind and low vision users to navigate more independently

Role

Product Designer

Team

2 Contributors

Timeline

8 Weeks

Beacon app navigation screens showing turn-by-turn directions to Warren Library with obstacle warnings
Overview

Beacon was a Human-Computer Interaction class project that reimagined how blind and low vision individuals navigate their surroundings. Through a pair of smart glasses and a companion navigation app, Beacon provides the intuitive, real-time guidance needed for safer, more independent mobility.

Challenge

How might we help blind and low vision individuals navigate independently?

Over 285 million people worldwide live with blindness or low vision. Traditional aids such as canes and guide dogs can't catch every hazard, like slippery surfaces or overhead obstacles, and they can't point out the landmarks that make wayfinding easier. These gaps make simple tasks, like crossing a busy street or finding an entrance, needlessly difficult, and life can feel isolating as a result.

A visually impaired person navigating with a guide dog and walking cane

Traditional mobility aids have limitations

Solution

Real-time guidance for independent navigation

Beacon pairs smart glasses with a companion mobile app that delivers real-time navigation guidance, helping users move through their surroundings with confidence and safety.

Beacon mobile companion app showing turn-by-turn navigation to Warren Library
Beacon smart glasses with embedded camera for real-time obstacle detection
How Beacon Works

Simple. Powerful. Empowering.

Simple voice commands

Users say where they'd like to go using simple voice commands

Glasses detect obstacles

The glasses detect obstacles and deliver audio directions and alerts

Haptics enhance safety

Vibrations guide users through turns and around obstacles

Features

Here's a closer look at the key features

Beacon voice interface asking where the user would like to go
01

Fully accessible audio interface

The fully accessible audio interface lets users interact with Beacon entirely through simple voice commands
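As an illustration of this interaction, here's a minimal sketch of how a spoken destination could be captured on iOS, assuming Apple's Speech and AVFoundation frameworks; the DestinationListener class is a hypothetical stand-in, not Beacon's actual implementation.

```swift
import AVFoundation
import Speech

// Hypothetical sketch: capture a spoken destination with Apple's Speech
// framework. Microphone/speech permission prompts are omitted for brevity.
final class DestinationListener {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let audioEngine = AVAudioEngine()

    func listen(onDestination: @escaping (String) -> Void) throws {
        let request = SFSpeechAudioBufferRecognitionRequest()
        let input = audioEngine.inputNode

        // Stream microphone audio into the recognition request.
        input.installTap(onBus: 0, bufferSize: 1024,
                         format: input.outputFormat(forBus: 0)) { buffer, _ in
            request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        // Hand the final transcription (e.g. "Take me to Warren Library")
        // to the navigation layer once recognition settles.
        _ = recognizer?.recognitionTask(with: request) { result, _ in
            guard let result, result.isFinal else { return }
            onDestination(result.bestTranscription.formattedString)
        }
    }
}
```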

02

Alerts help avoid obstacles

Beacon detects obstacles and provides alerts ahead of time to help users avoid collisions

Beacon smart glasses with embedded cameras for obstacle detection
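To make "ahead of time" concrete, here's a hypothetical sketch of alert timing driven by distance and walking speed; the DetectedObstacle model and the 4-second reaction buffer are illustrative assumptions, not measured values from the project.

```swift
import Foundation

// Hypothetical sketch: decide when to announce an obstacle so the user
// has time to react. The obstacle model and 4-second buffer are
// illustrative assumptions, not values from the project.
struct DetectedObstacle {
    let kind: String           // e.g. "uneven sidewalk"
    let distanceMeters: Double
}

func shouldAlert(for obstacle: DetectedObstacle,
                 walkingSpeed: Double,                // meters per second
                 reactionBuffer: TimeInterval = 4.0) -> Bool {
    // If the user is standing still, only flag obstacles at arm's length.
    guard walkingSpeed > 0 else { return obstacle.distanceMeters < 1.0 }
    let secondsToReach = obstacle.distanceMeters / walkingSpeed
    return secondsToReach <= reactionBuffer
}

// An uneven sidewalk 5 m away at a 1.4 m/s pace is about 3.6 s ahead,
// so the alert fires now rather than at the hazard itself.
let hazard = DetectedObstacle(kind: "uneven sidewalk", distanceMeters: 5)
print(shouldAlert(for: hazard, walkingSpeed: 1.4))   // true
```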
Beacon app showing crosswalk alert and turn-by-turn navigation
03

Clear directions along the way

Beacon provides users with clear audio directions every step of the way
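Here's a minimal sketch of how such a prompt might be spoken aloud, assuming Apple's AVSpeechSynthesizer; the NavigationStep type is a hypothetical stand-in for whatever the routing layer would produce.

```swift
import AVFoundation

// Hypothetical sketch: speak a turn-by-turn prompt with Apple's
// AVSpeechSynthesizer. NavigationStep stands in for the routing layer.
struct NavigationStep {
    let instruction: String   // e.g. "turn left onto Library Walk"
    let distanceMeters: Int
}

final class DirectionAnnouncer {
    // Keep a reference so speech isn't cut off mid-utterance.
    private let synthesizer = AVSpeechSynthesizer()

    func announce(_ step: NavigationStep) {
        let utterance = AVSpeechUtterance(
            string: "In \(step.distanceMeters) meters, \(step.instruction)")
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate  // a real app would let users tune this
        synthesizer.speak(utterance)
    }
}

// Usage:
// DirectionAnnouncer().announce(
//     NavigationStep(instruction: "turn left onto Library Walk", distanceMeters: 20))
```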

04

Haptic feedback for safer navigation

Haptic feedback (vibration) adds another layer of safety and awareness of the surroundings
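Here's a sketch of how distinct vibrations might map to navigation events on the phone, assuming UIKit's feedback generators; the NavigationEvent cases are illustrative, not Beacon's actual event model.

```swift
import UIKit

// Hypothetical sketch: map navigation events to distinct vibrations
// using UIKit's feedback generators. The event cases are illustrative.
enum NavigationEvent {
    case turnConfirmed   // the user completed a turn correctly
    case obstacleAhead   // a hazard was detected on the path
    case arrived         // the destination was reached
}

struct HapticCues {
    func play(_ event: NavigationEvent) {
        switch event {
        case .turnConfirmed:
            // A single firm tap confirms the turn.
            UIImpactFeedbackGenerator(style: .medium).impactOccurred()
        case .obstacleAhead:
            // A warning pattern signals the hazard before the user reaches it.
            UINotificationFeedbackGenerator().notificationOccurred(.warning)
        case .arrived:
            UINotificationFeedbackGenerator().notificationOccurred(.success)
        }
    }
}
```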

Beacon app showing a "passing the cafeteria" landmark notification
Beacon app showing an uneven sidewalk warning alert
Process

How Beacon came together

Need-finding

  • Desk research
  • Observational research
  • Problem definition

Ideation

  • Brainstorming
  • Concept development
  • Competitive analysis

Prototyping

  • Beacon hardware design
  • Beacon app design
  • Interactive prototypes

Testing & Evaluation

  • Concept validation
  • Design critique
  • Design iterations

Need-finding

Understanding the challenges users face

We started with a broad question: "How might we use technology to empower blind and low vision individuals?" Research helped us narrow down the problem space.

Desk Research

We found a staggering reality: 80% of our perception relies on sight. For people living with vision loss, moving around can be very difficult, affecting their independence and social life.

Observational Study

We conducted an observational study with six participants who navigated while blindfolded. We saw how quickly mobility became a problem for them.

Key insight: Navigation emerged as the most critical challenge. Narrowing our scope, the question became: How might we empower visually impaired users to navigate independently?

Ideation

Exploring concepts for real-time navigation guidance

Based on our research, we explored three concepts and evaluated them against four key requirements. Smart glasses with a mobile app companion met all four criteria.

Concept                   | Voice interaction | Audio guidance | Obstacle detection | Spatial awareness
Smart glasses + app       |         ✓         |        ✓       |          ✓         |         ✓
Smart cane with sensors   |                   |                |                    |
Mobile audio app          |                   |                |                    |
Design Exploration

What interaction pattern works best for users?

With smart glasses selected, we prototyped to find the interaction patterns and form factor that would work best for users. Drawing on patterns we were already familiar with, we designed the first iteration, but design critique revealed three key issues:

First iteration

First iteration glasses design with earbuds extending from the temples and a front-facing camera

Glasses with sensors and earbuds

Our initial design of the glasses featured earbuds that extended from the temples. Design critique revealed that the earbuds blocked ambient noise, which users rely on for safer navigation.

One critical issue emerged with the glasses:

1

The earbuds blocked noise from the surrounding environment

First iteration mobile app showing Set Up, Visual Preferences, Home, and Voice Assistant screens with numbered usability issues

Mobile companion app

The first iteration of the app had a visual onboarding flow, touch controls, and a voice-enabled navigation assistant.

2

With small touch targets and text-heavy onboarding, the design felt as if it were made for sighted users

3

The entry point for the "Voice Assistant" feature was unclear, and the interface lacked feedback

Informed by the critique we received, we iterated on our designs to address the issues that emerged

Second iteration

Embedded speakers in the temples

We replaced the earbuds with speakers embedded in the temples of the glasses, so users can still hear ambient sound

Second iteration glasses with speakers embedded into glass temples and front-facing camera

Redesigned for a voice-driven interaction model

We also redesigned the interface around a voice-driven interaction pattern and removed the onboarding entirely, giving users immediate access to navigation guidance

Redesigned voice-first interface showing Hi, where would you like to go?
Navigation screen showing crosswalk alert and turn-by-turn directions

Incorporated clear audio navigation guidance, haptic feedback, and spatial alerts

We redesigned for much safer navigation by including clear audio directions, haptic feedback (phone vibrations) to confirm turns and signal obstacles, and spatial cues to help users understand their surroundings

Final Designs

Real-time navigation guidance for blind and low vision users

Through further design evaluation, we learned that clear audio navigation guidance isn't enough. Our final design has audio and haptic feedback working together to enhance safety.

Final design smart glasses with embedded speaker and front-facing camera

The smart glasses detect obstacles and deliver all directional guidance.

Navigation screen showing crosswalk alerts and spatial cues for buildings and landmarks
Navigation screen showing obstacle warnings like an uneven sidewalk

Audio, haptic, and spatial feedback all work together to help users move around safely.

Impact

Design principles for assistive navigation

Our work defined principles for assistive navigation design that prioritize safety and independence.

01

Use multiple forms of feedback

Provide audio, spatial, and tactile feedback to make guidance accessible to more users.

02

Let users hear surrounding noise

Allow users to hear the noise around them; ambient sound is an additional layer of safety.

03

Give users advance warning

Alert users to obstacles in advance so they have time to respond safely.

Reflection

What did I learn from this work?

Working on Beacon taught me design lessons that are still valuable in my work today.

1

Validate early and often

Each iteration revealed gaps in our assumptions: what seemed obvious to us wasn't always what users needed.

2

Partner with users from day one

If I were to do this again, I'd partner with blind and low vision users from the very beginning. Blindfolded proxies can't represent the lived, day-to-day experience of vision loss.