Beacon
Empowering blind and low vision users to navigate more independently
Role
Product Designer
Team
2 Contributors
Timeline
8 Weeks
Beacon is a Human-Computer Interaction class project that reimagines how blind and low vision individuals navigate their surroundings. Through a pair of smart glasses and a companion navigation app, Beacon provides the intuitive, real-time guidance needed for safer, more independent mobility.
How might we help blind and low vision individuals navigate independently?
Over 285 million people worldwide live with blindness or low vision. Traditional aids such as canes and guide dogs can't catch every hazard: slippery surfaces and overhead obstacles go undetected, and neither helps with finding the landmarks that anchor navigation. These gaps make simple tasks, like crossing a busy street or finding an entrance, needlessly difficult, and daily life can feel isolating.

Traditional mobility aids have limitations
Real-time guidance for independent navigation
Beacon pairs smart glasses with a companion mobile app to deliver real-time navigation guidance, helping users move through their surroundings with confidence and safety.
Simple. Powerful. Empowering.
Simple voice commands
Users say where they'd like to go using simple voice commands (a toy sketch of this step follows below)
Glasses detect obstacles
The glasses detect obstacles and deliver audio directions and alerts
Haptics enhance safety
Vibrations guide users through turns and around obstacles
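Before looking at each feature in detail, here is a toy sketch of the first step in that flow: turning a spoken request into a navigation destination. Beacon's actual voice pipeline isn't shown in this case study, so every name here (`parse_destination`, `KNOWN_PLACES`, the command prefixes) is a hypothetical stand-in for illustration.

```python
# Hypothetical sketch of voice-command handling; not Beacon's real code.
KNOWN_PLACES = {"pharmacy", "bus stop", "main entrance", "coffee shop"}

def parse_destination(transcript: str) -> str | None:
    """Pull a destination out of a transcribed voice command."""
    text = transcript.lower().strip()
    # Strip common lead-ins like "take me to" or "navigate to".
    for prefix in ("take me to", "navigate to", "go to", "find"):
        if text.startswith(prefix):
            text = text[len(prefix):].strip()
            break
    # Drop a leading article before matching against known places.
    if text.startswith("the "):
        text = text[len("the "):]
    return text if text in KNOWN_PLACES else None

print(parse_destination("Take me to the pharmacy"))  # -> pharmacy
print(parse_destination("What time is it?"))         # -> None (not a navigation request)
```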
Here's a closer look at the key features
Fully accessible audio interface
The fully accessible audio interface lets users interact with Beacon entirely through simple voice commands
Alerts help avoid obstacles
Beacon detects obstacles and provides alerts ahead of time to help users avoid collisions
Clear directions along the way
Beacon provides users with clear audio directions every step of the way
Haptic feedback for safer navigation
Haptic feedback (vibration) provides an additional layer of safety and awareness of the surroundings, working in concert with the audio cues, as sketched below
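To make that pairing of channels concrete, here is a minimal sketch of how navigation events could map to combined audio and haptic responses. The event names, spoken prompts, and vibration patterns are illustrative assumptions rather than Beacon's actual values; the point is that each event drives both channels, so neither audio nor haptics is the sole safety signal.

```python
# Illustrative event-to-feedback mapping; all values are assumptions, not Beacon's.
from dataclasses import dataclass

@dataclass
class Feedback:
    audio: str          # spoken prompt played through the glasses
    haptics: list[int]  # vibration pulse lengths in milliseconds

FEEDBACK = {
    "turn_left":      Feedback("Turn left ahead", [100]),
    "turn_right":     Feedback("Turn right ahead", [100, 100]),
    "obstacle_ahead": Feedback("Obstacle ahead, slow down", [300, 300, 300]),
    "arrived":        Feedback("You have arrived", [500]),
}

def announce(event: str) -> Feedback:
    """Look up the paired audio + haptic response for a navigation event."""
    return FEEDBACK[event]

cue = announce("obstacle_ahead")
print(cue.audio, cue.haptics)  # -> Obstacle ahead, slow down [300, 300, 300]
```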
How Beacon came together
Need-finding
- Desk research
- Observational research
- Problem definition
Ideation
- Brainstorming
- Concept development
- Competitive analysis
Prototyping
- Beacon hardware design
- Beacon app design
- Interactive prototypes
Testing & Evaluation
- Concept validation
- Design critique
- Design iterations
Understanding the challenges users face
We started with a broad question: "How might we use technology to empower blind and low vision individuals?" Research helped us narrow down the problem space.
Desk Research
We found a staggering reality: 80% of our perception relies on sight. For people living with vision loss, everyday movement can be difficult and unsafe, limiting both independence and social life.
Observational study
We conducted an observational study with six participants navigating while blindfolded. We quickly saw how mobility became a problem for them.
Key insight: Navigation emerged as the most critical challenge. With our scope narrowed, the question became: How might we empower visually impaired users to navigate independently?
Exploring concepts for real-time navigation guidance
Based on our research, we explored three concepts and evaluated them against four key requirements. Smart glasses with a companion mobile app met all four criteria.
| Concept | Voice interaction | Audio guidance | Obstacle detection | Spatial awareness |
|---|---|---|---|---|
| Smart glasses + app | ✓ | ✓ | ✓ | ✓ |
| Smart cane with sensors | | | | |
| Mobile audio app | | | | |
What interaction pattern works best for users?
With smart glasses selected, we prototyped to figure out the interaction pattern and form factor that would work best for users. Drawing on patterns we were already familiar with, we designed the first iteration, but design critique revealed three key issues.
First iteration

Glasses with sensors and earbuds
Our initial design of the glasses featured earbuds that extended from the temples. Design critique revealed that the earbuds blocked ambient noise, which users actually rely on for safer navigation.
One critical issue emerged with the glasses
The earbuds blocked noise from the surrounding environment

Mobile companion app
The first iteration of the app had a visual onboarding flow, touch controls, and a voice-enabled navigation assistant.
With small touch targets and text-heavy onboarding, the design felt as if it were made for sighted users
The entry point for the "Voice Assistant" feature was unclear, and the app gave little feedback
Informed by the critique, we iterated on our designs to address these issues
Second iteration
Embedded speakers in the glasses' temples
We replaced the earbuds with speakers embedded in the temples of the glasses, keeping users' ears open to ambient sound for safety

Redesigned for a voice-driven interaction model
We also redesigned the interface around a voice-driven interaction pattern and removed all onboarding, giving users immediate access to navigation guidance
Incorporated clear audio navigation guidance, haptic feedback, and spatial alerts
We redesigned for much safer navigation by including clear audio directions, haptic feedback (phone vibrations) to confirm turns and signal obstacles, and spatial cues that help users understand their surroundings
Real-time navigation guidance for blind and low vision users
Through further design evaluation, we learned that clear audio navigation guidance isn't enough. In our final design, audio and haptic feedback work together to enhance safety.
The smart glasses detect obstacles and deliver all directional guidance.
Audio, haptic, and spatial feedback all work together to help users move around safely.
Design principles for assistive navigation
Our design work surfaced three principles for assistive navigation that prioritize safety and independence.
Use multiple forms of feedback
Provide audio, spatial, and tactile feedback for greater accessibility.
Let users hear surrounding noise
Allow users to hear surrounding noise for additional safety.
Give users advance warning
Alert users about obstacles in advance so they have time to respond safely, as sketched below.
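As a back-of-the-envelope illustration of that last principle, the sketch below times alerts by estimated time to reach an obstacle rather than raw distance, so a faster walker is warned earlier. The four-second lead time and the walking speeds are assumptions for illustration, not values from our design.

```python
# Hypothetical advance-warning logic; the threshold is an assumed reaction time.
WARNING_SECONDS = 4.0  # assumed lead time a user needs to react

def should_alert(distance_m: float, walking_speed_mps: float) -> bool:
    """Alert once the obstacle falls within the reaction-time horizon."""
    if walking_speed_mps <= 0:
        return False  # standing still: no collision course
    time_to_obstacle = distance_m / walking_speed_mps
    return time_to_obstacle <= WARNING_SECONDS

# At a typical 1.4 m/s pace, an obstacle 5 m away triggers an alert
# (5 / 1.4 ≈ 3.6 s); at a slow 0.8 m/s it does not yet (5 / 0.8 ≈ 6.3 s).
print(should_alert(5.0, 1.4))  # -> True
print(should_alert(5.0, 0.8))  # -> False
```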
What did I learn from this work?
Working on Beacon taught me design lessons that are still valuable in my work today.
Validate early and often
Each iteration revealed gaps in our assumptions—what seemed obvious to us wasn't always what users needed.
Partner with users from day one
If I were to do this again, I'd partner with actual blind users right from the beginning. Blindfolded sighted proxies can't represent the lived, daily experience of vision loss.