Inspiration
Blind and visually impaired individuals often rely on tools that describe the world moment by moment, but few help them build a lasting mental map of their surroundings. We wanted to go beyond object recognition and create something that gives users spatial awareness and memory, not just descriptions. Blindsense was inspired by the idea that independence comes not just from knowing what is in front of you, but from understanding and remembering the space around you.
What it does
Blindsense is a wearable AI assistant that detects objects around a user, estimates their distance, and communicates that information through real-time text-to-speech. The app can also track an object’s location and then guide the user back to it.
How we built it
We built Blindsense as a mobile app using Expo (React Native) for rapid development and cross-platform flexibility.
The system integrates:
- Real-time object detection using computer vision
- Distance estimation from camera input
- Text-to-speech for audio feedback
- A lightweight spatial anchoring system to store object locations
The phone is mounted in a head-aligned VR-style headset to maintain consistent spatial orientation and hands-free operation.
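To make the detection-to-speech path concrete, here is a minimal sketch of how distance estimation and the spoken phrase could fit together. The helper names, the reference object heights, and the focal-length value are our illustrative assumptions, not Blindsense's actual implementation; it uses a simple pinhole-camera estimate from the detection's bounding-box height.

```typescript
// Assumed reference heights (meters) for common object classes.
const REAL_HEIGHTS_M: Record<string, number> = {
  bottle: 0.24,
  chair: 0.9,
  person: 1.7,
};

// Pinhole-camera estimate: distance = realHeight * focalPx / boxHeightPx.
// focalPx is an assumed camera focal length in pixels.
function estimateDistanceM(
  label: string,
  boxHeightPx: number,
  focalPx: number = 1000
): number | null {
  const realH = REAL_HEIGHTS_M[label];
  if (realH === undefined || boxHeightPx <= 0) return null;
  return (realH * focalPx) / boxHeightPx;
}

// Format a short spoken phrase; in an Expo app this string could be
// passed to Speech.speak() from expo-speech.
function describeDetection(label: string, boxHeightPx: number): string {
  const d = estimateDistanceM(label, boxHeightPx);
  if (d === null) return `${label} ahead`;
  return `${label}, about ${d.toFixed(1)} meters ahead`;
}

console.log(describeDetection("bottle", 200));
```

Keeping the phrase short matters: as noted under challenges, audio feedback has to stay informative without overwhelming the user.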
Challenges we ran into
- Designing a no-touch activation system that works in loud environments
- Working within iOS hardware button limitations
- Balancing real-time detection with performance constraints
- Making audio feedback informative without overwhelming the user
- Handling spatial anchoring in a way that feels intuitive and reliable
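One way to sketch the spatial-anchoring idea described above: store each anchor as a compass heading plus distance captured at save time, and turn guidance into a relative-turn instruction. All names and the heading-based representation here are our assumptions for illustration, not the app's real API.

```typescript
// Minimal spatial-anchor sketch. Each anchor records the compass bearing
// (degrees, 0 = north) and distance at which the object was seen.
interface Anchor {
  headingDeg: number;
  distanceM: number;
}

const anchors = new Map<string, Anchor>();

function saveAnchor(label: string, headingDeg: number, distanceM: number): void {
  anchors.set(label, { headingDeg, distanceM });
}

// Signed turn in (-180, 180]: positive = turn right, negative = turn left.
function relativeTurnDeg(currentHeadingDeg: number, targetHeadingDeg: number): number {
  let diff = (targetHeadingDeg - currentHeadingDeg) % 360;
  if (diff > 180) diff -= 360;
  if (diff <= -180) diff += 360;
  return diff;
}

// Spoken guidance back to a saved anchor; the string would go to TTS.
function guideTo(label: string, currentHeadingDeg: number): string {
  const a = anchors.get(label);
  if (!a) return `No anchor saved for ${label}`;
  const turn = relativeTurnDeg(currentHeadingDeg, a.headingDeg);
  const dir = turn >= 0 ? "right" : "left";
  return `Turn ${Math.abs(Math.round(turn))} degrees ${dir}, then about ${a.distanceM.toFixed(1)} meters`;
}

saveAnchor("keys", 90, 3.5);
console.log(guideTo("keys", 20)); // "Turn 70 degrees right, then about 3.5 meters"
```

A heading-plus-distance anchor is deliberately simple; it degrades once the user moves far from the save point, which is part of why anchoring that "feels intuitive and reliable" was a real challenge.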
Accomplishments that we're proud of
- Creating a working real-time object + distance detection pipeline
- Implementing voice-driven spatial memory anchors
- Designing a hands-free wearable experience
- Reframing assistive vision as spatial intelligence rather than simple object labeling
What we learned
- Accessibility design requires rethinking interaction from the ground up
- Spatial awareness is as important as object recognition
- Hardware constraints (like iOS limitations) shape product design more than expected
What's next for Blindsense
- Improve spatial accuracy and persistence of anchors
- Develop better non-visual spatial feedback (e.g., directional audio)
- Allow users to query the app with specific questions
- Explore integration with dedicated wearable hardware
- Conduct user testing with blind and visually impaired individuals
- Optimize performance for continuous, low-latency operation