AI-powered accessibility app for blind and visually impaired users. Navigate environments, read text, and get directions, all hands-free.
Built at Hack Harvard 2025
Aria transforms your iPhone camera into an intelligent assistant with three gesture-controlled modes:
- ✋ Environment Mode - Detects obstacles, describes surroundings, guides safe paths
- ✌️ Communication Mode - Reads text from signs, menus, labels, documents
- 🗺️ Navigation Mode - Turn-by-turn walking directions with voice guidance
All controlled hands-free with simple gestures. Show a fist (✊) to stop.
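The gesture-to-mode dispatch described above can be sketched as a small state machine. This is an illustrative sketch, not the app's actual types: `Gesture`, `AppMode`, and `next(after:)` are hypothetical names. Navigation is started from a button rather than a gesture, so it is omitted here.

```swift
// Hypothetical sketch of gesture-controlled mode switching.
// Names are illustrative; the real app's types may differ.
enum Gesture { case openPalm, peaceSign, fist }

enum AppMode {
    case idle, environment, reading

    /// Open palm and peace sign each select a mode;
    /// a fist always stops and returns to idle.
    func next(after gesture: Gesture) -> AppMode {
        switch gesture {
        case .openPalm:  return .environment
        case .peaceSign: return .reading
        case .fist:      return .idle
        }
    }
}
```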
- Gemini 2.0 Flash - Scene understanding & OCR
- ElevenLabs - Natural text-to-speech
- Google Maps API - Turn-by-turn navigation
- iOS Vision - Hand gesture recognition
- Swift/SwiftUI - Native iOS app
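As a sketch of how the Google Maps piece might be wired up, the snippet below builds a Directions API request URL for walking routes. The endpoint and query parameters follow Google's public Directions API; the helper name and argument values are illustrative, and error handling is minimal.

```swift
import Foundation

// Illustrative: build a Google Directions API request for walking
// directions. The endpoint and parameters follow the public API;
// origin/destination/key values are placeholders.
func directionsURL(origin: String, destination: String, apiKey: String) -> URL? {
    var components = URLComponents(string: "https://maps.googleapis.com/maps/api/directions/json")
    components?.queryItems = [
        URLQueryItem(name: "origin", value: origin),
        URLQueryItem(name: "destination", value: destination),
        URLQueryItem(name: "mode", value: "walking"),  // turn-by-turn walking
        URLQueryItem(name: "key", value: apiKey),
    ]
    return components?.url
}
```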
- Clone the repo:

```bash
git clone https://github.com/yourusername/aria.git
```

- Get API keys from:
- Google AI Studio (Gemini)
- ElevenLabs (TTS)
- Google Cloud Console (Maps - enable Directions API)
- Add keys to `Utilities/Constants.swift`:

```swift
static let geminiAPIKey = "YOUR_KEY"
static let elevenLabsAPIKey = "YOUR_KEY"
static let googleMapsAPIKey = "YOUR_KEY"
```

- Run on iPhone (requires a physical device for camera/GPS)
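Once the keys are in place, a minimal Gemini call might look like the sketch below. The endpoint and model name follow Google's public generateContent REST API for Gemini 2.0 Flash; the helper name and the one-line text payload are placeholders, not the app's actual request.

```swift
import Foundation

// Illustrative: form a Gemini generateContent request using the key
// from Constants.swift. The JSON body is a minimal text-only prompt.
func geminiRequest(prompt: String, apiKey: String) -> URLRequest? {
    let endpoint = "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash:generateContent?key=\(apiKey)"
    guard let url = URL(string: endpoint) else { return nil }
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    let body: [String: Any] = [
        "contents": [["parts": [["text": prompt]]]]
    ]
    request.httpBody = try? JSONSerialization.data(withJSONObject: body)
    return request
}
```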
Environment Mode: Show open palm ✋ → Move camera around → Hear obstacle descriptions
Reading Mode: Show peace sign ✌️ → Point at text → Hear it read aloud
Navigation: Tap Navigate button → Enter destination → Follow voice directions
Stop anytime: Show fist ✊