## HandAlphabet
### Inspiration
ASL is beautiful, but learning it shouldn't be this hard.
The deaf and hearing communities are separated by a gap that technology should have closed long ago. Hearing people have few practical ways to learn ASL: the options that exist are expensive and tedious, and there is no intuitive, fun way to practice.
HandAlphabet was built to close that gap: an app that makes learning ASL accessible, engaging, and genuinely fun, like learning your ABCs.
### What it does
HandAlphabet is an interactive ASL learning app for Apple Vision Pro.
See a 3D hand model for each ASL letter. Interact with it in real time. Learn like you're learning your ABCs.
Key features:
- Real-time hand interaction - See your hands and 3D models together
- Beautiful 3D models - 26 hand positions sculpted in Blender
- Interactive exploration - Rotate and explore each letter
- Hand position visibility - Know when your hands are in frame
- Instant visual feedback - Learn at your own pace
### How we built it
Technologies:
- Swift - App development
- SwiftUI - User interface
- RealityKit - 3D rendering on Vision Pro
- Blender - 3D hand model creation and optimization
- Python - Automated code generation
The process:
- Created 26 unique 3D hand models in Blender matching real ASL positions
- Optimized models for smooth Vision Pro performance
- Built a SwiftUI interface with interactive 3D model viewers (sketched after this list)
- Used Python scripts to auto-generate all 26 letter cells and 6 row layouts
- Implemented real-time hand interaction and visualization
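To make the interface and interaction steps concrete, here is a minimal sketch of what one interactive letter viewer might look like. It is an illustration under assumptions, not the project's actual code: the asset names (`HandA.usdz` through `HandZ.usdz`) and the type name are ours. `Model3D` loads the USDZ asynchronously, and a drag gesture drives rotation:

```swift
import SwiftUI
import RealityKit

// Minimal sketch of one interactive letter viewer.
// Asset names ("HandA"..."HandZ") are assumptions, not the real bundle names.
struct LetterViewer: View {
    let letter: String                     // e.g. "A"
    @State private var yaw: Angle = .zero  // rotation driven by the drag gesture

    var body: some View {
        Model3D(named: "Hand\(letter)") { model in
            model
                .resizable()
                .aspectRatio(contentMode: .fit)
                .rotation3DEffect(yaw, axis: (x: 0, y: 1, z: 0))
        } placeholder: {
            ProgressView()                 // shown while the USDZ loads
        }
        .gesture(
            DragGesture().onChanged { drag in
                // Map horizontal drag distance to rotation about the y-axis.
                yaw = .degrees(drag.translation.width)
            }
        )
    }
}
```

A 6-row grid of 26 such viewers then gives the alphabet screen described above.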
### Challenges we ran into
3D model optimization - Getting 26 hand models to run smoothly on Vision Pro required careful decimation and USDZ export optimization.
UI responsiveness - Building smooth interactions with 3D models required proper state management and async operations.
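As a hedged illustration of that pattern (the store and its names are ours, not the app's actual code), RealityKit's async `Entity(named:)` initializer keeps the heavy USDZ parsing off the view body, and caching the results in observable state keeps the grid responsive:

```swift
import SwiftUI
import RealityKit

// Illustrative async-loading store; class and property names are assumptions.
@MainActor
final class LetterModelStore: ObservableObject {
    @Published private(set) var entities: [String: Entity] = [:]

    func load(letter: String) async {
        guard entities[letter] == nil else { return }  // already cached
        do {
            // Entity(named:) is RealityKit's async, throwing USDZ loader.
            entities[letter] = try await Entity(named: "Hand\(letter)")
        } catch {
            print("Failed to load model for letter \(letter): \(error)")
        }
    }
}
```

A cell can kick off `store.load(letter:)` from a `.task` modifier and render the cached entity once it arrives.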
Code generation complexity - Auto-generating 32 Swift files (26 cells + 6 rows) required Python scripts that kept naming and cross-file references consistent.
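The generator itself isn't shown in this write-up, but each of the 26 generated cell files would look roughly like the following (the structure and names are our guesses; only the letter varies between files):

```swift
import SwiftUI

// LetterCellA.swift - illustrative example of one generated file.
// The Python generator stamps out 26 of these, substituting the letter
// into the type name, the viewer argument, and the label.
struct LetterCellA: View {
    var body: some View {
        VStack {
            LetterViewer(letter: "A")  // the viewer sketched earlier
            Text("A")
                .font(.largeTitle)
        }
        .padding()
    }
}
```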
Vision Pro constraints - Shipping a smooth immersive experience meant learning Vision Pro's rendering pipeline and optimizing everything around it.
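For readers new to visionOS, the basic structure we had to learn looks roughly like this (a sketch with illustrative identifiers, not the app's real scene graph): an `ImmersiveSpace` hosts a `RealityView`, whose async make closure populates the scene once, after which RealityKit renders it every frame:

```swift
import SwiftUI
import RealityKit

// Sketch of the visionOS scene structure; identifiers are illustrative.
@main
struct HandAlphabetApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()  // the letter-grid window
        }
        ImmersiveSpace(id: "HandSpace") {
            RealityView { content in
                // The make closure runs once (and is async), so model
                // loading here does not stall per-frame rendering.
                if let hand = try? await Entity(named: "HandA") {
                    content.add(hand)
                }
            }
        }
    }
}

struct ContentView: View {
    var body: some View { Text("HandAlphabet") }  // placeholder for the grid
}
```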
### Accomplishments that we're proud of
- Successfully built a smooth 3D hand model viewer
- Created intuitive UI with beautiful visual interactions
- Optimized 26 models for Vision Pro performance
- Auto-generated all Swift components with Python
- Made learning ASL accessible and engaging
### What we learned
Blender optimization - 3D model optimization is critical for smooth AR/VR performance.
Code automation - Python scripts can dramatically speed up development and reduce repetitive work.
Vision Pro UX - Immersive experiences need beautiful visuals and intuitive interactions to engage users.
Human-centered design - Building for accessibility and inclusion requires understanding the real needs of communities.
### What's next for HandAlphabet
- Full ASL vocabulary - Expand beyond letters to words and phrases
- Interactive lessons - Structured learning paths for progression
- Community features - Share progress and learn with others
- Multiplayer mode - Practice conversations with friends
- Accessibility - Add haptic feedback and audio guidance
- Multiple sign languages - Support for other sign languages worldwide

