A React Native app for expressive Augmentative and Alternative Communication (AAC). It combines tappable word tiles, AI-powered next-word suggestions, emotion-aware text-to-speech, a drawable board, and speech-to-tiles transcription, so non-verbal users can build and speak sentences visually.
- Speak: Build sentences with tiles, get AI next-word suggestions, and play them back with ElevenLabs text-to-speech, with selectable emotion. Camera-based emotion detection can also influence the spoken tone.
- Board: Free-form canvas to draw, place tiles/photos, and send the board to an LLM for interpretation suggestions.
- Transcribe: Live speech recognition (web) that turns transcripts into tile sequences, simplifies replies, and suggests responses.
- Symbols: Automatically pulls pictograms from OpenSymbols with caching.
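The pictogram caching above can be sketched as a small memoizing wrapper. This is an illustrative sketch, not the app's actual code: the fetcher is injected so it can be swapped for a mock, and the OpenSymbols URL shape shown in the comment is an assumption.

```typescript
// Sketch of a caching layer for OpenSymbols lookups. Concurrent calls
// for the same word share one in-flight request, and later calls hit
// the cache. The fetcher is injectable; in the real app it would call
// the OpenSymbols API with the EXPO_PUBLIC_OPENSYMBOLS_SECRET token.
type SymbolResult = { imageUrl: string };
type Fetcher = (word: string) => Promise<SymbolResult>;

const symbolCache = new Map<string, Promise<SymbolResult>>();

function getSymbol(word: string, fetcher: Fetcher): Promise<SymbolResult> {
  // Normalize so "Dog" and " dog" share a cache entry.
  const key = word.toLowerCase().trim();
  let hit = symbolCache.get(key);
  if (!hit) {
    hit = fetcher(key);
    symbolCache.set(key, hit);
  }
  return hit;
}
```

Caching the promise (rather than the resolved value) means two tiles requesting the same word at once still trigger only one network call.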
- Expo SDK, Expo Router, React Native
- ElevenLabs for TTS
- Gemini for suggestions and interpretation
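To show how emotion-aware TTS could plug into this stack, here is a hedged sketch of a request builder for the ElevenLabs text-to-speech endpoint. The URL, `xi-api-key` header, and `voice_settings` fields follow ElevenLabs' public REST API; the emotion-to-settings mapping, model choice, and helper name are illustrative assumptions, not the app's actual implementation.

```typescript
// Sketch: map a detected emotion onto ElevenLabs voice settings and
// assemble the POST request. Lower "stability" sounds livelier; higher
// "style" exaggerates delivery. The exact numbers are guesses to tune.
type Emotion = "neutral" | "happy" | "sad";

function buildTtsRequest(
  text: string,
  voiceId: string,
  apiKey: string,
  emotion: Emotion = "neutral",
) {
  const settings = {
    neutral: { stability: 0.5, similarity_boost: 0.75, style: 0.0 },
    happy: { stability: 0.35, similarity_boost: 0.75, style: 0.6 },
    sad: { stability: 0.7, similarity_boost: 0.75, style: 0.3 },
  }[emotion];
  return {
    url: `https://api.elevenlabs.io/v1/text-to-speech/${voiceId}`,
    method: "POST" as const,
    headers: { "xi-api-key": apiKey, "Content-Type": "application/json" },
    body: JSON.stringify({
      text,
      model_id: "eleven_multilingual_v2",
      voice_settings: settings,
    }),
  };
}
```

The returned object can be passed straight to `fetch(req.url, req)`; the response body is audio (e.g. MP3) to hand off to the player.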
- Node 18+ and npm
- Expo CLI (`npm i -g expo`), or use `npx expo`
- API keys (see below)
Create a `.env.local` (used by Expo web builds) or fill `expo.extra.apiKeys` in `app.json`. Required keys:
```
EXPO_PUBLIC_OPENROUTER_API_KEY=sk-or-...
EXPO_PUBLIC_ELEVENLABS_API_KEY=...
EXPO_PUBLIC_OPENSYMBOLS_SECRET=token::...
```
For mobile builds via EAS, also set the same keys under `expo.extra.apiKeys`, or configure them as secret environment variables in EAS (or Vercel for web deploys).
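Since keys can come from either source, a lookup helper can prefer the `EXPO_PUBLIC_*` env value (inlined by Expo at build time) and fall back to the `expo.extra.apiKeys` object (readable at runtime via `expo-constants`). This is a minimal sketch; the `ExtraKeys` field names and the helper itself are assumptions, not part of Expo or this repo.

```typescript
// Sketch: resolve one API key from an env var or the app.json fallback.
// In the app, `extra` would come from Constants.expoConfig?.extra?.apiKeys
// (expo-constants); here it is passed in so the logic stays testable.
type ExtraKeys = { openrouter?: string; elevenlabs?: string; opensymbols?: string };

function resolveKey(
  envValue: string | undefined,
  extra: ExtraKeys | undefined,
  name: keyof ExtraKeys,
): string {
  const key = envValue ?? extra?.[name];
  if (!key) {
    // Fail fast at startup rather than on the first API call.
    throw new Error(`Missing API key: ${name}`);
  }
  return key;
}
```

A call site might look like `resolveKey(process.env.EXPO_PUBLIC_ELEVENLABS_API_KEY, extra, "elevenlabs")`.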
```sh
npm install
npx expo start
```