xafn/communico

Communico

A React Native app for expressive Augmentative and Alternative Communication (AAC). It combines tappable word tiles, AI-powered next-word suggestions, emotion-aware text-to-speech, a drawable board, and speech-to-tiles transcription, so that non-verbal users can build and speak sentences visually.

What it does

  • Speak: Build sentences with tiles, get AI next-word suggestions, and play them back with ElevenLabs TTS, with adjustable emotion. Camera-based emotion detection can influence the spoken tone.
  • Board: Free-form canvas to draw, place tiles/photos, and send the board to an LLM for interpretation suggestions.
  • Transcribe: Live speech recognition (web) that turns transcripts into tile sequences, simplifies replies, and suggests responses.
  • Symbols: Automatically pulls pictograms from OpenSymbols with caching.
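The speech-to-tiles step in Transcribe can be pictured as a pure mapping from a transcript to a tile sequence. This is only a sketch of the idea, not the app's actual code; the `Tile` shape and `transcriptToTiles` name are hypothetical:

```typescript
// Hypothetical tile shape; the real app also attaches pictogram data.
interface Tile {
  label: string;
  symbolUrl?: string; // e.g. a cached OpenSymbols pictogram URL
}

// Sketch: normalize a spoken transcript into a sequence of word tiles.
export function transcriptToTiles(transcript: string): Tile[] {
  return transcript
    .toLowerCase()
    .replace(/[^\p{L}\p{N}\s']/gu, "") // strip punctuation, keep letters/digits/apostrophes
    .split(/\s+/)
    .filter(Boolean)
    .map((word) => ({ label: word }));
}
```

Each resulting tile could then be resolved to a pictogram via the OpenSymbols lookup before rendering.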

Tech stack

  • Expo SDK, Expo Router, React Native
  • ElevenLabs for TTS
  • Gemini for suggestions and interpretation
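To show how emotion can influence playback, here is a sketch of building an ElevenLabs TTS request. The endpoint path and `xi-api-key` header follow ElevenLabs' public REST API; the emotion-to-voice-settings mapping and the function name are illustrative assumptions, not the app's actual values:

```typescript
// Voice settings accepted by ElevenLabs' text-to-speech endpoint.
interface VoiceSettings {
  stability: number;
  similarity_boost: number;
  style: number;
}

// Sketch: map a detected emotion to voice settings and build the request.
// The specific numbers are made-up examples, not tuned values.
export function buildTtsRequest(
  text: string,
  emotion: "neutral" | "happy" | "sad",
  voiceId: string,
  apiKey: string,
) {
  const byEmotion: Record<string, VoiceSettings> = {
    neutral: { stability: 0.5, similarity_boost: 0.75, style: 0.0 },
    happy: { stability: 0.3, similarity_boost: 0.75, style: 0.6 },
    sad: { stability: 0.7, similarity_boost: 0.75, style: 0.4 },
  };
  return {
    url: `https://api.elevenlabs.io/v1/text-to-speech/${voiceId}`,
    headers: {
      "xi-api-key": apiKey,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ text, voice_settings: byEmotion[emotion] }),
  };
}
```

The returned object can be passed straight to `fetch(url, { method: "POST", headers, body })`; the response is audio bytes to play back.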

Prerequisites

  • Node 18+ and npm
  • Expo CLI (npm i -g expo), or use npx expo
  • API keys (see below)

Environment

Create a .env.local (used by Expo web builds) or fill expo.extra.apiKeys in app.json. Required keys:

EXPO_PUBLIC_OPENROUTER_API_KEY=sk-or-...
EXPO_PUBLIC_ELEVENLABS_API_KEY=...
EXPO_PUBLIC_OPENSYMBOLS_SECRET=token::...

For mobile builds via EAS, also set the same keys under expo.extra.apiKeys or configure them as secret env vars in EAS/Vercel.
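Since keys can come from either `EXPO_PUBLIC_*` env vars (inlined by Expo at build time) or `expo.extra.apiKeys`, resolution might look like the sketch below. The helper name and fallback shape are assumptions for illustration; in the app the `extra` object would come from `Constants.expoConfig?.extra?.apiKeys` (`expo-constants`):

```typescript
type ApiKeys = Record<string, string | undefined>;

// Sketch: resolve an API key, preferring inlined env vars and falling
// back to values configured under expo.extra.apiKeys in app.json.
export function resolveApiKey(
  name: string,
  env: ApiKeys = process.env as ApiKeys,
  extra: ApiKeys = {},
): string {
  const key = env[name] ?? extra[name];
  if (!key) {
    throw new Error(`Missing API key: ${name} (set it in .env.local or app.json)`);
  }
  return key;
}
```

Failing fast on a missing key keeps misconfiguration errors close to startup rather than surfacing as opaque 401s later.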

Run locally

npm install
npx expo start
