Inspiration
In the United States, 27 million people are uninsured.
In early 2026, federal Medicaid rollbacks left 4 million more people without coverage overnight.
But the problem isn’t that care doesn’t exist.
There are more than 14,000 federally qualified health center (FQHC) sites across the country, funded to treat patients regardless of ability to pay.
Most people who need them don’t know they exist.
If you’re uninsured and sick, you’re expected to:
- know what an FQHC is
- search for clinics online
- compare services, hours, and costs
- and figure it out yourself
All while dealing with pain, stress, or language barriers.
This isn’t a supply problem.
It’s an access problem.
Call2Well was built to close that gap.
What it does
Call2Well lets anyone call a phone number, describe their situation in plain language, and get connected to the nearest free or low-cost clinic in under 90 seconds.
No app.
No forms.
No insurance required.
Using Claude by Anthropic, the system:
- identifies urgency
- checks eligibility for FQHCs and Medicaid
- selects the most relevant clinic
- explains the recommendation in the caller’s language
If symptoms sound life-threatening, it immediately tells the caller to call 911.
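One way to make that emergency rule dependable is to run a lightweight guard before any model call, so obvious life-threatening language never depends on the model at all. A minimal sketch, assuming a keyword pre-check (the phrase list and function names here are illustrative, not the production logic):

```python
# Minimal emergency pre-check sketch: runs before any Claude call so that
# life-threatening language is never routed through clinic matching.
# The phrase list and function names are illustrative, not exhaustive.

EMERGENCY_PHRASES = (
    "chest pain", "can't breathe", "cannot breathe", "unconscious",
    "overdose", "severe bleeding", "stroke", "suicidal",
)

def is_emergency(transcript: str) -> bool:
    """Return True if the caller's words suggest a 911-level emergency."""
    text = transcript.lower()
    return any(phrase in text for phrase in EMERGENCY_PHRASES)

def triage(transcript: str) -> str:
    """Decide the next step for a single call turn."""
    if is_emergency(transcript):
        return "advise_911"        # tell the caller to hang up and dial 911
    return "continue_intake"       # safe to continue the Claude conversation
```

A deterministic guard like this complements, rather than replaces, the model's own urgency classification.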
Twilio handles the call itself, including real-time speech-to-text and text-to-speech.
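The handoff from an incoming call to that real-time pipeline happens in TwiML: Twilio fetches an XML response that connects the call to a WebSocket. A sketch of generating that response, assuming the `<Connect>`/`<ConversationRelay>` element names from Twilio's ConversationRelay TwiML (the WebSocket URL and greeting are placeholders):

```python
# Sketch of the TwiML that hands an incoming call to ConversationRelay.
# Element and attribute names follow Twilio's ConversationRelay TwiML
# (an assumption to verify against Twilio's docs); the wss:// URL is a
# placeholder for the FastAPI WebSocket endpoint.
import xml.etree.ElementTree as ET

def connect_call_twiml(ws_url: str) -> str:
    response = ET.Element("Response")
    connect = ET.SubElement(response, "Connect")
    ET.SubElement(connect, "ConversationRelay", {
        "url": ws_url,  # endpoint that receives transcripts, sends replies
        "welcomeGreeting": "Hi, tell me what you need help with.",
    })
    return ET.tostring(response, encoding="unicode")

# Twilio requests this XML when the call comes in:
print(connect_call_twiml("wss://example.org/relay"))
```

From that point on, speech arrives at the server as text messages over the WebSocket, and text sent back is spoken to the caller.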
How we built it
- Twilio ConversationRelay for real-time voice input/output over WebSocket
- Claude (Anthropic) for multi-turn conversation, multilingual support, and decision-making
- HRSA data for verified clinic locations and services
- Supabase (PostgreSQL) to store and rank clinics by distance, services, language, and cost
- FastAPI to manage orchestration and call state
- Next.js dashboard to visualize the pipeline during demos
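The ranking step can be sketched in plain Python; in production this would run as a SQL query in Supabase (PostgreSQL), and the field names (`lat`, `lon`, `services`, `languages`, `sliding_scale`) and scoring weights here are assumptions about the schema, not the real one:

```python
# Sketch of the clinic-ranking step: closer clinics score higher, and
# matching service, language, and sliding-scale cost add fixed boosts.
# Field names and weights are illustrative assumptions about the schema.
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3959 * 2 * asin(sqrt(a))

def rank_clinics(clinics, caller_lat, caller_lon, needed_service, language):
    """Return clinics best-first for this caller."""
    def score(c):
        s = -haversine_miles(caller_lat, caller_lon, c["lat"], c["lon"])
        s += 10 if needed_service in c["services"] else 0
        s += 5 if language in c["languages"] else 0
        s += 3 if c["sliding_scale"] else 0
        return s
    return sorted(clinics, key=score, reverse=True)
```

Keeping the scoring in one place makes it easy to tune the weights later without touching the call flow.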
Challenges we ran into
The main challenge was getting real-time voice interaction and AI reasoning to work reliably together.
We had to:
- maintain conversation state across WebSocket messages
- keep latency low while coordinating multiple systems
- constrain the model to avoid unsafe outputs (e.g., diagnoses, guarantees, missed emergencies)
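The state-tracking challenge above boils down to keying each conversation by its call, so every WebSocket message lands in the right history. A minimal sketch, assuming per-call state keyed by the Twilio call SID (class and field names are illustrative; the real FastAPI WebSocket handler would wrap something like this):

```python
# Minimal per-call state sketch for the WebSocket challenge above.
# Each call gets one CallState, keyed by its call SID, so every
# ConversationRelay message is appended to the right conversation.
# Class and field names are illustrative, not the production code.
from dataclasses import dataclass, field

@dataclass
class CallState:
    call_sid: str
    history: list = field(default_factory=list)  # Claude message turns
    stage: str = "intake"                        # intake -> matching -> done

    def add_turn(self, role: str, text: str):
        self.history.append({"role": role, "content": text})

class CallRegistry:
    """Holds live calls; the WebSocket handler looks state up here."""
    def __init__(self):
        self._calls = {}

    def get_or_create(self, call_sid: str) -> CallState:
        return self._calls.setdefault(call_sid, CallState(call_sid))

    def end(self, call_sid: str):
        self._calls.pop(call_sid, None)
```

With state isolated like this, the latency and safety work can happen per turn without any message ever crossing between calls.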
Accomplishments that we're proud of
- Built a fully voice-based system with no UI dependency
- Delivered real-time multilingual support without separate flows
- Connected users to actual clinics using verified government data
- Created a system that explains its recommendations rather than just returning results
What we learned
Voice is often a better interface than a UI in high-stress situations.
It removes barriers like typing, navigation, and language-specific flows.
We also learned that combining AI with real infrastructure (telephony + public datasets) makes it significantly more useful than a standalone model.
What's next for Call2Well
- Expand coverage to all 14,000+ FQHC sites nationwide
- Add appointment scheduling through the call
- Send follow-up SMS reminders
- Improve ranking and personalization over time
Built With
- claude
- fastapi
- next.js
- ngrok
- python
- react
- twilio-conversationrelay
- supabase
- twilio
- typescript