Inspiration

Every second counts in emergencies, but 911 call centers are often overwhelmed. We wanted to see if AI could act as an assistant, handling the first layer of communication, collecting critical details, and displaying them clearly to human operators. LifeLine was built to prove that AI can save time, save lives, and support first responders, not replace them.

What it does

LifeLine is an AI-powered 911 voice agent that answers emergency calls, asks key triage questions (“Where are you?”, “What’s happening?”, “How many people are hurt?”), and automatically summarizes the information on a real-time dashboard. It converts speech to text, analyzes it with an Amazon Bedrock (Claude 3 Haiku) model, and sends structured summaries (location, severity, people involved) to an operator dashboard.
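
The transcript-to-structured-summary step can be sketched like this. This is a minimal sketch, not the production code: the prompt wording, the JSON keys, and the helper names (`parse_summary`, `summarize_call`) are our illustrative assumptions; the Bedrock model ID and `invoke_model` call are real, but running it requires AWS credentials.

```python
import json

# Assumed prompt: ask Claude to return only the fields the dashboard needs.
TRIAGE_PROMPT = (
    "Extract the emergency details from this 911 transcript. Reply with JSON "
    'containing only the keys "location", "severity", and "people_involved".\n'
    "Transcript:\n{t}"
)

def parse_summary(model_text: str) -> dict:
    """Parse the model's JSON reply into the dashboard's structured summary."""
    data = json.loads(model_text)
    return {
        "location": data.get("location", "unknown"),
        "severity": data.get("severity", "unknown"),
        "people_involved": data.get("people_involved", 0),
    }

def summarize_call(transcript: str, region: str = "us-east-1") -> dict:
    """Send a call transcript to Claude 3 Haiku on Bedrock (needs AWS creds)."""
    import boto3  # imported here so parse_summary stays testable offline
    client = boto3.client("bedrock-runtime", region_name=region)
    resp = client.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",
        body=json.dumps({
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 300,
            "messages": [
                {"role": "user", "content": TRIAGE_PROMPT.format(t=transcript)}
            ],
        }),
    )
    payload = json.loads(resp["body"].read())
    return parse_summary(payload["content"][0]["text"])
```

Keeping the parsing separate from the network call makes the summary logic easy to test without touching AWS.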

How we built it

We built a FastAPI backend that connects to AWS Bedrock for natural language analysis. For the voice interface, we integrated Twilio to simulate emergency calls. Callers interact with a lifelike AI voice powered by Twilio Voice.
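
The Twilio-to-FastAPI handoff works by Twilio POSTing to a webhook and the backend replying with TwiML that speaks a question and gathers the caller's speech. A minimal sketch follows; the endpoint paths and the `triage_twiml` helper are illustrative assumptions, not the actual routes.

```python
from xml.sax.saxutils import escape

def triage_twiml(question: str, action_url: str = "/voice/answer") -> str:
    """Build a TwiML reply: say a triage question, then gather speech.

    Twilio transcribes the speech and POSTs the result to action_url.
    """
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        "<Response>"
        f'<Gather input="speech" action="{escape(action_url)}" method="POST">'
        f"<Say>{escape(question)}</Say>"
        "</Gather>"
        "</Response>"
    )

# In the FastAPI app (requires `fastapi`; route name assumed):
#   @app.post("/voice/incoming")
#   async def incoming():
#       return Response(triage_twiml("911, what is your emergency?"),
#                       media_type="application/xml")
```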

The frontend dashboard was built with HTML/CSS and JavaScript.

Challenges we ran into

  • First time using AWS Bedrock, so the entire service was new to us.
  • Integrating Twilio webhooks with FastAPI to handle live calls.
  • Designing a dashboard that updates in real time while keeping the UI simple and readable.
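
The real-time dashboard challenge above boils down to getting new summaries from the backend to the browser. One simple shape, sketched below under our own assumptions (the `CallFeed` class and the polled route are illustrative, not the actual implementation), is an in-memory feed that the frontend polls every few seconds.

```python
import threading
from collections import deque

class CallFeed:
    """Thread-safe in-memory feed the dashboard polls for call summaries."""

    def __init__(self, maxlen: int = 100):
        self._calls = deque(maxlen=maxlen)  # newest summary first
        self._lock = threading.Lock()

    def push(self, summary: dict) -> None:
        """Record a new structured summary as a call is processed."""
        with self._lock:
            self._calls.appendleft(summary)

    def latest(self, n: int = 20) -> list:
        """Return up to n of the most recent summaries, newest first."""
        with self._lock:
            return list(self._calls)[:n]

# A FastAPI route like `GET /api/calls` (name assumed) can return
# feed.latest(), and the dashboard's JavaScript can fetch() it on a timer.
```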

Accomplishments that we're proud of

Overall, we're proud that we were able to build a product that could be used for social good.

What we learned

We learned how to integrate voice systems (Twilio) with modern LLM APIs.

What's next for LifeLine

We plan to keep working on LifeLine after the hackathon to polish it and improve the calling experience.

Built With

  • FastAPI
  • Amazon Bedrock (Claude 3 Haiku)
  • Twilio Voice
  • HTML/CSS/JavaScript