Inspiration

Imagine a community struck by a devastating hurricane. Communication lines are down, power is out, and families are uncertain where to turn for help. This is the reality for countless people each year who find themselves in the path of natural disasters.

To bridge this gap, we developed AidLink: an AI-driven call service that automatically gathers crucial information from individuals in disaster-stricken areas. Anyone can connect with AidLink through a simple phone call: no app downloads, no internet required. AidLink guides the caller through a conversation, collecting their name, current location, and the specific resources they need, whether that's medical assistance, food, or temporary shelter.

What it does

AidLink allows first responders and other community members to see nearby requests for resources.

AidLink allows you to submit a request in two ways:

  1. You can call our number (646 755 9391) and speak to our AI agent Alyssa (powered by IBM Watson Assistant), who will collect information such as your name, general location, and situation. Alyssa then automatically pinpoints your location and analyzes the severity of your situation.
  2. You can create an account on the app and submit a request, where you specify the type of resources you need, your location, and the severity of your situation.

Alyssa can pinpoint your exact location from a single phrase like "I'm at Honors Village." This is done with the help of the Google Maps API.
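
The lookup can be sketched roughly like this. This is a minimal illustration, not AidLink's actual code: it assumes the Places API Text Search endpoint is queried with the caller's phrase and the top-ranked result is taken as the pinned location (the helper names are ours).

```typescript
// Minimal slice of the Google Places Text Search response format.
interface PlacesTextSearchResponse {
  status: string;
  results: Array<{
    name: string;
    geometry: { location: { lat: number; lng: number } };
  }>;
}

// Build a Text Search query URL from a spoken phrase like
// "I'm at Honors Village" (the API key is supplied by the caller).
function placesSearchUrl(phrase: string, apiKey: string): string {
  const params = new URLSearchParams({ query: phrase, key: apiKey });
  return `https://maps.googleapis.com/maps/api/place/textsearch/json?${params}`;
}

// Take the top-ranked result as the caller's pinned coordinates,
// or null when the search came back empty.
function topCoordinates(
  resp: PlacesTextSearchResponse
): { lat: number; lng: number } | null {
  if (resp.status !== "OK" || resp.results.length === 0) return null;
  return resp.results[0].geometry.location;
}
```

Relying on the search ranking keeps the phone flow simple: the caller never has to spell out an address, only name a place.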

After creating an account, you can see all the requests. Feel free to sign up here!

How we built it

We used Next.js to build the frontend React components, as well as an API route that serves as a webhook for Alyssa to send caller information to. All user and request information is stored in a PostgreSQL database.
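
The webhook path can be sketched as follows. This is a hedged illustration, not the team's actual code: the payload fields (`name`, `location`, `situation`) and the route path are assumptions. Validation is shown as a pure function so the handler can reject malformed bodies before anything touches the database.

```typescript
// Hypothetical shape of the JSON Alyssa POSTs to the webhook --
// the exact field names are an assumption for illustration.
interface CallerRequest {
  name: string;
  location: string;
  situation: string;
}

// Reject malformed webhook bodies before they reach the database.
function parseCallerRequest(body: unknown): CallerRequest | null {
  if (typeof body !== "object" || body === null) return null;
  const b = body as Record<string, unknown>;
  if (typeof b.name !== "string") return null;
  if (typeof b.location !== "string") return null;
  if (typeof b.situation !== "string") return null;
  return { name: b.name, location: b.location, situation: b.situation };
}

// In a Next.js API route (e.g. pages/api/webhook.ts) the handler
// would then look roughly like:
//
//   export default async function handler(req, res) {
//     if (req.method !== "POST") return res.status(405).end();
//     const parsed = parseCallerRequest(req.body);
//     if (!parsed) return res.status(400).json({ error: "bad payload" });
//     await pool.query(
//       "INSERT INTO requests (name, location, situation) VALUES ($1, $2, $3)",
//       [parsed.name, parsed.location, parsed.situation]
//     );
//     res.status(200).json({ ok: true });
//   }
```

Keeping the parsing separate from the handler also makes it trivial to unit-test without spinning up the server.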

Watson Assistant was used to power Alyssa, and IBM Granite instruct models were used to determine the severity of each caller's situation.
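
A severity check against an instruct model generally boils down to two small pieces: a prompt that constrains the model to a fixed label set, and a parser that pulls the label back out of free text. The sketch below is our own illustration of that pattern; the actual prompt AidLink sends to Granite, and the label set, are assumptions.

```typescript
type Severity = "LOW" | "MEDIUM" | "HIGH" | "CRITICAL";

// Hypothetical prompt for an instruct model: asking for exactly one
// label from a fixed set keeps the reply easy to parse.
function severityPrompt(situation: string): string {
  return [
    "Classify the severity of this disaster report as exactly one of:",
    "LOW, MEDIUM, HIGH, CRITICAL.",
    `Report: "${situation}"`,
    "Severity:",
  ].join("\n");
}

// Pull the label out of the model's free-text reply, defaulting to
// MEDIUM when the reply is ambiguous so a request is never dropped.
function parseSeverity(reply: string): Severity {
  const match = reply.toUpperCase().match(/\b(LOW|MEDIUM|HIGH|CRITICAL)\b/);
  return (match ? match[1] : "MEDIUM") as Severity;
}
```

Defaulting rather than erroring matters here: a garbled model reply should still surface the caller's request, just at a neutral severity.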

Challenges we ran into

Initially, we wanted to integrate Watson Assistant with the Twilio API to enable both voice call and SMS communication with Alyssa. However, Twilio required us to verify our phone number, which would have taken 3-5 business days. Luckily, IBM's Watson Assistant Plus trial gave us our own phone number out of the box without any verification, which let us get Alyssa up and running quickly.

We also ran into challenges getting the AI to analyze user information reliably. However, our team member from the Statistics department, who is majoring in Data Science, was able to tune the Watson model to respond to users more accurately.

Finally, we really wanted to train a custom model to pinpoint your location from simple phrases like "I'm at Library West." However, we were unable to find an adequate dataset of geocoded buildings in Gainesville, so we used the Google Places API instead, which worked really well for us!

Accomplishments that we're proud of

We're proud that we were able to build this app and effectively use IBM's Watson AI to solve a real problem. We're also proud that we got an accurate phone-based AI chatbot up and running for our app.

What we learned

We learned a lot about using IBM Watson AI to solve real problems. We also learned how to effectively make use of AI models, whether text-to-speech or instruct models such as IBM's Granite.

What's next for AidLink

We hope to continuously add more features that will help AidLink provide more value and help more people.
