Inspiration
Prior to the hackathon, our team hosted a Halloween movie night, where we stumbled upon a compelling concept for our project. We were watching the Netflix movie #Alive, where the main character, trapped in his apartment during a zombie apocalypse, uses social media to post about his situation, hoping to attract rescue. This sparked an idea: what if we adapted this concept to a real-world application? Put simply, we envisioned an app that would facilitate real-time crowd-sourced incident tracking for natural disaster scenarios. By enabling people in danger to report their locations and situations, our app aims to create an accessible communication channel for those in distress, even in challenging conditions. This could streamline rescue efforts, foster intercommunity cooperation, and ultimately achieve the most salient impact that any climate-focused application could have: saving lives.
What it does
Our application empowers users to document their surroundings during natural disasters (in the Florida area for now) by capturing a photo and entering their address. This input is then processed using IBM’s Watson AI to assess the severity of the threat on a scale from 1 to 100. Each user-generated report is automatically plotted on a color-coded map, with markers reflecting the intensity of the threat. This visual representation creates a clear, real-time view of the most affected areas. Each image and threat score is also posted to a continuously updated thread, enabling both first responders and community members to monitor escalating danger zones with immediate situational awareness.
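As a rough illustration of the color-coded map described above, here is a minimal Python sketch of how a 1–100 threat score could be bucketed into marker colors. The thresholds and the `score_to_color` helper are our own illustrative choices, not the exact logic of our deployed app:

```python
def score_to_color(score: int) -> str:
    """Bucket a 1-100 threat score into a map-marker color.

    Thresholds here are illustrative; the real score comes from the
    Watson-based image assessment.
    """
    if score >= 75:
        return "red"      # severe: needs immediate attention
    if score >= 40:
        return "orange"   # moderate threat
    return "green"        # low threat

# Example: a report scored 88 would appear as a red marker
print(score_to_color(88))   # red
```

In the app itself, each marker's color is looked up this way before the report is plotted on the shared map, so viewers can scan for red clusters at a glance.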
In addition, we added a text bot that lets users reply to others' reports and foster a sense of community. It is not our main feature, but an auxiliary one that complements the image thread at the core of the app.
By prioritizing image-based reports over traditional text entries, our application provides users with a more intuitive and impactful experience. This choice is not just about practicality; it's about creating an immersive and compelling way to communicate the reality of these events. Viewing an image of a disaster-stricken area conveys a depth of understanding and urgency that words alone often cannot. Images show the scale of the destruction, the severity of the crisis, and the human impact behind each report, making each post resonate more strongly with both the responders and the public.
The app’s integration of real images achieves two main objectives: it mobilizes rescue efforts by providing first responders with a detailed view of the areas requiring the most immediate action, and it raises broader awareness by showcasing the true impact of these disasters to those outside the affected areas. By bridging the gap between data and lived experience, our platform offers more than just information; it shares stories, fosters empathy, and drives a collective response. Users aren’t simply seeing numbers and statistics; they’re witnessing the urgent realities of natural disasters and becoming part of a community prepared to act. This approach makes our application not only a tool for survival but also a powerful means of fostering awareness and inspiring action.
How we built it
Our team, comprising several computer science students and a data science student with a focus on human resources, collaborated to design an interdisciplinary application informed by data-driven insights on marginalized communities frequently affected by natural disasters in Florida. Using datasets analyzed in Jupyter Notebooks, we conducted a comprehensive assessment of these communities to understand their unique vulnerabilities. This background research shaped our development approach, ensuring our application prioritized support for those most in need.
To build the generative image-processing model, we leveraged IBM Watson’s AI to assess and categorize threat levels based on user-submitted images. For data management, we integrated Supabase to securely store address and user location information. We used a variety of Python libraries, including OpenCV for image processing, GeoPandas for geospatial analysis, and Seaborn for data visualization, to develop interactive mapping features. These tools collectively enhanced our app’s functionality, creating an efficient system for real-time situational awareness and targeted first-response support for vulnerable communities.
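To make the storage side concrete, here is a hedged sketch of the kind of record each report produces. The `IncidentReport` dataclass, its field names, and the `make_report` helper are hypothetical stand-ins for our actual Supabase schema, shown only to convey the shape of the data:

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class IncidentReport:
    """Illustrative shape of one crowd-sourced report (not the exact schema)."""
    address: str        # user-entered street address
    lat: float          # latitude resolved from the address
    lon: float          # longitude resolved from the address
    image_path: str     # reference to the uploaded photo
    threat_score: int   # 1-100, assigned by the Watson-based model
    created_at: str     # UTC timestamp, ISO 8601

def make_report(address, lat, lon, image_path, threat_score):
    """Assemble a report record ready to be persisted."""
    return IncidentReport(
        address=address,
        lat=lat,
        lon=lon,
        image_path=image_path,
        threat_score=threat_score,
        created_at=datetime.now(timezone.utc).isoformat(),
    )

# With the real backend, a dict like asdict(report) would be inserted
# into a Supabase table and then drawn on the GeoPandas map.
report = make_report("100 Ocean Dr, Miami, FL", 25.76, -80.19, "flood.jpg", 88)
```

Keeping the record flat like this makes it easy both to insert into a database table and to feed directly into the geospatial plotting step.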
Challenges we ran into
We ran into multiple challenges while building our application. First, Streamlit's limited frontend functionality made it difficult to customize the interface the way we wanted. Second, IBM Watson and Jupyter Notebooks were new to us, and we had to learn them quickly to build a functional AI model. Lastly, figuring out how to realize our idea while integrating all the necessary technologies was a significant challenge in itself.
Accomplishments that we're proud of
We’re especially proud of the dedication and teamwork we put into making this app functional. Despite the challenges we faced, we remained committed to our vision, knowing this application could be truly impactful for people affected by natural disasters. We were determined to see it through, even when things got difficult, because we believed in its potential not just as a concept, but as a practical tool that could aid in users' survival.
What we learned
We learned the basics of generative AI and web development, while also gaining awareness of natural disasters in the United States and the efforts involved in responding to them.
What's next for #Alive
The next phase for the #Alive application is to develop and deploy a mobile version, making it accessible for users directly on their smartphones. This mobile deployment would streamline the reporting process, allowing individuals to quickly capture images and share real-time updates with just a few taps, regardless of their location. By integrating mobile features such as GPS tracking and push notifications, we aim to further enhance the responsiveness and reach of the app, enabling quicker response times and better situational awareness for both users in distress and first responders.
Built With
- geopandas
- ibm-watson
- jupyter
- opencv
- python
- scikit
- seaborn
- streamlit