Inspiration

A big issue with our current systems for staying informed about natural disasters is that they put heavy emphasis on major events, like nationwide power outages, severe hurricanes, and strong earthquakes, while smaller, localized disasters, like bomb cyclones, mudslides, and avalanches, go underreported when they happen outside someone's local sphere.

DisastEarth shows global disasters, no matter how big or small, keeping both residents of disaster-prone areas and disaster relief organizations like the Red Cross and FEMA up to date.

What it does

DisastEarth can be split into 3 core features with different functionalities:

Live Globe: The globe and its text boxes sync live to the DynamoDB database and display the most recent events. Points are plotted across the globe from estimated geolocation data, and each point is interactive, revealing details such as the type of disaster and the estimated number of people affected.
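The DynamoDB-to-globe sync above can be sketched roughly as below. This is a minimal illustration, not the project's actual code: the table name `DisasterEvents` and the item field names (`lat`, `lon`, `disaster_type`, `people_affected`) are assumptions for the example.

```python
def items_to_points(items):
    """Convert raw DynamoDB event items into plot-ready globe points.
    Field names here are hypothetical, chosen for illustration."""
    points = []
    for item in items:
        points.append({
            "lat": float(item["lat"]),
            "lon": float(item["lon"]),
            "type": item.get("disaster_type", "unknown"),
            "affected": int(item.get("people_affected", 0)),
        })
    return points

def fetch_live_points(table_name="DisasterEvents"):
    """Scan the (assumed) events table and return points for the globe.
    Requires AWS credentials; table name is an assumption."""
    import boto3
    table = boto3.resource("dynamodb").Table(table_name)
    resp = table.scan()
    return items_to_points(resp["Items"])
```

A frontend poller could call `fetch_live_points` on an interval (or behind API Gateway + Lambda, matching the stack below) so the globe always reflects the latest rows.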

Sentiment-Based Needs: Amazon Bedrock is the core of the sentiment analysis. It extracts key details from articles, scans for keywords about the disaster, and identifies the needs and items of support that the affected area requires.
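One way this Bedrock call might look is sketched below. It is a hedged example, not the project's implementation: the prompt wording, the JSON keys (`type`, `severity`, `needs`), and the model ID are all assumptions for illustration.

```python
import json

# Hypothetical prompt asking the model for structured output.
PROMPT_TEMPLATE = (
    "Extract the disaster type, severity, and a list of needed supplies "
    "from this article. Respond as JSON with keys 'type', 'severity', "
    "and 'needs':\n\n{article}"
)

def parse_needs(model_output: str) -> dict:
    """Pull the JSON object out of the model's free-text response."""
    start = model_output.find("{")
    end = model_output.rfind("}") + 1
    data = json.loads(model_output[start:end])
    return {
        "type": data.get("type", "unknown"),
        "severity": data.get("severity", "unknown"),
        "needs": data.get("needs", []),
    }

def analyze_article(article_text, model_id="anthropic.claude-3-haiku-20240307-v1:0"):
    """Send the article to Bedrock and parse the structured needs.
    Requires AWS credentials; model ID is an assumption."""
    import boto3
    client = boto3.client("bedrock-runtime")
    body = json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [{"role": "user",
                      "content": PROMPT_TEMPLATE.format(article=article_text)}],
    })
    resp = client.invoke_model(modelId=model_id, body=body)
    text = json.loads(resp["body"].read())["content"][0]["text"]
    return parse_needs(text)
```

Keeping the parsing in its own function makes it easy to reject malformed model output before anything reaches the database.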

Statistics and News: We scraped historical news articles from GNews.io, analyzed them, and extracted important details such as the title, summary, and URL. We then extrapolated key values such as how much help the area needed, what kind of help was needed, and how confident the LLM was in its assessment.
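The scraping step can be sketched against GNews's public v4 search endpoint. This is a simplified example, not the project's pipeline: the query term and which response fields are kept are choices made for illustration.

```python
import json
import urllib.parse
import urllib.request

def extract_article_fields(api_response: dict) -> list:
    """Keep only the fields we store per article: title, summary, URL."""
    return [
        {"title": a["title"], "summary": a.get("description", ""), "url": a["url"]}
        for a in api_response.get("articles", [])
    ]

def search_gnews(query: str, api_key: str) -> list:
    """Query the GNews v4 search endpoint and return trimmed articles."""
    params = urllib.parse.urlencode({"q": query, "lang": "en", "apikey": api_key})
    with urllib.request.urlopen(f"https://gnews.io/api/v4/search?{params}") as r:
        return extract_article_fields(json.load(r))
```

Each trimmed article would then be handed to the LLM step above to estimate the kind and amount of help needed, along with a confidence score.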

How we built it

React (Next.js), TailwindCSS, Python, Amazon SageMaker, Amazon Bedrock, Amazon API Gateway, AWS Lambda, Amazon DynamoDB, Gemini API

Challenges we ran into

Working with a really large image dataset, since it took ages to download, verify, re-upload, train on, and then run inference with. Getting SageMaker to work was also a huge issue, and we kept running into quota limits. We also had to pivot from X posts to news APIs due to the X API's horrible rate limits.

Accomplishments that we're proud of

Working with half the expected team and still getting most of our plan done! Budgeting our time well!

What we learned

On the technical side, we learned a lot about IAM and how to manage different policies across AWS services. With respect to soft skills, we learned patience when dealing with circumstances that did not go our way.

What's next for DisastEarth

Get live data, not only historical data.
Get a live rendering of the views on the globe.
Optimize prompts and add more constraints and guardrails to prevent hallucinations as much as possible.
Get SageMaker to work in order to optimize hyperparameters.
Optimize the frontend to follow coding conventions and style guidelines.
Create scripts that automatically run web scrapers daily on predetermined news sites, looking for disasters.
Add filters for disasters (size, type, people affected, location, etc.).
