Inspiration
We were looking for websites that could guide or help flood victims, but found none. The recent flood crisis in Gurgaon drew our attention to the problem and motivated us to build a solution. Our site brings community support sections, guiding helplines, and donations together in one place.
What it does
It has features to predict floods, guide and rescue people, and raise awareness across India about the likelihood of flooding. Users can request help from a volunteer, apply as a volunteer (applications are stored in a database), access national helplines, and predict the likelihood of a flood in their state. We also provide satellite images, from which the flood likelihood can be assessed.
The dataset
Our goal was to predict floods from weather data using machine learning. For the dataset, we first scraped the website http://floodlist.com/tag/india using the Python BeautifulSoup 4 library. This website provided us with information about past and current floods in India, as well as their dates and locations. We also applied several data augmentation techniques to this dataset, which significantly increased the diversity of data available for training our model without actually collecting new data.
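The scraping step can be sketched as below. The `<h2>` headline selector is an assumption about FloodList's page layout, so the parser is shown running on a small sample string rather than the live site:

```python
import requests
from bs4 import BeautifulSoup

def extract_flood_headlines(html):
    """Collect headline text from a FloodList listing page.
    The <h2> selector is an assumption about the page layout."""
    soup = BeautifulSoup(html, "html.parser")
    return [h.get_text(strip=True) for h in soup.find_all("h2")]

# Live use (network access required):
#   html = requests.get("http://floodlist.com/tag/india").text
sample = "<html><body><h2>India - Floods Hit Gurgaon</h2></body></html>"
headlines = extract_flood_headlines(sample)
```

Each headline would then be parsed for its date and location before being added to the dataset.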
ML model
Our machine learning model is based on the Python scikit-learn library. We used pandas to generate a DataFrame for the dataset. After experimenting heavily with many models, the Random Forest Classifier gave us the highest accuracy, 98.71% on the test set. We saved our model in a pickle file.
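A minimal sketch of this train-and-pickle pipeline, using synthetic stand-in features since the real weather dataset is not included here:

```python
import pickle
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the weather dataset (columns might be
# rainfall, humidity, temperature, etc. in the real data)
rng = np.random.default_rng(0)
X = rng.random((500, 4))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)  # toy "flood / no flood" label

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
acc = clf.score(X_test, y_test)

# Persist the trained model to a pickle file, as in the project
with open("flood_model.pkl", "wb") as f:
    pickle.dump(clf, f)

# Later, the web app can reload it for predictions
with open("flood_model.pkl", "rb") as f:
    loaded = pickle.load(f)
```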
Data visualization
We first obtained a dataset of the major cities and towns in India (around 200 of them) along with their latitude, longitude, and population. Next, we plotted the data from the model on various types of maps using Plotly Chart Studio. The maps represent data such as flood predictions, precipitation analysis, and damage estimates in the form of scatter plots, heat maps, and bubble plots. The damage estimates were calculated from the flood prediction and the population. We also produced geo-referenced satellite images for various cities in India, based on data retrieved from NASA's Global Precipitation Measurement project.
Front-end and hosting
Our web app is based on the Flask Python framework. We rendered HTML templates – with CSS for styling and JavaScript for added functionality – and integrated them with our machine-learning models and datasets via the Flask back end.
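A minimal sketch of how a Flask route can feed model output into a rendered template. The route name and the inline template are illustrative, and the hard-coded risk value stands in for a call to the pickled model:

```python
from flask import Flask, render_template_string

app = Flask(__name__)

# Inline template as a stand-in for the project's HTML template files
PAGE = "<h1>FloodCare</h1><p>Flood risk for {{ state }}: {{ risk }}</p>"

@app.route("/predict/<state>")
def predict(state):
    risk = "high"  # placeholder; the real app queries the pickled model here
    return render_template_string(PAGE, state=state, risk=risk)

# Exercise the route without running a server
client = app.test_client()
resp = client.get("/predict/Haryana")
```

A real deployment would use `render_template` with template files and serve the app behind a WSGI server.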
Flask integration with the backend
We built a Flask backend to save volunteers' data in a MySQL database, and we used Google Forms to save victims' data in a spreadsheet.
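The volunteer-saving endpoint can be sketched as below. SQLite stands in for MySQL here so the example runs without a database server; the real backend would issue the same `INSERT` through a MySQL connector, and the route path and column names are assumptions:

```python
import sqlite3
from flask import Flask, request, jsonify

# In-memory SQLite as a stand-in for the project's MySQL database
conn = sqlite3.connect(":memory:", check_same_thread=False)
conn.execute("CREATE TABLE volunteers (name TEXT, phone TEXT, state TEXT)")

app = Flask(__name__)

@app.route("/volunteer", methods=["POST"])
def add_volunteer():
    data = request.get_json()
    conn.execute("INSERT INTO volunteers VALUES (?, ?, ?)",
                 (data["name"], data["phone"], data["state"]))
    conn.commit()
    return jsonify({"status": "saved"}), 201

# Simulate a volunteer application form submission
client = app.test_client()
resp = client.post("/volunteer",
                   json={"name": "Asha", "phone": "123", "state": "Haryana"})
count = conn.execute("SELECT COUNT(*) FROM volunteers").fetchone()[0]
```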
Challenges we ran into
Our biggest challenge was mining and collecting data to build our models and data visualizations. Given the extremely limited data available for floods and water-related factors in India, scraping quality data was difficult. We used a combination of weather APIs and scraping techniques to compile an accurate and effective dataset. We also struggled with integrating the plots into our web application, as it was our first time working with Plotly. Lastly, we faced many git merge conflicts caused by differing CSV encodings and pickle versions across computer platforms.
Accomplishments that we're proud of
We are extremely proud of compiling a dataset that can predict the future likelihood of floods. We expanded our machine learning skills by testing out new models and ultimately implementing a model with over 98% accuracy that accurately reflected the current flood situation in India, while also allowing us to make future predictions.
We are happy to have combined data augmentation, data mining, and data manipulation techniques with our model to create detailed and sophisticated, yet compelling and easy-to-understand data visualizations. Additionally, the community feature for helping victims and volunteering during a flood makes the website user-friendly.
Victims of a crisis can check the do's and don'ts when flooding is likely, as well as the helpline numbers that guide them if they are in urgent need of help.
It also has a camp guiding feature which can be used to locate the nearest refugee camps.
What we learned
First and foremost, we learned a great deal about web scraping and data mining through collecting data from websites as well as APIs. We also learned techniques to manipulate and augment that data to create more effective datasets that yield better visualizations. Finally, we learned how to integrate a Flask API with MySQL to save volunteers' data.
What's next for 556_FloodCare
- Expanding it to more countries
- Making the machine-learning prediction model more accurate
- There are barely any APIs available for national or state-wise refugee camp data, so we want to build one so that anyone can easily find refugee camps in their state
- The API we use to predict a state's flood risk is old and therefore does not yield accurate results; this issue could easily be mitigated by greater access to data
Built With
- api
- backend
- beautiful-soup
- css
- data
- figma
- flask
- frontend
- html
- javascript
- machine-learning
- scikit-learn
- sql