Inspiration

Something that inspired me was a CNBC video titled "How The Massive Power Draw Of Generative AI Is Overtaxing Our Grid." It helped me understand the massive impact that data centers can have on our current infrastructure and their large-scale environmental implications.

What it does

The ML model analyzes simulated training data to identify weekly and yearly patterns and predict how many users will log on to the internet at a given time. Data centers can then switch off servers that would otherwise sit idle and power them back on when demand is expected to rise. The model also lets companies schedule server maintenance during periods of low demand so that the user experience doesn't suffer.
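The scheduling idea above can be sketched in a few lines: given a forecast of users online, compute how many servers to keep powered on each hour. The capacity figure and function name here are hypothetical, not part of the actual project.

```python
# Sketch: turn a predicted user count into a number of active servers,
# so idle machines can be powered down during low-demand hours.
import math

USERS_PER_SERVER = 500  # assumed capacity of one server (hypothetical)
MIN_SERVERS = 1         # always keep at least one server online

def servers_needed(predicted_users: int) -> int:
    """Servers to keep online for a forecast demand level."""
    return max(MIN_SERVERS, math.ceil(predicted_users / USERS_PER_SERVER))

# Example hourly forecast of users expected to be online:
forecast = [120, 80, 60, 2400, 5100, 3300]
schedule = [servers_needed(u) for u in forecast]
print(schedule)  # [1, 1, 1, 5, 11, 7]
```

Servers above the scheduled count for an hour could be shut down or reserved for maintenance windows.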

How we built it

I built the project in Google Colab using Python and Python libraries such as Prophet (for time-series forecasting), Pandas (for data handling), and Matplotlib (for data visualization).

Challenges we ran into

Lack of real-life training data: training data for the number of users of a particular service is not available online. I therefore had Gemini generate a Python script that produces simulated data to train the model.

Lack of machine-learning experience: I had little knowledge of machine-learning principles before this hackathon. However, YouTube videos and Gemini helped me learn the core concepts and the Python libraries used for ML.
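The actual Gemini-generated script isn't shown here, but a minimal sketch of simulating hourly user counts with daily, weekly, and yearly cycles (all amplitudes and the baseline are made-up numbers) could look like this:

```python
# Sketch: generate simulated hourly "users online" data with daily,
# weekly, and yearly cycles plus noise, in the two-column (ds, y)
# format that Prophet expects for training.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
ds = pd.date_range("2023-01-01", periods=24 * 365, freq="h")
t = np.arange(len(ds))

daily = 1000 * np.sin(2 * np.pi * t / 24)          # day/night cycle
weekly = 400 * np.sin(2 * np.pi * t / (24 * 7))    # weekday/weekend cycle
yearly = 600 * np.sin(2 * np.pi * t / (24 * 365))  # seasonal cycle
noise = rng.normal(0, 100, len(ds))                # random fluctuation

y = 5000 + daily + weekly + yearly + noise         # baseline plus cycles
df = pd.DataFrame({"ds": ds, "y": y.clip(min=0)})  # no negative user counts
print(df.head())
```

A dataframe in this `ds`/`y` shape can be passed directly to Prophet's `fit` method.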

Accomplishments that we're proud of

I was able to build an ML model that predicts, with uncertainty intervals, the number of users who log in at a given time.

What we learned

Time series models
The Prophet, Pandas, and Matplotlib libraries
How to smooth data using a rolling mean
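The rolling-mean smoothing mentioned above is a one-liner in pandas; the window size here is just an illustrative choice, not the one used in the project.

```python
# Sketch: smooth a noisy series with a centered 3-point rolling mean,
# which damps the spike at index 4 while preserving the overall level.
import pandas as pd

s = pd.Series([10, 12, 9, 11, 30, 10, 11, 12])   # spike at index 4
smoothed = s.rolling(window=3, center=True).mean()
print(smoothed.tolist())
```

The endpoints come out as NaN because a centered window of 3 has no full neighborhood there; `min_periods=1` would fill them if needed.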

What's next for Sleepify

Build a better user interface
Add functionality to predict further into the future

Built With

Python, Prophet, Pandas, Matplotlib, Google Colab
