Inspiration
With AI's rapid growth, the use of tools like ChatGPT and Gemini has increased exponentially, and the environmental toll of large language models (LLMs) is becoming more apparent. Image uploads are especially costly: they consume many tokens, driving up energy use and carbon emissions that, over time, are detrimental to the environment.
EcoTokens is our solution to this problem. Our Google Chrome extension helps cut down on the usage of tokens, leading to lower emissions levels and energy savings!
What it does
After installation, EcoTokens automatically detects which AI tool you are using, such as ChatGPT or Gemini, and identifies when you upload an image. Before you press "submit," EcoTokens extracts the image and compresses it by up to 90% of its original size, so the AI tool processes the compressed version instead. Afterward, clicking the extension icon displays a report showing how many tokens were saved, along with the estimated reduction in carbon emissions.
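The savings report boils down to simple arithmetic on the size difference between the original and compressed images. A minimal sketch of how such an estimate could be computed is below; the tokens-per-kilobyte and grams-of-CO2-per-token figures are placeholder assumptions for illustration, not the extension's measured values.

```javascript
// Hypothetical helper behind the report shown when the user clicks the icon.
// ASSUMPTION: tokensPerKB and gramsCO2PerToken are illustrative constants,
// not real measurements of any specific LLM.
function savingsReport(originalBytes, compressedBytes, tokensPerKB = 25, gramsCO2PerToken = 0.002) {
  const savedKB = (originalBytes - compressedBytes) / 1024;
  const tokensSaved = Math.round(savedKB * tokensPerKB);
  const gramsCO2Saved = tokensSaved * gramsCO2PerToken;
  return { tokensSaved, gramsCO2Saved };
}
```

For example, shrinking a 100 KB upload to 10 KB under these assumed rates would report 2,250 tokens saved.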
How we built it
We used Fourier transforms, implemented in JavaScript, to compress images and files, then built the Chrome extension and its front end with Node.js and React. We knew the extension had to work at zero time or effort cost to the user, so it captures the upload event while you're on a popular LLM website, automatically takes the image as input, and returns the compressed image as output. The compression is tuned so that it has no impact on the LLM's performance.
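The interception step can be sketched as a content script that listens for file-input changes on supported sites. This is a minimal illustration, assuming a hypothetical site list and size threshold; the actual extension's configuration and compression call are not shown.

```javascript
// Content-script sketch. ASSUMPTIONS: the host list and 50 KB threshold are
// illustrative, and compressImage() stands in for the real compression step.
const SUPPORTED_HOSTS = ["chatgpt.com", "gemini.google.com"];

function isSupportedHost(hostname) {
  return SUPPORTED_HOSTS.some((h) => hostname === h || hostname.endsWith("." + h));
}

function shouldCompress(file, minBytes = 50 * 1024) {
  // Only intercept images large enough for compression to matter.
  return file.type.startsWith("image/") && file.size > minBytes;
}

// In the real extension this runs in the page: watch for file-input changes
// and swap in the compressed image before the prompt is submitted.
if (typeof document !== "undefined") {
  document.addEventListener(
    "change",
    (event) => {
      const input = event.target;
      if (input.matches?.('input[type="file"]') && isSupportedHost(location.hostname)) {
        for (const file of input.files) {
          if (shouldCompress(file)) {
            // compressImage(file) would return the smaller replacement image.
          }
        }
      }
    },
    true // capture phase, so we see the event before the page's own handlers
  );
}
```

Listening in the capture phase lets the extension see the upload before the site's own handlers process it.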
Challenges we ran into
Originally, our compression script was written in Python. However, calling it from our React app produced delayed results and faulty compression. We converted all of the code to JavaScript by hand, without libraries like NumPy to help us.
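The kind of routine that had to be hand-ported is the discrete Fourier transform, which NumPy provides for free via `np.fft.fft`. A minimal sketch of a naive 1-D DFT in plain JavaScript is below; it illustrates the porting work rather than reproducing the extension's exact code.

```javascript
// Naive O(N^2) 1-D discrete Fourier transform, written without any numeric
// library. Illustrative of the manual NumPy-to-JavaScript port, not the
// extension's actual implementation.
function dft(signal) {
  const N = signal.length;
  const re = new Array(N).fill(0);
  const im = new Array(N).fill(0);
  for (let k = 0; k < N; k++) {
    for (let n = 0; n < N; n++) {
      const angle = (-2 * Math.PI * k * n) / N;
      re[k] += signal[n] * Math.cos(angle); // real part of e^{-2πikn/N}
      im[k] += signal[n] * Math.sin(angle); // imaginary part
    }
  }
  return { re, im };
}
```

For a constant signal, all the energy lands in the zero-frequency bin, which is a quick sanity check on the port.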
Accomplishments that we're proud of
We’re proud of being able to find an opportunity to decarbonise in something that’s becoming rapidly popular. As AI continues to take over the world and become synonymous with our workflow and automation pipelines, it’s important to understand and introspect upon its environmental impact. Not only does our extension make a difference, user by user, and prompt by prompt, but it raises awareness for the minute digital activities we partake in that have a lasting impact on our carbon footprints.
What we learned
We learnt how to find optimisation in the smallest of openings. Many of our group members did not know React or how to build a Chrome extension, so there was a steep learning curve within a 24-36 hour window. We're genuinely proud of the technologies we've learnt.
Aside from the technical experience, we learnt a lot about how LLMs work and how much energy they consume. Scouring academic papers to establish a genuinely strong use case also helped us understand the mathematics of energy, carbon, and LLM tokens.
What's next for EcoTokens
We are looking to branch out into numerous file-type compressions as well as prompt compressions/suggestions. This further reduces the number of tokens we utilize, and thus our carbon footprint. Small steps make a big difference.