Inspiration
My teammate and I were discussing environmental issues we could realistically address in today’s world, and the idea of AI usage came up. Since AI tools are becoming part of everyday life, we started thinking about their hidden environmental cost. We learned that longer prompts require more computation, which increases energy use and water consumption in data centers. That led us to build a tool that encourages more efficient AI use without changing how people work.
What it does
GreenPrompts is a browser extension that shortens overly verbose AI prompts before they’re sent. It removes unnecessary wording while preserving the original intent, reducing token usage. The extension also shows users an estimate of the tokens, energy, and water saved, making AI’s environmental impact more visible.
How we built it
We built GreenPrompts as a Chrome browser extension using JavaScript, HTML, and CSS. A content script runs on the page to detect the currently active text field and applies rule-based prompt compression locally before the prompt is submitted. Token usage is estimated based on prompt length, and the extension converts token savings into approximate energy and water savings, which are shown in the pop-up. All processing happens on the user’s device, with settings saved using Chrome’s extension storage.
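The core compression and estimation logic can be sketched roughly as follows. This is a minimal illustration, not our full rule set: the filler patterns, the ~4-characters-per-token heuristic, and the per-token energy and water figures are all illustrative assumptions, since real values vary widely by model and data center.

```javascript
// Illustrative filler patterns; the real extension uses a larger rule set.
const FILLER_PATTERNS = [
  /\bplease\b/gi,
  /\bcould you\b/gi,
  /\bI was wondering if\b/gi,
  /\bkind of\b/gi,
];

// Strip filler phrases, then collapse leftover whitespace.
function compressPrompt(prompt) {
  let out = prompt;
  for (const pattern of FILLER_PATTERNS) {
    out = out.replace(pattern, "");
  }
  return out.replace(/\s+/g, " ").trim();
}

// Rough token estimate: ~4 characters per token is a common heuristic
// for English text (an assumption, not an exact tokenizer).
function estimateTokens(text) {
  return Math.ceil(text.length / 4);
}

// Placeholder per-token figures for illustration only.
const JOULES_PER_TOKEN = 0.3;
const ML_WATER_PER_TOKEN = 0.05;

// Convert token savings into approximate energy and water savings.
function estimateSavings(original, compressed) {
  const saved = Math.max(0, estimateTokens(original) - estimateTokens(compressed));
  return {
    tokensSaved: saved,
    joulesSaved: saved * JOULES_PER_TOKEN,
    mlWaterSaved: saved * ML_WATER_PER_TOKEN,
  };
}
```

For example, `compressPrompt("Could you please summarize this?")` yields `"summarize this?"`, and `estimateSavings` reports the resulting token, energy, and water deltas for the popup.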
Challenges we ran into
One challenge was shortening prompts without changing their meaning or removing important constraints. We also had to be careful not to oversell our environmental estimates, since actual energy and water usage varies widely between data centers.
Accomplishments that we're proud of
We’re proud that GreenPrompts works as a real, usable tool rather than just a concept. It integrates directly into existing workflows, runs locally, and turns an invisible environmental cost into something users can actually see. We’re also proud of how accessible and easy it is to use.
What we learned
We learned more about the environmental footprint of AI systems and how small design choices can add up at scale. On the technical side, we learned how to build a Chrome extension from scratch, including working with content scripts, passing messages between components, estimating tokens, and writing rule-based logic to shorten prompts while keeping their original meaning.
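The message passing we mention works along these lines: the content script computes savings and sends them to the popup for display. The message shape and handler below are our own hypothetical example; `chrome.runtime.sendMessage` and `chrome.runtime.onMessage` are the standard Chrome extension APIs.

```javascript
// Hypothetical reducer the popup could use to fold a savings message
// into its display state (message shape is our own convention).
function handleMessage(message, state) {
  if (message.type === "SAVINGS_UPDATE") {
    return { ...state, tokensSaved: state.tokensSaved + message.tokensSaved };
  }
  return state;
}

// In the extension itself, the content script would send:
//   chrome.runtime.sendMessage({ type: "SAVINGS_UPDATE", tokensSaved: 12 });
// and the popup would listen with:
//   chrome.runtime.onMessage.addListener((msg) => {
//     state = handleMessage(msg, state);
//   });
```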
What's next for GreenPrompts
Next, we want to make GreenPrompts even more proactive by automatically detecting and removing unnecessary filler messages like “thank you,” “okay,” or other non-informative prompts before they’re sent. The goal is to gently discourage sending prompts that don’t add value, helping users avoid unnecessary token usage altogether. We also want to continue improving prompt compression and give users more control over how aggressive it is.
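A first pass at the filler-message detection described above might look like this. The phrase list is a hypothetical starting point; a real version would need to handle more variations and avoid false positives.

```javascript
// Hypothetical list of non-informative prompts worth flagging before send.
const FILLER_MESSAGES = new Set(["thank you", "thanks", "okay", "ok", "got it"]);

// Normalize (trim, lowercase, strip trailing punctuation) and check the set.
function isFillerPrompt(prompt) {
  const normalized = prompt.trim().toLowerCase().replace(/[.!?]+$/, "");
  return FILLER_MESSAGES.has(normalized);
}
```

Rather than blocking these outright, the extension could surface a gentle nudge when `isFillerPrompt` returns true, leaving the final decision to the user.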