Inspiration

We wanted to save money and water by reducing the token usage of AI agents.

What it does

Reduces token usage for AI prompts by simplifying and compressing their language.

How we built it

We used Tregex and Tsurgeon (from the Stanford NLP library) for sentence simplification and regex matching for compression. We then used JTokkit (a tokenizer library for Java) to quantify the token counts before and after.
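The regex-compression step can be sketched as below. This is a minimal, self-contained illustration: the patterns shown (collapsing whitespace, shortening wordy phrases like "in order to") are hypothetical examples, not our exact rule set, and the JTokkit token count is replaced by a rough word-count proxy to keep the sketch dependency-free.

```java
import java.util.regex.Pattern;

// Hypothetical sketch of regex-based prompt compression.
public class PromptCompressor {

    // Collapse any run of whitespace into a single space.
    private static final Pattern WHITESPACE = Pattern.compile("\\s+");
    // Replace a few wordy phrases with shorter equivalents (illustrative rules).
    private static final Pattern IN_ORDER_TO = Pattern.compile("\\bin order to\\b");
    private static final Pattern DUE_TO_FACT = Pattern.compile("\\bdue to the fact that\\b");

    public static String compress(String prompt) {
        String s = WHITESPACE.matcher(prompt.trim()).replaceAll(" ");
        s = IN_ORDER_TO.matcher(s).replaceAll("to");
        s = DUE_TO_FACT.matcher(s).replaceAll("because");
        return s;
    }

    public static void main(String[] args) {
        String before = "Please,  in order to   help me, summarize this "
                + "due to the fact that it is long.";
        String after = compress(before);
        System.out.println(after);
        // Word count as a rough stand-in for the JTokkit token count.
        System.out.println(before.split("\\s+").length
                + " words -> " + after.split("\\s+").length + " words");
    }
}
```

In the real pipeline, each rewritten prompt would be run through a JTokkit encoder to measure the actual token savings rather than word counts.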

Challenges we ran into

Choosing the tech stack and APIs was hard, as was figuring out how to clean the strings, since none of us has a linguistics background.

Accomplishments that we're proud of

We managed to get a front end working!

What we learned

We learned a lot about workflow and about using APIs and AI agents.

What's next for AI-Prompt-Token-Investigator

Feeding the simplified prompts to an AI and seeing the tokens it spits back out!

Built With

java, jtokkit, stanford-nlp
