Inspiration

Most issues with AI output can be solved by providing a better prompt. Most users don't know how to do that, so we built a service that lets them prompt like an engineer.

What it does

MetaPrompt takes your prompt and improves it so that an LLM produces a reply closer to what you actually want.

How we built it

We start with a POST request that receives the prompt from the user.

We then extract the context, topic, and subtopic from the prompt and feed it to the respective LLM.

Finally, we send the response to the user as a stream.
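The three steps above can be sketched roughly as follows. This is a minimal illustration, not our actual code: the function names (`extract_topics`, `rewrite_prompt`, `handle_request`), the routing table, and the keyword-based classifier are all hypothetical stand-ins for the real topic extraction and LLM calls.

```python
from typing import Iterator, Tuple

# Hypothetical topic -> model routing table (placeholder names).
MODEL_BY_TOPIC = {
    "code": "code-model",
    "writing": "writing-model",
}
DEFAULT_MODEL = "general-model"

def extract_topics(prompt: str) -> Tuple[str, str]:
    """Stub classifier: derive (topic, subtopic) from the prompt.
    A real system would use an LLM or trained classifier here."""
    if "bug" in prompt or "function" in prompt:
        return ("code", "debugging")
    return ("writing", "general")

def rewrite_prompt(prompt: str, topic: str, subtopic: str) -> str:
    """Stub rewriter: add structure that the target LLM handles well."""
    return (
        f"Topic: {topic} / {subtopic}\n"
        f"Task: {prompt}\n"
        "Answer step by step and state your assumptions."
    )

def handle_request(prompt: str) -> Iterator[str]:
    """Simulate the POST handler: extract topics, route, stream the reply."""
    topic, subtopic = extract_topics(prompt)
    model = MODEL_BY_TOPIC.get(topic, DEFAULT_MODEL)
    improved = rewrite_prompt(prompt, topic, subtopic)
    # Stream the reply back in small chunks, as the real service does.
    reply = f"[{model}] {improved}"
    for i in range(0, len(reply), 16):
        yield reply[i : i + 16]

if __name__ == "__main__":
    print("".join(handle_request("Fix this bug in my function")))
```

In the real service the stubbed pieces are replaced by LLM calls, and the generator is wired to an HTTP streaming response.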

Challenges we ran into

Our original idea was to create a service that would choose the best LLM for each prompt. But due to the lack of relevant data and benchmarking statistics available to the public, it was infeasible.

So we went with our current idea, but we still faced challenges, as different LLMs respond differently to the same prompt.

It took a lot of trial and error to build a working product.

Accomplishments that we're proud of

We were able to research and figure out the direction we needed to go to solve our first issue. Some of our team members were first-time hackathon participants, and we're proud that they were able to come up with an idea and build something.

What we learned

We definitely weren't familiar with all the ins and outs of AI. We did a lot of research and learned a lot about different LLMs: what they're good at and what they're bad at.

What's next for MetaPrompt

We will keep working on the app and pursue our original idea of routing each prompt to the best LLM.
