Inspiration

Whenever we wanted to do extra practice questions before a test, beyond what the teacher assigned, there weren't good online resources. Many of our friends and classmates had the same problem, so we set out to solve it with AI.

What it does

Our app, QuestionCraft, lets users submit a multiple-choice question with four answer choices, along with its topic. The model takes this input and generates a new question similar in style, difficulty, and topic to the one provided.
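Conceptually, a request/response pair might look like the following; the field names and the example question are hypothetical, not the app's actual schema:

```python
import json

# Hypothetical request: the user's example question, its four answer
# choices, and the topic it belongs to.
request = {
    "topic": "photosynthesis",
    "question": "Which gas do plants absorb during photosynthesis?",
    "choices": ["Oxygen", "Carbon dioxide", "Nitrogen", "Helium"],
}

# Hypothetical response: a new question in the same style, difficulty,
# and topic, again with four choices and a marked correct answer.
response = {
    "question": "Which pigment captures light energy in photosynthesis?",
    "choices": ["Chlorophyll", "Keratin", "Hemoglobin", "Melanin"],
    "answer": "Chlorophyll",
}

print(json.dumps(response, indent=2))
```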

How we built it

We built an HTML/JS/CSS frontend, using SCSS to give the site a cleaner, more AI-styled look. We used Redis to store user information and the ngrok URL pointing to the ML backend, and deployed these services with Vercel. On the ML side, we used the Llama-2 13B model from Hugging Face for generation, hosted the model on Kaggle, and used Tavily to give it up-to-date, accurate information for the questions it generated.
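Llama-2's chat variants expect prompts wrapped in the `[INST]`/`<<SYS>>` template. A minimal sketch of assembling such a prompt; the system text and example question are illustrative, not our exact prompt:

```python
def build_llama2_prompt(system: str, user: str) -> str:
    """Wrap a system message and user message in Llama-2-chat's
    [INST] prompt template."""
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

# Illustrative system instruction and example question.
SYSTEM = "You write one multiple-choice question with four answer choices."
example = "Topic: algebra. Question: What is 2 + 2? Choices: 3, 4, 5, 6."

prompt = build_llama2_prompt(SYSTEM, f"Generate a similar question.\n{example}")
print(prompt)
```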

Challenges we ran into

The first challenge was the model itself. We wanted to use the GPT API, but none of us had free credits, so we switched to open-source models. The next problem was inference time: the model was very slow, so we used LangChain LLM chains to streamline inference. Finally, Kaggle doesn't allow SSH access, which made hosting our model a challenge; we tried a multitude of cloud services, but in the end had to use Kaggle.
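The chaining idea (fill a prompt template, pass it to the model, return the result) can be sketched in plain Python without the LangChain dependency; `fake_llm` here is a stand-in for the real Llama-2 pipeline call, not our actual code:

```python
# Template with named slots, in the spirit of a LangChain PromptTemplate.
TEMPLATE = "Write a {topic} question similar to: {question}"

def fake_llm(prompt: str) -> str:
    # Placeholder for the slow HuggingFace pipeline call.
    return f"[model output for: {prompt}]"

def run_chain(topic: str, question: str) -> str:
    """Fill the template, then run the model on it -- one 'chain' step."""
    prompt = TEMPLATE.format(topic=topic, question=question)
    return fake_llm(prompt)

print(run_chain("biology", "Which organelle produces ATP?"))
```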

Accomplishments that we're proud of

We are proud of building a full-stack AI app, complete with a solid frontend, backend, and ML infrastructure. It took a lot of work to get there, and in the end we're really happy we got to experiment with styling and open-source models.

What we learned

We learned about the Hugging Face open-source model hub and how to access and run its models. We also learned about LangChain and how to connect it to a Hugging Face pipeline, and how to use the Llama model's prompting templates, together with an accurate search engine, to improve our model's accuracy. We also learned how cloud services work, even though we didn't use one in the final product.
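The search-augmentation pattern boils down to prepending retrieved snippets to the prompt so the model sees current facts. A minimal sketch; `search` is a stand-in for a Tavily API call, and the snippet text is invented for illustration:

```python
def search(query: str) -> list[str]:
    # Stand-in for a real web-search call (e.g. Tavily); returns
    # short factual snippets relevant to the query.
    return ["Photosynthesis converts CO2 and water into glucose."]

def augmented_prompt(topic: str, question: str) -> str:
    """Prepend search snippets as context before the generation request."""
    context = "\n".join(search(topic))
    return f"Context:\n{context}\n\nWrite a question similar to: {question}"

print(augmented_prompt("photosynthesis", "Which gas do plants absorb?"))
```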

What's next for QuestionCraft

We plan to fine-tune the Llama model on questions scraped from Khan Academy exercises. We also want to use cloud computing to build infrastructure stable enough for real-world usage.
