Inspiration

We wanted to build a learning tool that is genuinely fun and interactive, more engaging than a lecture or even a video.

What it does

VibeLearn transforms any lesson plan into a voice-guided, interactive learning experience. Students hear natural narration, work through AI-generated hands-on activities, and ask questions via text or voice. Teachers receive comprehension reports showing exactly where students excelled or struggled.

How we built it

We orchestrated three AI agents using Claude's tool-calling architecture: ElevenLabs delivers voice narration, Cline generates custom React components, and TinyFish enriches the lesson plan with online material. A FastAPI backend serves a React frontend.
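The core of the tool-calling pattern is a dispatcher that routes each tool call from the orchestrating model to the right agent. A minimal sketch is below; the tool names and stub handlers are illustrative, not our production code (the real handlers call ElevenLabs, Cline, and TinyFish):

```python
# Minimal sketch of the tool-dispatch pattern behind the orchestrator.
# Handler bodies are stubs standing in for the real agent integrations.

def narrate(text: str) -> dict:
    # Production version calls ElevenLabs text-to-speech.
    return {"audio_url": f"stub://narration/{abs(hash(text)) % 1000}"}

def generate_activity(topic: str) -> dict:
    # Production version invokes Cline to generate a React component.
    return {"component": f"<Activity topic={topic!r} />"}

def enrich_lesson(topic: str) -> dict:
    # Production version calls TinyFish to fetch online material.
    return {"resources": [f"article about {topic}"]}

# Registry the orchestrator consults when Claude emits a tool call.
TOOLS = {
    "narrate": narrate,
    "generate_activity": generate_activity,
    "enrich_lesson": enrich_lesson,
}

def dispatch(tool_name: str, arguments: dict) -> dict:
    """Route a single tool call to its handler, or report an unknown tool."""
    handler = TOOLS.get(tool_name)
    if handler is None:
        return {"error": f"unknown tool: {tool_name}"}
    return handler(**arguments)
```

In the real loop, each dispatch result is sent back to Claude as a tool result so it can decide the next step of the lesson.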

Challenges we ran into

We had to balance rich interactivity with generation speed to keep the experience responsive; code generation for live demos takes a while. Merging all of our tools into one repository was also challenging.

Accomplishments that we're proud of

We built a workflow that goes from a single topic and a set of resources to a full lesson plan. We also figured out how to execute a coding agent from an orchestration agent and display its results.
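Executing a coding agent from the orchestrator boils down to spawning it as a subprocess and capturing its output. The sketch below is a simplified stand-in: the function name, its signature, and the `echo` command used in the example are all hypothetical, and the real Cline invocation differs:

```python
import subprocess

def run_coding_agent(prompt: str, command: list[str]) -> str:
    """Spawn a coding agent as a subprocess and return its stdout.

    `command` is the agent's CLI invocation; the prompt is appended as
    the final argument. This is a sketch, not the real Cline call.
    """
    result = subprocess.run(
        command + [prompt],
        capture_output=True,  # collect stdout so it can be shown in the UI
        text=True,
        timeout=120,          # don't let a stuck agent hang the lesson
        check=True,           # raise if the agent exits with an error
    )
    return result.stdout

# Example with `echo` standing in for the real agent binary:
output = run_coding_agent("build a quiz component", ["echo"])
```

The orchestrator then feeds the captured output back into the lesson view so students see the generated component without leaving the experience.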

What we learned

Multi-agent orchestration is powerful but requires careful coordination. Tool-calling patterns beat monolithic prompts. Voice-first design dramatically improves accessibility. The right agent for each task outperforms one agent doing everything.

What's next for VibeLearn

Multi-student classrooms, curriculum alignment, progress tracking over time, parent dashboards, and LMS integrations. We want every teacher to have an AI teaching assistant.
