Inspiration

As college students, we constantly switch between media - solving problems in handwritten iPad notes, then jumping into AI tools like ChatGPT for guidance. That back-and-forth breaks flow and loses context. With the recent release of Gemini 3.0 Pro and Nano Banana Pro, we wondered: what if your notes themselves could become a virtual, AI-powered study partner?

What it does

Collaboard is an AI study partner that lives inside your handwritten iPad notes. It watches your problem-solving flow and steps in at the right moments - adding the first line of an integral in natural handwriting, sketching a diagram, structuring a table, or nudging you in the right direction with a hint - so you stay in the zone without switching tools or thinking about context.

What makes Collaboard different is its fusion of cutting-edge LLM reasoning with generative image models. Because it can both think and draw, it produces adaptive visuals - diagrams, graphs, tables, labeled sketches - that match your notes in real time. Applying generative imaging to academics expands what's possible far beyond calculator-style helpers: turning rough sketches into clean circuits, free-body diagrams, proof outlines, flowcharts, and more. The combination makes the creative and analytical space feel nearly limitless.

How we built it

We used Next.js for the web framework, Supabase for the database, TailwindCSS + shadcn/ui for components, and AI SDK + OpenRouter to access AI models. For the canvas, we chose tldraw for its modularity. We validated key capabilities early by testing Gemini 3.0 Pro and Nano Banana Pro via OpenRouter, then locked scope and built toward a focused MVP.
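To give a flavor of the model-access layer, here is a minimal TypeScript sketch of building a request for OpenRouter's OpenAI-compatible chat endpoint. This is an illustration rather than our actual code: the model ID `google/gemini-3-pro-preview` and the system prompt are assumptions, and in practice we call models through the AI SDK rather than raw fetch.

```typescript
// Sketch: building a chat request for OpenRouter's OpenAI-compatible API.
// The model ID below is a placeholder, not a confirmed OpenRouter identifier.

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

interface ChatRequest {
  model: string;
  messages: ChatMessage[];
}

const OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions";

// Build the JSON body for a tutoring-style completion request.
function buildChatRequest(
  userNote: string,
  model = "google/gemini-3-pro-preview" // assumed model ID
): ChatRequest {
  return {
    model,
    messages: [
      {
        role: "system",
        content: "You are a study partner embedded in handwritten notes.",
      },
      { role: "user", content: userNote },
    ],
  };
}

// Sending it would look roughly like this (needs an API key, omitted here):
// await fetch(OPENROUTER_URL, {
//   method: "POST",
//   headers: { Authorization: `Bearer ${key}`, "Content-Type": "application/json" },
//   body: JSON.stringify(buildChatRequest("Give me a hint for this integral.")),
// });
```

Keeping the request construction as a pure function like this made it easy to test prompting changes without touching the network layer.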

Challenges we ran into

After implementing AI “autocomplete,” we added a voice agent so you could converse with your notes - request edits, get feedback, and see live updates. That ambition cost time we could have spent on debugging and refining our pitch. Balancing scope against new features was a real challenge.

Accomplishments that we’re proud of

Even if current model costs make consumer rollout tough, the experience feels magical: collaborating with AI directly in your handwritten space. It’s far more intuitive and engaging than bouncing between separate tools and formats.

What we learned

  • Define scope early, and leave real time for debugging.
  • Expect to underestimate timelines - things always take longer than you think.
  • Filter out flashy features and focus on the core, differentiating value.

What’s next for Collaboard

We’ll continue refining, prototyping, and exploring cost-efficient pathways. The interaction model shows real promise for students and anyone who learns or thinks in a visual format. We believe we’ve tapped into something really special, and we’re excited for people to try it.

Built With

  • Next.js
  • Supabase
  • TailwindCSS
  • shadcn/ui
  • tldraw
  • AI SDK + OpenRouter