Inspiration

We wanted to make chatbots feel more interactive for learning. The idea was to combine chat explanations with live visual state changes so users can watch algorithms and data structures evolve step by step.

What it does

This repo is a chat-based data structures and algorithms (DSA) visualizer. It also includes a Learning Mode that generates questions and feedback based on the user's query.

How we built it

We built it with Next.js.

  • The frontend chat sends prompts to API routes for explanation, routing, and visualization.
  • A routing step picks which rendering pipeline should be used.
  • The rendering pipeline normalizes input, calls Watsonx/OpenRouter, validates output against strict Zod schemas, repairs invalid specs, and then renders via a component registry.
  • The DSA pipeline returns a dsaupdate payload that drives the UI state.
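The registry step above can be sketched in TypeScript. The spec shape and renderer names (ArrayView, TreeView) are illustrative assumptions, not the project's actual code; renderers return markup strings here for simplicity, where a real Next.js app would use React components:

```typescript
// Minimal component-registry sketch: a validated spec carries a "kind"
// field that selects which renderer to use.
type Spec = { kind: string; data: unknown };
type Renderer = (data: unknown) => string;

const registry: Record<string, Renderer> = {
  array: (d) => `<ArrayView items='${JSON.stringify(d)}' />`,
  tree: (d) => `<TreeView root='${JSON.stringify(d)}' />`,
};

function render(spec: Spec): string {
  const renderer = registry[spec.kind];
  if (!renderer) throw new Error(`No renderer registered for kind "${spec.kind}"`);
  return renderer(spec.data);
}
```

Failing fast on an unknown `kind` keeps invalid specs from silently reaching the UI.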

Challenges we ran into

  • Getting LLM output to stay in strict JSON format every time.
  • Mapping out the key primitives needed to represent multiple diagram types.
  • Enforcing valid recursive traces (depth changes, call lifecycle, partition/merge requirements).
  • Correctly routing ambiguous user prompts between rendering flows.
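The recursive-trace constraint is mechanically checkable. A hand-rolled TypeScript sketch, a simplified stand-in for the project's actual Zod refinements (the event shape is an assumption):

```typescript
// Trace check: every "call" pushes one stack frame, every "return" must
// match the innermost open call, and the trace must end with all calls
// closed. This captures the depth-change and call-lifecycle rules.
type TraceEvent = { type: "call" | "return"; fn: string };

function isValidTrace(events: TraceEvent[]): boolean {
  const stack: string[] = [];
  for (const e of events) {
    if (e.type === "call") {
      stack.push(e.fn);
    } else if (stack.length === 0 || stack.pop() !== e.fn) {
      return false; // return without a matching open call
    }
  }
  return stack.length === 0; // every call must eventually return
}
```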

Accomplishments that we're proud of

  • Validation + repair loop for generated specs.

What we learned

  • LLM responses aren't deterministic.
  • Separating concerns (routing, explanation, visualization spec generation) makes iteration much faster.

Built With

  • Next.js
  • Watsonx
  • OpenRouter
  • Zod