ClassPulse

Summary

In most classrooms, struggle is invisible.

A student can spend 20 minutes stuck on a problem, quietly fall behind, and never raise their hand. By the time a teacher notices, the moment to help has already passed.

We built ClassPulse to change that.

ClassPulse is a real-time classroom intelligence system that helps teachers see who is stuck, what they are stuck on, and when intervention will matter most. Students work naturally on their iPads, while our system continuously analyzes their progress, detects confusion or off-track work, and surfaces actionable alerts to the teacher. Instead of forcing teachers to stare at a dashboard, ClassPulse can deliver subtle notifications through smart glasses, keeping them mobile, attentive, and focused on students—not screens.

And when a teacher is busy helping someone else, ClassPulse can generate AI-powered explanation videos to support students in the meantime.

The result is a classroom where confusion is no longer silent, support is more immediate, and teachers can act before students fall behind.


Inspiration

We started with a simple classroom reality: teachers cannot help with what they cannot see.

In a typical class, some students ask for help right away, but many don’t. They hesitate, second-guess themselves, or try to push through confusion alone. Meanwhile, teachers are constantly scanning the room, trying to figure out who needs support most urgently.

That felt like a broken feedback loop.

We wanted to build a system that makes student thinking visible in real time—without interrupting class, without replacing the teacher, and without reducing learning to just right-or-wrong answers.

Our vision was to give teachers the kind of live situational awareness they rarely have: not just who finished, but who is confused, who is drifting off track, and where a well-timed intervention could unlock understanding for multiple students at once.

What it does

ClassPulse turns a classroom into a live feedback system.

For students

  • Students solve problems on their iPads as they normally would.
  • They can highlight steps they do not understand to actively request help.
  • If the teacher is currently busy, they can receive an AI-generated explanation video that walks them through the concept step by step.

For teachers

  • A live dashboard shows the state of the classroom in real time.
  • The system flags students who appear stuck, uncertain, or off track.
  • Teachers can identify patterns across the room and step in before confusion compounds.
  • When several students struggle with the same concept, the teacher can pause and address the misunderstanding immediately.
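The class-wide signal described above can be sketched as a simple grouping over active alerts. The `Alert` shape, the `concept` label, and the three-student threshold below are illustrative assumptions, not the actual ClassPulse schema:

```python
from dataclasses import dataclass

@dataclass
class Alert:
    student_id: str
    concept: str  # e.g. "fraction_addition" — illustrative concept label


def classwide_misconceptions(alerts, min_students=3):
    """Return concepts that several distinct students are stuck on.

    When a concept crosses the threshold, pausing the class is usually
    more effective than helping each student one by one.
    """
    students_per_concept = {}
    for a in alerts:
        students_per_concept.setdefault(a.concept, set()).add(a.student_id)
    return {
        concept: len(students)
        for concept, students in students_per_concept.items()
        if len(students) >= min_students
    }
```

The threshold is a tuning knob: a lower value surfaces emerging patterns earlier, at the cost of more false "class-wide" flags.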

AR/smart-glasses support

One of the most exciting parts of the project is ambient teacher awareness through smart glasses.

Instead of repeatedly checking a laptop or dashboard, teachers can receive subtle glanceable alerts while walking around the classroom. That means they stay engaged with students and still know who needs attention next.

This turns classroom analytics from a "go look at the screen" workflow into a "see what matters right now" workflow.
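A glasses display can realistically show only one alert at a time, so prioritization matters more than volume. A minimal sketch of picking the single most urgent alert, with invented severity weights and a wait-time bonus so quieter problems eventually surface:

```python
import time

# Hypothetical severity weights — tuning these is a product decision,
# not something specified by the ClassPulse write-up.
SEVERITY = {"off_track": 3, "stuck": 2, "uncertain": 1}


def next_glance_alert(alerts, now=None):
    """Pick the one alert worth pushing to the glasses right now.

    Each alert is a dict like {"student": ..., "kind": ..., "since": epoch_s}.
    Urgency grows with severity and with how long the student has waited.
    """
    now = time.time() if now is None else now

    def urgency(alert):
        waited_minutes = (now - alert["since"]) / 60.0
        return SEVERITY.get(alert["kind"], 0) + 0.5 * waited_minutes

    return max(alerts, key=urgency, default=None)
```

The wait-time term prevents starvation: a mildly "uncertain" student who has waited ten minutes eventually outranks a freshly flagged "stuck" one.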

How we built it

We built ClassPulse as a multi-surface system with classroom analysis, teacher-facing insight, and on-demand student support.

Core system architecture

  • Student and teacher interfaces: built with React
  • Real-time analysis pipeline: continuous screenshot capture plus model inference using OpenAI and Gemini to interpret student work in progress
  • Teacher dashboard: live classroom state, student progress tracking, and alert surfacing
  • AR integration: smart-glasses notifications that let teachers receive alerts without breaking classroom flow
  • AI explainer generation: Remotion + ElevenLabs to create personalized support videos when students need immediate help
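One pass of the screenshot-analysis pipeline can be sketched as: encode the frame, ask a vision-language model for a verdict, and map the verdict to an alert. The `infer` callable below stands in for the actual OpenAI/Gemini call, and the prompt wording and verdict labels are illustrative assumptions:

```python
import base64
from typing import Callable


def analyze_frame(png_bytes: bytes, infer: Callable[[str, str], str]) -> dict:
    """One pass of the real-time analysis pipeline (sketch).

    `infer(prompt, image_b64)` is a stand-in for a vision-language model
    call; the verdict vocabulary here is invented for illustration.
    """
    prompt = (
        "Look at this student's in-progress work. "
        "Answer with exactly one word: ok, stuck, or off_track."
    )
    image_b64 = base64.b64encode(png_bytes).decode("ascii")
    verdict = infer(prompt, image_b64).strip().lower()
    return {"verdict": verdict, "alert": verdict in {"stuck", "off_track"}}
```

Keeping the model call behind a plain callable also makes it easy to swap between cloud inference and the local MLX path described below.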

🧠 Local Model Support (MLX + Qwen)

ClassPulse is designed to support both cloud-based and fully local inference pipelines.

This demo showcases our local-first approach using:

  • MLX for efficient on-device inference (Apple Silicon)
  • Qwen 3.5 0.8B (MLX-8bit) as a lightweight vision-language model
  • A streaming architecture for real-time feedback

Why local models?

In a classroom setting, local inference unlocks:

  • Low latency → faster feedback for students
  • Privacy → student data stays on-device or within school infrastructure
  • Reliability → no dependency on internet connectivity
  • Scalability → classrooms can run independently without API bottlenecks

What this demo proves

  • Real-time frame analysis using a local VLM
  • Streaming token responses from mlx_vlm.server
  • Practical performance with a small, efficient model (0.8B)
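Consuming the streaming responses can be sketched as a small server-sent-events accumulator. This assumes an OpenAI-style `data: {...}` chunk format ending in `data: [DONE]`; whether a given `mlx_vlm.server` build uses exactly this wire format is an assumption worth verifying against your version:

```python
import json


def collect_stream(lines):
    """Accumulate streamed tokens from an OpenAI-style SSE response.

    Each event line is assumed to look like
    `data: {"choices":[{"delta":{"content":"…"}}]}` with the stream
    terminated by `data: [DONE]`. Non-data lines are ignored.
    """
    text = []
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip comments / keep-alives
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        delta = json.loads(payload)["choices"][0].get("delta", {})
        text.append(delta.get("content", "") or "")
    return "".join(text)
```

In the live system, tokens would be rendered as they arrive rather than joined at the end; accumulating here just keeps the sketch testable.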

This is the same architecture direction we use in ClassPulse, where lightweight local models can handle immediate understanding tasks, while larger models (local or cloud) can be used for deeper reasoning and explanation generation.

Model Configuration

By default, this demo runs Qwen3.5-0.8B-MLX-8bit.

This model was chosen because it offers a strong balance of:

  • speed (suitable for real-time interaction)
  • capability (vision + language understanding)
  • efficiency (runs locally on Apple Silicon devices)

This demonstrates that ClassPulse is not dependent on large cloud models—it can run intelligent, real-time classroom feedback locally.

Why this architecture matters

We were very intentional about keeping the teacher at the center.

This is not a system designed to replace teacher judgment. It is designed to augment it.

Our analysis pipeline identifies moments that might need attention, but the teacher remains the one who decides when to intervene, whom to help first, and whether an issue is individual or class-wide. The AI video support is there to reduce waiting friction—not to remove human instruction from the learning process.

That design principle shaped nearly every technical decision we made.

Challenges we ran into

1. Interpreting messy student work in real time

This was one of the hardest parts.

It is much easier to evaluate a final answer than to understand a student’s in-progress reasoning. Real student work is messy, incomplete, and often ambiguous. We had to think beyond grading and focus on detecting signals of confusion, hesitation, and off-track progress.
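Model verdicts can be combined with cheap behavioral signals. A crude sketch of one such heuristic, scoring idle time after activity and erase/rewrite churn; the event shape, weights, and cap are all invented for illustration:

```python
def confusion_score(events, now):
    """Crude behavioral confusion signal from a student's edit events.

    events: list of (timestamp_seconds, kind) with kind "write" or "erase".
    Long silence after activity, or heavy erasing relative to writing,
    both suggest the student may be stuck. Weights are illustrative.
    Returns a score in [0, 1].
    """
    if not events:
        return 0.0
    last_ts = max(t for t, _ in events)
    idle_minutes = (now - last_ts) / 60.0
    erases = sum(1 for _, kind in events if kind == "erase")
    writes = sum(1 for _, kind in events if kind == "write")
    churn = erases / max(writes, 1)
    return min(1.0, 0.15 * idle_minutes + 0.6 * churn)
```

A signal like this is noisy on its own, but it is cheap enough to run continuously and can tell the model pipeline which students to look at more closely.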

2. Balancing AI support with teacher presence

We never wanted the product to feel like "AI teaches while the teacher supervises." That is not the goal.

The real challenge was designing AI as a support layer that helps teachers stay more present and more effective—not less necessary.

3. Delivering insights without creating another screen problem

A dashboard is useful, but in a live classroom, constantly checking it can pull a teacher out of the room. That is why the glasses workflow mattered so much. Building for glanceability and low-friction awareness forced us to rethink how alerts should be prioritized and presented.

4. Turning support into something immediate

If a student is stuck now, help in ten minutes is often too late. We wanted support to be available in the moment, which meant combining live analysis with fast generation of explanation content that was clear, relevant, and actually helpful.

Accomplishments that we're proud of

We are proud that ClassPulse became more than just a dashboard.

Some of our favorite outcomes from the project:

  • Building a system that detects and surfaces student struggle in real time instead of after the fact
  • Creating a teacher experience centered on awareness and mobility, not screen dependency
  • Integrating smart-glasses alerts to make classroom analytics ambient and actionable
  • Generating AI explanation videos as an immediate support layer for students who are waiting for help
  • Designing the product around teacher augmentation rather than teacher replacement
  • Framing classroom analytics around actual learning flow—not just assignment completion

More than anything, we are proud of the core idea we proved: student confusion does not have to remain invisible.

What we learned

This project taught us that classroom intelligence is not just a technical problem—it is a human one.

A model can detect patterns, but a classroom runs on trust, timing, and presence. The best educational technology does not compete with the teacher for attention. It gives the teacher better awareness at the exact moment they need it.

We also learned:

  • understanding student process is much harder than evaluating outcomes
  • classroom UX matters just as much as model quality
  • glanceable interfaces can be more powerful than feature-heavy dashboards
  • real-time educational support only works if it fits naturally into existing teacher and student behavior
  • hackathon prototypes get much stronger when product design and system architecture reinforce each other

What's next for ClassPulse

We see this as the beginning of a much bigger platform for real-time learning support.

Next, we want to:

  • improve the accuracy of real-time confusion and progress detection
  • expand beyond math into additional subjects and classroom formats
  • deploy in real classrooms and learn from actual teacher workflows
  • add stronger analytics for recurring student misconceptions and class-wide trends
  • make AI-generated support more personalized and adaptive
  • deepen the smart-glasses experience so teachers can navigate alerts, summaries, and interventions even more fluidly

Why this matters

Every student has moments where they get stuck.

The problem is not getting stuck—it is staying stuck unnoticed.

ClassPulse helps teachers spot those moments earlier, respond faster, and support students before small misunderstandings become bigger learning gaps. We think that kind of visibility can fundamentally improve how classrooms work.

Built With

  • React
  • OpenAI
  • Gemini
  • Remotion
  • ElevenLabs
  • Smart glasses / AR interface tooling
  • Real-time screenshot analysis pipeline
