EdWin — Ed(ucational) Win

The only AI that actually went to your school.


Inspiration

There's a specific kind of panic that sets in around week 6 of a university semester. You're sitting in a 300-person lecture hall, the professor is three slides deep into something you're pretty sure connects to something from last week — or maybe last semester — and you realize you've been staring at the board for ten minutes without absorbing a single thing. You open your laptop to catch up on the notes you missed. Now you're behind on the live lecture. You write down a term you don't recognize. You'll Google it later. You don't Google it later.

This is the quiet, ordinary way that university students fall behind — not dramatically, not all at once, but in the accumulation of a hundred small moments of confusion that never quite get resolved. A concept slips by in lecture. It shows up on the midterm. You lose marks you didn't know you were losing.

We've all been that student. Sitting in CSC258 wondering why none of the logic gates are clicking, not realizing the entire foundation was covered in MAT137 and you actually know this — you just don't know that you know it. Sitting in STA237 half-checked-out during the estimators unit because it's "not really on the midterm," unaware that it's the single most important concept for STA238 next semester. Paying $25/month for a ChatGPT subscription that has never heard of your professor, has never seen your syllabus, and confidently gives you wrong information about your own course.

The problem isn't that students are lazy or unprepared. The problem is that the tools available to them are completely blind to their context. Generic AI assistants don't know what school you go to, what year you're in, what courses you've already taken, or how your specific professor weights their exams. They have no memory of what you've struggled with before. They can't tell you that this concept showed up on four of the last six midterms. They can't connect the dots between your past, present, and future coursework.

We built EdWin because students don't need another generic chatbot. They need something that actually knows them.


What It Does

EdWin is a context-superaware, memory-persistent AI learning companion built specifically for university students. The core thesis is simple but radical: an AI is only as useful as the context it has access to. Most AI tools give you a brilliant mind with amnesia. EdWin gives you a brilliant mind that has been sitting next to you in every lecture since September.

Persistent Memory That Actually Means Something

EdWin builds and maintains a longitudinal profile of each student — not just a list of courses, but a living model of their academic journey. It tracks which concepts have been encountered, how many times they've been reinforced across different courses, and how deeply they've been covered. There's a meaningful difference between a concept you've seen once on a slide and a concept you've worked through across three problem sets and two courses. EdWin knows the difference. This is what makes its guidance feel less like a generic search result and more like advice from someone who has been paying attention.
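The distinction between "seen once on a slide" and "reinforced across courses" can be made concrete. A minimal sketch of what such a longitudinal record might look like, assuming a simple per-concept exposure model (the names and the scoring formula are illustrative, not EdWin's actual schema):

```python
from dataclasses import dataclass, field


@dataclass
class ConceptExposure:
    """One student's accumulated contact with a single concept."""
    concept: str
    courses: set = field(default_factory=set)  # courses where it appeared
    encounters: int = 0                        # lectures, problem sets, exams
    max_depth: int = 0                         # 1 = mentioned, 2 = worked example, 3 = assessed

    def record(self, course: str, depth: int) -> None:
        """Log one encounter with the concept in a given course."""
        self.courses.add(course)
        self.encounters += 1
        self.max_depth = max(self.max_depth, depth)

    def mastery_signal(self) -> float:
        """Crude reinforcement score: repeated, deep, cross-course
        exposure counts for far more than a single slide mention."""
        return self.encounters * self.max_depth * len(self.courses)
```

Under this model, a concept worked through in two courses at problem-set depth scores an order of magnitude higher than one glimpsed on a single slide, which is exactly the distinction the guidance engine needs.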

Context-Aware In-Lecture Alerts

The centerpiece of EdWin is its real-time in-lecture tool. As a lecture unfolds, EdWin surfaces contextual alerts at exactly the right moments — not interruptions, but quiet signals that orient the student within their own learning history:

  • "This logic content connects directly to what you covered in MAT137 — you already have the foundation for this."
  • "Heads up: this question archetype has appeared on 4 of the last 6 midterms for this course."
  • "Estimators aren't heavily tested here, but they're foundational for STA238, which is on your plan for next semester."

These alerts are powered by two inputs that most AI tools never see: the course syllabus and past midterms. EdWin ingests both at setup, extracts a concept graph from the syllabus, and reasons over historical exams to understand what actually gets tested and how. The result is a system that can distinguish between what's important for today's grade and what's important for the student's degree.
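The exam-history half of that reasoning reduces to a frequency check over tagged past exams. A hedged sketch, assuming each past midterm has already been tagged with the concepts it tests (the 6-exam window and majority threshold are placeholders for whatever tuning the real system uses):

```python
def exam_history_alert(concept, past_exams, window=6, threshold=0.5):
    """Decide whether a concept's exam history is worth surfacing.

    past_exams: newest-first list of sets of concept tags, one set per exam.
    Returns an alert string when the concept appears in more than
    `threshold` of the last `window` exams, else None.
    """
    recent = past_exams[:window]
    hits = sum(1 for exam in recent if concept in exam)
    if recent and hits / len(recent) > threshold:
        return (f"Heads up: this question archetype has appeared on "
                f"{hits} of the last {len(recent)} midterms for this course.")
    return None
```

The same tagged-exam index also feeds the priority engine described below, so the ingestion cost is paid once per course.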

A Priority Algorithm Built Around the Whole Student

EdWin's priority engine weighs every piece of course content on two axes simultaneously. First: how much does this topic matter for the student's performance in this course right now, based on the syllabus weighting and historical exam presence? Second: how much does this topic matter for the student's future coursework based on their degree plan? This dual-axis model means EdWin can tell you when to lean in because something is being underemphasized in lecture and overrepresented in evaluations — and when to pay attention to something that won't show up on the midterm but will define your next two years.
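As a sketch, the dual-axis blend might look like the following. The exact inputs, normalization, and the 60/40 split are illustrative assumptions, not the production weights:

```python
def priority_score(syllabus_weight, exam_presence, future_dependency, alpha=0.6):
    """Blend current-course importance with longitudinal importance.

    syllabus_weight   -- fraction of syllabus coverage (0..1)
    exam_presence     -- fraction of recent exams featuring the topic (0..1)
    future_dependency -- downstream-course dependency from the degree
                         plan's concept graph, normalized to 0..1
    alpha             -- weight on the 'right now' axis vs the 'later' axis
    """
    current = 0.5 * syllabus_weight + 0.5 * exam_presence
    return alpha * current + (1 - alpha) * future_dependency


def lean_in(syllabus_weight, exam_presence, gap=0.3):
    """True when a topic is underemphasized in lecture relative to how
    often it actually shows up on evaluations."""
    return exam_presence - syllabus_weight >= gap
```

The `lean_in` check captures the "underemphasized in lecture, overrepresented in evaluations" case; a high score with low `exam_presence` but high `future_dependency` captures the "won't show up on the midterm, but matters for the next two years" case.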

Professor Confusion Heatmap

EdWin includes a lightweight professor-facing dashboard that aggregates confusion signals from across the student cohort and surfaces them as a topic heatmap. No individually identifiable student data, no surveillance — just a clear signal about where the class is struggling, so instructors can decide whether to allot more time in the next session. It closes a loop between student experience and teaching adjustment that currently doesn't exist.
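The aggregation itself is deliberately simple. A sketch, assuming confusion signals arrive as anonymous per-topic taps (the normalization is illustrative):

```python
from collections import Counter


def confusion_heatmap(signals):
    """Aggregate anonymous confusion taps into relative per-topic heat.

    signals: flat list of topic strings, one per student tap. Returns a
    dict of topic -> share of total confusion, so the professor sees
    relative hotspots rather than raw counts or identities.
    """
    counts = Counter(signals)
    total = sum(counts.values()) or 1  # avoid division by zero on empty input
    return {topic: n / total for topic, n in counts.items()}
```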

Onboarding That Actually Learns You

Before EdWin can help, it gets to know you. A short onboarding flow — 3 to 4 questions — seeds your learning-style profile. Are you someone who needs analogies? Worked examples? First-principles derivations? This profile doesn't just inform the chatbot; it shapes how every explanation, alert, and recommendation is framed across the entire platform.
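Mechanically, that framing can be as simple as prepending a style-specific instruction to every prompt sent to the model. A sketch with placeholder style names and wording (not EdWin's real profile schema):

```python
# Hypothetical mapping from an onboarding answer to a framing instruction
# that gets prepended to every explanation prompt.
STYLE_FRAMES = {
    "analogies": "Explain using a concrete real-world analogy first.",
    "worked_examples": "Start from a fully worked example, then generalize.",
    "first_principles": "Derive the result from first principles, step by step.",
}


def frame_prompt(style, question):
    """Wrap a student question in their learning-style framing."""
    frame = STYLE_FRAMES.get(style, "")
    return f"{frame}\n\nStudent question: {question}".strip()
```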


How We Built It

  • Frontend: React-based UI with a live transcription panel, real-time alert feed, and integrated chatbot interface — all designed to feel like a natural extension of the lecture experience rather than a separate tool.
  • AI & LLM Layer: Claude (Anthropic) powering the context-aware chat, alert generation, and concept extraction, with persistent student memory managed via structured session and profile state passed through each API call.
  • Syllabus & Exam Ingestion: PDF upload pipeline that parses course syllabi into a concept graph and indexes past midterms for question archetype identification.
  • Priority Algorithm: Custom scoring logic weighting current-course importance (syllabus coverage %, historical exam presence) against longitudinal importance (cross-course concept dependency graph seeded from degree plan data).
  • Professor Dashboard: Confusion signals aggregated from in-lecture student input and rendered as an interactive spider/radar map by topic cluster.
  • Transcription: Live lecture audio transcription feeding the real-time context engine so alerts trigger at the right moment in the right lecture.
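The transcription-to-alert link in that pipeline can be sketched as a sliding window over transcript chunks, with simple keyword matching standing in for the real semantic matching against the concept graph (class name, window size, and thresholds are all illustrative):

```python
from collections import deque


class TranscriptWatcher:
    """Sliding window over live transcript chunks; fires a concept alert
    when enough of its keywords appear close together in time."""

    def __init__(self, concept_keywords, window_chunks=5, min_hits=2):
        self.concept_keywords = concept_keywords  # {concept: {keywords}}
        self.window = deque(maxlen=window_chunks)
        self.min_hits = min_hits
        self.fired = set()

    def feed(self, chunk):
        """Ingest one transcript chunk; return concepts that just triggered."""
        self.window.append(chunk.lower())
        text = " ".join(self.window)
        alerts = []
        for concept, keywords in self.concept_keywords.items():
            if concept in self.fired:
                continue  # fire each alert at most once per lecture
            hits = sum(1 for kw in keywords if kw in text)
            if hits >= self.min_hits:
                self.fired.add(concept)
                alerts.append(concept)
        return alerts
```

The bounded window is what keeps alerts feeling timely: a keyword mentioned twenty minutes ago shouldn't trigger anything now.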

Challenges We Ran Into

The hardest problem wasn't any single technical piece — it was context management at scale. Persistent, meaningful memory across sessions, courses, and semesters is a genuinely hard engineering problem. Every API call to the LLM is stateless by default; building the illusion — and the reality — of a system that remembers you required careful design of what gets stored, how it gets retrieved, and how it gets compressed into a prompt that still fits within a context window.
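The compression step reduces to a budgeted selection problem: which memories earn their tokens in this prompt? A greedy sketch under stated assumptions — real token counting would use the model's tokenizer, and priority would come from the scoring described earlier:

```python
def compress_memory(items, token_budget, tokens=lambda s: len(s.split())):
    """Greedy sketch of fitting persistent memory into a prompt.

    items: list of (priority, text) memory entries. Keeps the
    highest-priority memories that fit within `token_budget`, dropping
    the rest. Word count stands in for real tokenization.
    """
    kept, used = [], 0
    for priority, text in sorted(items, key=lambda it: -it[0]):
        cost = tokens(text)
        if used + cost <= token_budget:
            kept.append(text)
            used += cost
    return kept
```

Greedy selection can skip a large high-priority memory in favor of smaller, lower-priority ones that fit, which is often the right trade for prompt context even though it isn't optimal packing.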

Parsing syllabi turned out to be messier than expected. University course documents are not standardized. Some are PDFs with clean structure. Some are scanned images. Some are Google Docs links. Building a robust ingestion pipeline that could extract a reliable concept graph from that variety was a significant chunk of the build.

Real-time alert timing was also non-trivial. Firing an alert too early or too late in the lecture flow breaks the illusion of intelligence. Getting the transcription-to-alert pipeline latency low enough to feel genuinely live required careful optimization.


Accomplishments That We're Proud Of

  • Built a working, demo-ready product in a single hackathon sprint.
  • The alert system actually works — it fires contextually relevant signals based on real transcript content cross-referenced against course materials.
  • The dual-axis priority algorithm produces outputs that feel genuinely insightful rather than generic.
  • The professor heatmap is clean, real, and immediately actionable — a professor can look at it and know exactly what to address next class.
  • Named it EdWin. Ed(ucational) Win. We're proud of that too.

What We Learned

We learned that context is the entire product. Strip away the persistent memory and the syllabus ingestion and the exam history, and you have a generic chatbot wrapper. Keep them in, and you have something that feels like a genuine leap. The difference between "AI that helps with school" and "AI that knows your school" is everything.

We also learned that students are the most underserved segment in the AI tools market right now. The tools that exist are built for professionals with expense accounts. Students are paying $25/month for something that has never heard of their university, their professor, or their degree requirements. The gap between what exists and what's needed is enormous — and entirely addressable.


What's Next for EdWin

  • Quercus + Piazza Integration: Embed EdWin directly into the LMS tools UofT students already use. Zero friction adoption.
  • Lecture Replay Mode: After a lecture ends, students can replay the session with EdWin's alerts overlaid at the exact timestamps they fired. A powerful revision and retention tool.
  • ML Confidence Scoring: Move from rule-based alert triggers to a model that outputs a confidence score on alert relevance — so the system gets smarter the more it's used.
  • Expanded University Rollout: The architecture is university-agnostic. UofT is the pilot. Every university with a Quercus deployment is a potential partner.
  • Professor-Initiated Content Flagging: Let professors tag content in their own slides as high-priority for exam purposes, feeding directly into EdWin's alert engine with verified signal rather than inferred signal.
  • The Free Trial Model: A free "up to midterm" trial funded by seed capital to get students in the door. After that, $10/month to cover API costs. Students are broke — but they will absolutely pay $10/month for better grades.

Built With

React, Claude (Anthropic), PDF ingestion pipeline, live audio transcription
