Inspiration

We noticed that online learning often lacks interactivity and does not adjust based on a student’s engagement. This project was motivated by the idea that a student’s emotional state should influence how they learn. We built a system that adapts quiz difficulty based on facial expressions detected during video lectures.

What it does

Study Buddy includes the following features:

  • Emotion-based question filtering: Uses webcam input to detect real-time emotions while students watch learning materials. The quiz adjusts difficulty based on the dominant emotion detected.
  • Adaptive quiz generation: More confident expressions lead to harder questions, while less engaged expressions trigger easier ones. This keeps the quiz aligned with the student's state.
  • AI feedback: Each response is scored, and a confidence score is calculated per topic. This helps highlight which areas a student is strong or weak in.
  • Transcript-based learning: Students can choose to study through a video or a generated transcript. This supports different learning preferences and improves accessibility.
  • Distraction detection: The video pauses when signs of distraction are detected. This keeps students focused during learning and prevents them from missing important lesson content.
  • Teacher & Student Roles: Teachers can upload questions, and students receive adaptive quizzes. Dashboards and analytics are planned to support both roles.
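The emotion-to-difficulty mapping described above can be sketched as a simple lookup plus filter. The emotion labels, difficulty tiers, and question shape here are illustrative assumptions, not the app's actual values:

```javascript
// Hypothetical mapping from a detected dominant emotion to a quiz
// difficulty tier (labels are assumptions for illustration).
const EMOTION_DIFFICULTY = {
  happy: "hard",
  neutral: "medium",
  confused: "easy",
  bored: "easy",
};

// Filter the question pool down to the tier matching the dominant emotion,
// falling back to "medium" for unrecognized emotions.
function pickQuestions(questions, dominantEmotion) {
  const difficulty = EMOTION_DIFFICULTY[dominantEmotion] ?? "medium";
  return questions.filter((q) => q.difficulty === difficulty);
}

const pool = [
  { id: 1, difficulty: "easy" },
  { id: 2, difficulty: "medium" },
  { id: 3, difficulty: "hard" },
];
console.log(pickQuestions(pool, "happy")); // [{ id: 3, difficulty: "hard" }]
```

A richer version could blend recent emotions rather than reacting to a single frame, which also smooths out detection noise.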

How we built it

  • The frontend was built using React, Tailwind CSS, and Material UI.
  • The backend is built with Express.js and handles API routes for authentication, AI scoring, and transcription. Study Buddy uses Firebase as the main database to store data such as users, classes, lessons, and progress.
  • Facial emotion detection was implemented through webcam-based analysis to identify dominant expressions during video playback.
  • The quiz engine filters questions by difficulty, adapting based on detected emotions.
  • AI scoring uses the Cohere API to calculate confidence levels by evaluating answer accuracy and question difficulty.
  • The transcript feature converts video content into text using AssemblyAI for audio transcription, though the integration is currently manual.
  • Due to time constraints, parts of the teacher dashboard and upload system were implemented using mock data or static displays.
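The per-topic confidence score mentioned above could be computed server-side once each answer is scored. This is a minimal sketch assuming a difficulty-weighted accuracy formula; the real app scores answers via the Cohere API first, which is omitted here:

```javascript
// Assumed weights: harder questions count more toward confidence.
const DIFFICULTY_WEIGHT = { easy: 1, medium: 2, hard: 3 };

// Aggregate scored responses into a 0..1 confidence value per topic.
function topicConfidence(responses) {
  const totals = {};
  for (const { topic, difficulty, correct } of responses) {
    const w = DIFFICULTY_WEIGHT[difficulty] ?? 1;
    totals[topic] ??= { earned: 0, possible: 0 };
    totals[topic].possible += w;
    if (correct) totals[topic].earned += w;
  }
  const confidence = {};
  for (const [topic, t] of Object.entries(totals)) {
    confidence[topic] = t.earned / t.possible;
  }
  return confidence;
}

const scores = topicConfidence([
  { topic: "grammar", difficulty: "easy", correct: true },
  { topic: "grammar", difficulty: "hard", correct: false },
  { topic: "vocab", difficulty: "medium", correct: true },
]);
console.log(scores); // { grammar: 0.25, vocab: 1 }
```

Weighting by difficulty means one correct hard answer raises confidence more than one correct easy answer, which matches the adaptive-quiz design.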

Challenges we ran into

  • Calibrating the emotion detection thresholds for different conditions.
  • Keeping the video playback and quiz system in sync.
  • Handling the transcript feature with manual processes.
  • Maintaining correct login states across roles.
  • Completing backend integration under time constraints.
  • Working around performance issues on one laptop during development.
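The playback/quiz sync challenge above comes down to two features competing for control of the same video: quiz prompts and distraction detection can both pause it. One way to keep them in sync is a small controller that tracks who paused playback; the reasons and the stand-in video object here are illustrative assumptions:

```javascript
// Sketch: a single owner for pause/resume so the quiz and the distraction
// detector don't fight over the video element.
function createPlaybackController(video) {
  let pausedBy = null; // "quiz" | "distraction" | null
  return {
    pause(reason) {
      if (!pausedBy) video.pause(); // only pause the element once
      pausedBy = reason;
    },
    resume(reason) {
      // Only the feature that paused playback may resume it.
      if (pausedBy === reason) {
        pausedBy = null;
        video.play();
      }
    },
    state: () => pausedBy,
  };
}

// Stand-in for an HTMLVideoElement, for demonstration.
const video = { pause: () => {}, play: () => {} };
const controller = createPlaybackController(video);
controller.pause("distraction"); // distraction detected → video pauses
controller.resume("quiz");       // wrong owner → still paused
controller.resume("distraction"); // attention regained → playback resumes
```

In the browser, `video.pause()` and `video.play()` map directly onto the `HTMLMediaElement` API.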

What we learned

  • Emotion detection is a useful tool for adapting learning but needs careful tuning.
  • Adaptive quizzes are more effective when the question pool has sufficient range.
  • Managing UI state across roles is important for usability.
  • Working under time constraints helped us prioritize and collaborate efficiently.

What's next for Study Buddy

Future improvements include:

  • Adding a loop that gives students more practice on weaker topics.
  • Using AI to tag uploaded questions by topic and difficulty.
  • Completing the teacher dashboard for full performance tracking.
  • Expanding emotion and engagement detection beyond facial analysis.
  • Automating transcript generation from videos and audio inputs.
  • Supporting more subjects beyond language learning.
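The planned practice loop for weaker topics could start from the per-topic confidence scores the app already computes. A hypothetical sketch, assuming confidence values between 0 and 1:

```javascript
// Pick the topic with the lowest confidence score to target extra practice.
function weakestTopic(confidence) {
  return Object.entries(confidence).reduce((min, cur) =>
    cur[1] < min[1] ? cur : min
  )[0];
}

console.log(weakestTopic({ grammar: 0.25, vocab: 0.9 })); // "grammar"
```

The loop would then serve additional questions from that topic until its confidence crosses a threshold.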
