Inspiration

ADHD affects over 360 million people worldwide, yet diagnosis often takes months of subjective behavioral assessments and expensive clinical evaluations. We asked: what if we could use the eyes—windows to cognitive function—to detect attention disorders in minutes?

Research shows that individuals with ADHD exhibit distinct eye movement patterns: more frequent saccades, reduced fixation stability, and altered pupil responses. We built EXCITE to harness this science.

What it does

EXCITE is a non-invasive ADHD screening tool that analyzes eye movements in real time. Users simply watch dots on a screen while our system:

  1. Tracks eye movements using computer vision (pupil detection, gaze estimation)
  2. Extracts biomarkers like saccade velocity, fixation stability, and gaze entropy
  3. Runs AI inference through a Transformer neural network trained on 12,000+ eye-tracking recordings
  4. Delivers results with probability scores and detailed analytics
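The biomarker-extraction step (2) can be sketched as follows. This is a simplified illustration rather than EXCITE's actual code; the sampling rate, the saccade velocity threshold, and the function name are assumptions.

```python
import numpy as np

def extract_biomarkers(gaze, fs=250.0, saccade_thresh=30.0, bins=16):
    """Toy eye-movement biomarkers from a gaze trace.

    gaze : (N, 2) array of (x, y) gaze angles in degrees
    fs   : sampling rate in Hz (assumed)
    saccade_thresh : velocity cutoff (deg/s) separating saccades from fixations
    """
    # Angular velocity between consecutive samples (deg/s)
    velocity = np.linalg.norm(np.diff(gaze, axis=0), axis=1) * fs
    saccade_mask = velocity > saccade_thresh
    mean_saccade_velocity = float(velocity[saccade_mask].mean()) if saccade_mask.any() else 0.0

    # Fixation stability: RMS gaze dispersion around the mean during fixations
    fixations = gaze[1:][~saccade_mask]
    if len(fixations):
        fixation_stability = float(
            np.sqrt(((fixations - fixations.mean(axis=0)) ** 2).sum(axis=1).mean())
        )
    else:
        fixation_stability = 0.0

    # Gaze entropy: Shannon entropy of a 2-D histogram of gaze positions
    hist, _, _ = np.histogram2d(gaze[:, 0], gaze[:, 1], bins=bins)
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    gaze_entropy = float(-(p * np.log2(p)).sum())

    return {
        "mean_saccade_velocity": mean_saccade_velocity,
        "fixation_stability": fixation_stability,
        "gaze_entropy": gaze_entropy,
    }
```

A steady fixation yields near-zero velocity and entropy, while scattered gaze drives entropy up; the feature vector feeds straight into step (3).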

How we built it

  • Computer Vision Pipeline: Custom OpenCV-based pupil detection with ellipse fitting and glint tracking
  • ML Model: PyTorch Transformer encoder pre-trained on GazeBase dataset, fine-tuned on ADHD-specific data
  • Web Interface: Flask backend with a modern, animated frontend featuring real-time visualization
  • Edge Deployment: Optimized to run on Raspberry Pi (~$50 hardware)
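The core idea behind the pupil-detection stage can be shown without the full OpenCV machinery: the pupil is the darkest blob in the eye image, so threshold dark pixels and fit an ellipse from the blob's image moments (mean gives the center, covariance eigenvalues give the axes). This is a hypothetical NumPy-only stand-in for our ellipse-fitting pipeline, with the function name and threshold assumed; the real system adds glint tracking and adaptive thresholding for lighting robustness.

```python
import numpy as np

def detect_pupil(gray, dark_thresh=40):
    """Locate the pupil in a grayscale eye image via image moments."""
    ys, xs = np.nonzero(gray < dark_thresh)        # candidate pupil pixels
    if len(xs) < 5:
        return None                                 # no dark blob found
    pts = np.stack([xs, ys], axis=1).astype(float)
    center = pts.mean(axis=0)                       # ellipse center (x, y)
    cov = np.cov(pts.T)                             # 2x2 covariance of the blob
    eigvals, eigvecs = np.linalg.eigh(cov)
    axes = 2.0 * np.sqrt(eigvals)                   # approx. ellipse semi-axes
    angle = np.degrees(np.arctan2(eigvecs[1, -1], eigvecs[0, -1]))
    return center, axes, angle
```

For a filled disk of radius R, each axis variance is R^2/4, so `2 * sqrt(eigval)` recovers the radius; the OpenCV version (`cv2.fitEllipse` on thresholded contours) does the same job on noisy real frames.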

Challenges we ran into

  • Achieving accurate pupil detection under varying lighting conditions
  • Balancing model complexity with inference speed for real-time analysis
  • Creating a UI that's both beautiful and functional for clinical settings
  • Training on limited ADHD-labeled data while maintaining generalization

Accomplishments we're proud of

  • End-to-end pipeline from raw video to ADHD probability in under 30 seconds
  • Polished, demo-ready web interface with live neural network visualization
  • Privacy-first design—all processing happens locally on device
  • Portable solution that can be deployed in schools, clinics, or homes

What we learned

  • The fascinating connection between eye movements and cognitive function
  • Transfer learning techniques for medical AI with limited labeled data
  • How to build production-ready ML pipelines that work on edge devices
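The transfer-learning recipe boils down to reusing a pretrained encoder and training only a small task head on the scarce ADHD-labeled data. The sketch below is a minimal illustration under assumed shapes and hyperparameters, not our actual training code: a Transformer encoder (standing in for the GazeBase-pretrained one) is frozen while a new classification head stays trainable.

```python
import torch
import torch.nn as nn

# Hypothetical shapes: 64-step gaze-feature sequences, 8 features per step.
SEQ_LEN, N_FEATURES, D_MODEL = 64, 8, 32

class GazeTransformer(nn.Module):
    """Transformer encoder (pretrained, e.g. on GazeBase) plus a small task head."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Linear(N_FEATURES, D_MODEL)
        layer = nn.TransformerEncoderLayer(d_model=D_MODEL, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(D_MODEL, 1)            # ADHD probability logit

    def forward(self, x):                            # x: (batch, SEQ_LEN, N_FEATURES)
        h = self.encoder(self.embed(x)).mean(dim=1)  # pool over time
        return self.head(h).squeeze(-1)

model = GazeTransformer()
# Freeze the pretrained parts; only the new head receives gradients.
for p in model.embed.parameters():
    p.requires_grad = False
for p in model.encoder.parameters():
    p.requires_grad = False
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
```

Freezing shrinks the number of trainable parameters dramatically, which is what keeps a small labeled dataset from being overfit; unfreezing the top encoder layer later for a low-learning-rate fine-tune is the usual next step.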

What's next for EXCITE

  • Clinical validation studies to establish sensitivity/specificity benchmarks
  • Mobile app version for broader accessibility
  • Integration with existing telehealth platforms
  • Expanding to other attention-related conditions
