EXCITE

Eye-tracking Classification for Intelligent Therapeutic Evaluation

AI-powered ADHD screening through eye movement analysis
Screen in minutes, not months.

Python PyTorch Flask OpenCV


🧠 About

EXCITE is pioneering a new frontier in ADHD detection. Our mission is to develop an accessible, non-invasive screening tool that can identify ADHD indicators through eye movement analysis.

Traditional ADHD diagnosis relies on subjective behavioral assessments and lengthy clinical evaluations. EXCITE offers a different approach: using the eyes as a window into cognitive function. By analyzing how individuals track visual stimuli, our system can detect patterns associated with attention disorders in minutes, not months.

✨ Features

  • 🔬 Non-Invasive Screening: No blood tests, no brain scans. Simply watch dots on a screen.
  • ⚡ Rapid Results: Complete screening in under 5 minutes with instant AI analysis.
  • 📦 Portable Hardware: Runs on Raspberry Pi; deploy anywhere for ~$50.
  • 🧠 Research-Backed: Trained on 12,000+ eye-tracking recordings from GazeBase.
  • 👁 Real-Time Visualization: Watch gaze patterns as the AI analyzes eye movements.
  • 🔒 Privacy-First: All processing happens locally. Data never leaves the device.

๐Ÿ—๏ธ Architecture

+------------------------------------------------------------------+
|                         EXCITE Pipeline                          |
+------------------------------------------------------------------+
|                                                                  |
|   Camera           CV Model            ML Model                  |
|   --------  --->   -----------  --->   -----------  --->  Result |
|   Eye Video        Gaze Extraction     ADHD Detection            |
|                    (eyetrack.py)       (inference.py)            |
|                                                                  |
|   Extracts:                 Analyzes:                            |
|   - Pupil position          - Saccade velocity                   |
|   - Pupil diameter          - Fixation stability                 |
|   - Gaze coordinates        - Gaze entropy                       |
|   - Validity flags          - Temporal patterns                  |
|                                                                  |
+------------------------------------------------------------------+
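
For concreteness, the whole pipeline can be driven from a few lines of Python. The following is a minimal sketch, not the project's actual glue code: it assumes eyetrack.py exposes a per-frame gaze extractor (the name extract_gaze mirrors the usage example further below) and uses the ADHDDetector API shown later in this README.

import cv2
from inference import ADHDDetector
from eyetrack import extract_gaze   # assumed helper; adjust to the actual eyetrack.py API

detector = ADHDDetector()
cap = cv2.VideoCapture("uploads/session.mp4")   # any recorded eye video

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # CV stage: pupil position / diameter -> gaze sample
    x, y, pupil_diameter = extract_gaze(frame)
    # ML stage: add_frame may return a result only once enough frames have accumulated
    result = detector.add_frame(x, y, pupil_diameter, validity=0)
    if result:
        print(result)

cap.release()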

📁 Project Structure

Excite/
├── webapp.py               # 🌐 Web application (Flask + beautiful UI)
├── app.py                  # 📱 Basic web interface
├── eyetrack.py             # 👁 Computer vision pipeline (pupil detection)
├── inference.py            # 🧠 ADHD detection model
├── train.py                # 🏋️ Model training script
├── run_prediction.py       # 🔮 Batch prediction utility
│
├── src/                    # Core modules
│   ├── data/               #   Data loading (GazeBase, ADHD datasets)
│   ├── features/           #   Feature extraction (velocity, BCEA, etc.)
│   ├── models/             #   ML models (Transformer encoder)
│   ├── training/           #   Training pipeline
│   └── inference/          #   Real-time detection
│
├── checkpoints/            # 🎯 Trained model weights
│   ├── adhd_model.pt       #   ADHD classifier
│   ├── encoder_pretrained.pt # Pre-trained encoder
│   └── best_classifier.pt  #   Best performing model
│
├── config/                 # ⚙️ Configuration files
├── scripts/                # 🔧 Training scripts (GPU cluster)
├── data/                   # 📊 ADHD training sequences
├── uploads/                # 📤 User uploaded videos
└── outputs/                # 📁 Processing results

🚀 Quick Start

Installation

# Clone the repository
git clone https://github.com/SuhritP/Excite.git
cd Excite

# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

Run the Web App

python webapp.py
# Open http://localhost:5000 in your browser

Python API Usage

from inference import ADHDDetector

# Initialize detector
detector = ADHDDetector()

# Feed eye tracking data frame by frame
for frame in video_frames:
    x, y, pupil_diameter = extract_gaze(frame)
    result = detector.add_frame(x, y, pupil_diameter, validity=0)
    
    if result:
        print(f"ADHD Probability: {result['probability']:.1%}")
        print(f"Prediction: {result['prediction']}")

🔬 Technology

Computer Vision Pipeline

Our custom CV system extracts eye movement features in real time; a rough sketch of the pupil-detection step follows the list below:

  • Pupil Detection: Dark-region search with ellipse fitting
  • Gaze Estimation: Maps pupil position to visual angle
  • Glint Tracking: Corneal reflection for calibration
  • Blink Detection: Validity flagging for data quality
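
The pupil-detection step above is implemented in eyetrack.py; as a rough, illustrative sketch of the dark-region-search-plus-ellipse-fitting idea (not the project's actual code), a minimal OpenCV version could look like this:

import cv2

def detect_pupil(frame_bgr, dark_thresh=40):
    """Illustrative pupil detector: largest dark blob + ellipse fit."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    # Dark-region search: keep pixels darker than the threshold
    _, mask = cv2.threshold(gray, dark_thresh, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None                      # treat as blink / lost tracking (validity flag 1)
    largest = max(contours, key=cv2.contourArea)
    if len(largest) < 5:                 # cv2.fitEllipse needs at least 5 points
        return None
    (cx, cy), (ax1, ax2), _angle = cv2.fitEllipse(largest)
    pupil_diameter = (ax1 + ax2) / 2.0   # rough diameter in pixels
    return cx, cy, pupil_diameter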

Transformer Neural Network

At the heart of EXCITE is a compact Transformer encoder:

ADHDModel(
  (proj): Linear(5, 64)           # Input projection
  (transformer): TransformerEncoder(
    (layers): 3x TransformerEncoderLayer(
      d_model=64, nhead=4, dim_feedforward=128
    )
  )
  (head): Linear(64, 2)           # Classification head
)
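
Reading that summary literally (5 input features per frame, d_model 64, 4 heads, feed-forward width 128, 3 layers, 2 output classes), a PyTorch sketch of such a model might look like the following; the real class in src/models may differ in pooling, masking, and other details.

import torch
import torch.nn as nn

class ADHDModel(nn.Module):
    """Sketch of the encoder summarized above; illustrative only."""
    def __init__(self, in_dim=5, d_model=64, nhead=4, dim_ff=128, num_layers=3, num_classes=2):
        super().__init__()
        self.proj = nn.Linear(in_dim, d_model)                 # input projection
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, dim_feedforward=dim_ff, batch_first=True
        )
        self.transformer = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, num_classes)            # classification head

    def forward(self, x):                  # x: (batch, seq_len, 5) per-frame features
        h = self.transformer(self.proj(x))
        return self.head(h.mean(dim=1))    # mean-pool over time, then classify

# Example: one 250-frame sequence of 5 features -> logits of shape (1, 2)
logits = ADHDModel()(torch.randn(1, 250, 5))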

Two-Stage Training

  1. Pre-training: Learn general gaze dynamics from GazeBase (12,000+ recordings)
  2. Fine-tuning: Specialize on ADHD detection with labeled clinical data (a weight-loading sketch follows below)
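
In code, stage 2 boils down to loading the stage-1 encoder weights into the classifier before fine-tuning. A minimal sketch, assuming the checkpoint stores a plain state dict and reusing the ADHDModel sketch above:

import torch

model = ADHDModel()   # as sketched in the previous section

# Stage 1 produced checkpoints/encoder_pretrained.pt; copy every weight that matches.
state = torch.load("checkpoints/encoder_pretrained.pt", map_location="cpu")
model.load_state_dict(state, strict=False)   # the classification head stays freshly initialized

# Stage 2: fine-tune on labeled ADHD sequences.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = torch.nn.CrossEntropyLoss()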

ADHD-Relevant Biomarkers

Biomarker                   Description                   ADHD Pattern
Saccadic Velocity           Speed of eye jumps            Faster but less accurate
Fixation Stability (BCEA)   Eye wobble during fixation    Higher variability
Microsaccade Rate           Tiny involuntary movements    Elevated during tasks
Gaze Entropy                Randomness of gaze patterns   Higher unpredictability
Pupil Dynamics              Diameter changes              Altered responses
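
Two of these biomarkers can be computed directly from a window of gaze samples. A minimal sketch using NumPy and one common BCEA parameterisation (the formulas in src/features may differ in detail):

import numpy as np

def saccade_velocity(x, y, fs=250.0):
    """Point-to-point gaze velocity in deg/s for x, y in degrees sampled at fs Hz.
    The peak or mean of this signal is a typical saccadic-velocity feature."""
    vx = np.diff(x) * fs
    vy = np.diff(y) * fs
    return np.hypot(vx, vy)

def bcea(x, y, p=0.68):
    """Bivariate contour ellipse area (deg^2) over a fixation window.
    BCEA = 2 * pi * k * sx * sy * sqrt(1 - rho^2), with k = -ln(1 - p)."""
    k = -np.log(1.0 - p)
    sx, sy = np.std(x), np.std(y)
    rho = np.corrcoef(x, y)[0, 1]
    return 2.0 * np.pi * k * sx * sy * np.sqrt(1.0 - rho ** 2)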

📊 Datasets

GazeBase Data Repository

Griffith, H., Lohr, D., Abdulin, E., & Komogortsev, O. (2020)

Large-scale, multi-stimulus, longitudinal eye movement dataset.

https://doi.org/10.6084/m9.figshare.12912257

ADHD Pupil Size Dataset

Krejtz, K., et al. (2018)

Pupil size dataset for ADHD research.

https://doi.org/10.6084/m9.figshare.7218725

🖥️ CV Model Output Format

Your eye tracker must provide these values per frame:

Parameter   Type    Description
x           float   Gaze X position (degrees visual angle)
y           float   Gaze Y position (degrees visual angle)
dP          float   Pupil diameter (pixels or mm)
val         int     Validity flag (0 = valid, 1 = blink/lost)
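
Tied back to the Python usage example above, a valid frame and a blink frame would be fed to the detector like this (keyword names follow that example; dP and val correspond to the pupil_diameter and validity arguments):

# A valid frame: gaze at (2.1, -0.4) degrees, pupil diameter 3.8, tracking OK
result = detector.add_frame(2.1, -0.4, 3.8, validity=0)

# A blink / lost-tracking frame: gaze values are placeholders, validity=1 flags it invalid
result = detector.add_frame(0.0, 0.0, 0.0, validity=1)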

๐Ÿ‹๏ธ Training

Pre-train on GazeBase

# Quick test
python train.py pretrain --rounds 1 --max_subjects 10

# Full training
python train.py pretrain --rounds 1 2 3 --epochs 50

Fine-tune on ADHD Data

python train.py finetune --pretrained_path checkpoints/encoder_pretrained.pt

GPU Cluster Scripts

# Setup environment
./scripts/setup_cluster.sh

# Run training
./scripts/train_full.sh

⚠️ Disclaimer

EXCITE is a screening tool designed to complement, not replace, professional clinical evaluation for ADHD.

This software is for research and educational purposes. Always consult qualified healthcare professionals for medical diagnoses.

📄 License

MIT License - see LICENSE for details.
