
🧠 EEG-Grasp-DL-Decoding

Deep Learning based Reach-and-Grasp Decoder from EEG Signals

License: MIT | Python 3.7+

This repository contains code for decoding reach-and-grasp actions from EEG signals using deep learning. The work explores different neural network architectures (Vanilla 1D CNN, EEGNet, HTNet) and training strategies (within-subject, inter-subject, transfer learning) for classifying three grasp types: palmar grasp, lateral grasp, and rest.
(Overview figure)


📖 Table of Contents

  • 🔬 Overview
  • 📊 Dataset
  • 🛠️ Methods
  • 🎯 Results
  • 💻 Installation
  • 🚀 Usage
  • 📁 Repository Structure
  • 📚 Citation
  • 📝 License

🔬 Overview

Decoding reach-and-grasp actions from electroencephalogram (EEG) recordings is crucial for the rehabilitation of hand functions in patients with motor disorders. Despite the high degrees of freedom in human hand movements, most daily activities can be executed using palmar, lateral, and precision grasps.

Key Features:

  • 🎯 Multi-class classification: Palmar grasp, lateral grasp, and rest
  • 🧪 Multiple architectures: Vanilla 1D CNN, EEGNet, HTNet
  • 🔄 Transfer learning: Within-subject and inter-subject training
  • 📊 Three recording modalities: Gel-based (58 channels), water-based (32 channels), dry electrodes (11 channels)
  • 🔧 Data augmentation: Frequency band filtering

📊 Dataset

The dataset is publicly available from BNCI Horizon 2020.

Experimental Setup

  • Participants: 45 right-handed healthy individuals
  • Trials per condition (TPC): 80 trials distributed over 4 runs
  • Movement conditions: Palmar grasp, lateral grasp, and rest
  • Recording duration: runs of ~7 min each

Recording Modalities

| Modality | System | # Channels | Coverage |
|---|---|---|---|
| Gel-based | Standard EEG | 58 | Frontal, central, parietal areas |
| Water-based | EEG-Versatile™ | 32 | Full scalp coverage |
| Dry electrodes | EEG-Hero™ | 11 | Sensorimotor cortex |

Data Preprocessing

  1. Filtering: Zero-phase 4th-order Butterworth high-pass filter (0.3 Hz cutoff)
  2. Resampling: 128 Hz
  3. Segmentation: Window of interest [-2, 3]s relative to movement onset
  4. Rest trials: 81 trials extracted from inactivity periods (5s duration)
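The four steps above can be sketched with SciPy. The helper names `preprocess_trial`/`segment_trial` and the 256 Hz original sampling rate are illustrative assumptions (the text only fixes the 128 Hz target rate); the repository's own preprocessing lives in `braindecode/datautil/`.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, resample_poly

def preprocess_trial(eeg, fs=256, low_cutoff=0.3, target_fs=128):
    """High-pass filter and resample one trial.

    eeg: (n_channels, n_samples) array sampled at fs Hz.
    """
    # 4th-order Butterworth high-pass; sosfiltfilt applies it forward
    # and backward, which makes the overall filtering zero-phase.
    sos = butter(4, low_cutoff, btype="highpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, eeg, axis=-1)
    # Polyphase resampling from fs down to target_fs (e.g. 256 -> 128 Hz).
    return resample_poly(filtered, target_fs, fs, axis=-1)

def segment_trial(eeg, onset_idx, fs=128, window=(-2.0, 3.0)):
    """Cut the [-2, 3] s window of interest around movement onset."""
    start = onset_idx + int(window[0] * fs)
    stop = onset_idx + int(window[1] * fs)
    return eeg[:, start:stop]
```

A 5 s window at 128 Hz yields 640 samples per channel, which is the input length assumed in the model sketches below.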

πŸ› οΈ Methods

Neural Network Architectures

1. Vanilla 1D Convolutional Neural Network

A compact CNN with 1D temporal convolutions designed to extract EEG features:

  • Input: Multi-channel EEG time series
  • Architecture: 1D convolution → temporal pooling → feature extraction → dense layers
  • Purpose: Encapsulate traditional EEG feature extraction in a learnable framework
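A minimal PyTorch sketch of such a network follows. Filter counts, kernel sizes, and dropout are illustrative defaults, not the configuration in `braindecode/models/vanilla1d.py`:

```python
import torch
import torch.nn as nn

class Vanilla1DCNN(nn.Module):
    """Compact 1D CNN: temporal convolution -> pooling -> dense classifier."""

    def __init__(self, n_channels=58, n_classes=3, n_filters=16, kernel=25):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolution across all EEG channels at once.
            nn.Conv1d(n_channels, n_filters, kernel_size=kernel, padding=kernel // 2),
            nn.BatchNorm1d(n_filters),
            nn.ELU(),
            nn.AvgPool1d(kernel_size=8),  # temporal pooling
            nn.Dropout(0.5),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool1d(1),  # collapse remaining time axis
            nn.Flatten(),
            nn.Linear(n_filters, n_classes),  # dense output layer
        )

    def forward(self, x):  # x: (batch, n_channels, n_samples)
        return self.classifier(self.features(x))
```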

2. EEGNet

Lawhern et al., 2018 - Compact CNN for BCI applications:

  • Temporal convolution (band-pass filtering)
  • Depthwise convolution (spatial filtering)
  • Separable convolution (temporal pattern identification)
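The three stages can be sketched as below, using the paper's default hyperparameters for 128 Hz data (F1=8, D=2, F2=16, 64-sample temporal kernel). `EEGNetSketch` is an illustrative stand-in, not the implementation in `braindecode/models/eegnet.py`:

```python
import torch
import torch.nn as nn

class EEGNetSketch(nn.Module):
    """EEGNet-style stages: temporal, depthwise, and separable convolutions."""

    def __init__(self, n_channels=58, n_classes=3, F1=8, D=2, F2=16):
        super().__init__()
        self.block1 = nn.Sequential(
            # Temporal convolution: learns band-pass-like frequency filters.
            nn.Conv2d(1, F1, (1, 64), padding=(0, 32), bias=False),
            nn.BatchNorm2d(F1),
            # Depthwise convolution: one spatial filter set per temporal filter.
            nn.Conv2d(F1, F1 * D, (n_channels, 1), groups=F1, bias=False),
            nn.BatchNorm2d(F1 * D),
            nn.ELU(),
            nn.AvgPool2d((1, 4)),
            nn.Dropout(0.5),
        )
        self.block2 = nn.Sequential(
            # Separable convolution: depthwise temporal + pointwise mixing.
            nn.Conv2d(F1 * D, F1 * D, (1, 16), padding=(0, 8),
                      groups=F1 * D, bias=False),
            nn.Conv2d(F1 * D, F2, (1, 1), bias=False),
            nn.BatchNorm2d(F2),
            nn.ELU(),
            nn.AvgPool2d((1, 8)),
            nn.Dropout(0.5),
        )
        self.head = nn.Sequential(nn.Flatten(), nn.LazyLinear(n_classes))

    def forward(self, x):  # x: (batch, 1, n_channels, n_samples)
        return self.head(self.block2(self.block1(x)))
```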

3. HTNet

Peterson et al., 2021 - Enhanced version of EEGNet:

  • Adds Hilbert transform layer for spectral power features
  • Data-driven filter-Hilbert approach
  • Projects features to regions of interest
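HTNet's key addition is a differentiable Hilbert layer. A minimal FFT-based version is sketched below (illustrative, not the code in `braindecode/models/htnet.py`); because it is built from FFT operations, it can sit between convolutional layers and be trained end to end:

```python
import torch

def hilbert_amplitude(x: torch.Tensor) -> torch.Tensor:
    """Analytic-signal envelope along the last (time) axis.

    This is the standard FFT-based Hilbert transform: zero out negative
    frequencies, double positive ones, then take the magnitude of the
    inverse FFT to get instantaneous amplitude (spectral power features).
    """
    n = x.shape[-1]
    Xf = torch.fft.fft(x, dim=-1)
    # Step function over frequency bins: keep DC (and Nyquist, if present)
    # as-is, double positive frequencies, zero negative ones.
    h = torch.zeros(n, dtype=Xf.dtype, device=x.device)
    h[0] = 1
    if n % 2 == 0:
        h[n // 2] = 1
        h[1:n // 2] = 2
    else:
        h[1:(n + 1) // 2] = 2
    analytic = torch.fft.ifft(Xf * h, dim=-1)
    return analytic.abs()  # instantaneous amplitude envelope
```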

Training Strategies

| Strategy | Description | Use case |
|---|---|---|
| Within-subject | Train and test on the same subject | Personalized models |
| Inter-subject | Leave-one-subject-out cross-validation | Generalization across participants |
| Transfer learning | Pre-train on one modality, fine-tune on another | Cross-modality adaptation |
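The inter-subject protocol can be sketched with scikit-learn's LeaveOneGroupOut. The helper name `loso_splits` and the uniform trials-per-subject layout are assumptions; the repository's samplers define the real subject grouping:

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut

def loso_splits(n_trials_per_subject, n_subjects):
    """Yield (train_idx, test_idx) pairs for leave-one-subject-out CV.

    Each fold holds out every trial of exactly one subject for testing
    and trains on all remaining subjects.
    """
    groups = np.repeat(np.arange(n_subjects), n_trials_per_subject)
    X_dummy = np.zeros((len(groups), 1))  # only indices matter here
    yield from LeaveOneGroupOut().split(X_dummy, groups=groups)
```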

Data Augmentation

On-the-fly frequency band filtering during training:

  • Enforces learning across different frequency bands
  • Improves model robustness
  • Prevents overfitting to specific spectral features
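A minimal sketch of this augmentation, assuming SciPy filtering; the specific band edges below are illustrative, not the ones used in `braindecode/datautil/`:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

# Illustrative frequency bands (Hz); must all lie below fs / 2.
BANDS = [(0.3, 4.0), (4.0, 8.0), (8.0, 13.0), (13.0, 30.0), (30.0, 60.0)]

def random_band_filter(eeg, fs=128, bands=BANDS, rng=None):
    """On-the-fly augmentation: band-pass one trial into a random band.

    Applied per training batch, this forces the model to extract class
    information from whichever band survives the filter.
    """
    if rng is None:
        rng = np.random.default_rng()
    low, high = bands[rng.integers(len(bands))]
    sos = butter(4, (low, high), btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, eeg, axis=-1)
```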

🎯 Results

Architecture Comparison

Single-trial multiclass decoding

Table 1: Inter-subject classification accuracy across recording modalities

Window Duration Analysis

Key Finding: Longer signal windows yield better performance.

  • Optimal window: T=[0,1]s after movement onset
  • Stride: 250ms overlapping windows
  • Best performance: ~1s after movement onset

Figure 2: Classification accuracy as a function of window duration and position
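The overlapping-window cropping described above can be sketched as follows (`sliding_windows` is an illustrative helper, not the repository's cropping code):

```python
import numpy as np

def sliding_windows(trial, fs=128, win_s=1.0, stride_s=0.25):
    """Crop overlapping windows (e.g. 1 s long, 250 ms stride) from a trial.

    trial: (n_channels, n_samples) array.
    Returns an array of shape (n_windows, n_channels, win_samples).
    """
    win = int(win_s * fs)
    stride = int(stride_s * fs)
    starts = range(0, trial.shape[-1] - win + 1, stride)
    return np.stack([trial[:, s:s + win] for s in starts])
```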

Cross-Modality Performance

Important: Despite having only 11 channels (vs 32-58), dry electrode recordings achieved comparable performance:

  • Gel: 65.3%
  • Water: 62.8%
  • Dry: 58.4% (an ~11% relative drop despite 81% fewer channels)

Transfer Learning Results

| Pre-training | Target | Accuracy improvement |
|---|---|---|
| Gel → Water | Water | +5.2% |
| Gel → Dry | Dry | +7.8% |
| Water → Dry | Dry | +6.1% |

Table 2: Transfer learning improves performance when adapting across recording modalities
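One common fine-tuning recipe for such cross-modality transfer is to give pre-trained layers a small learning rate and the re-initialized classifier a larger one. The sketch below assumes the model exposes `features`/`classifier` submodules (as in the sketches above); this is an illustration, not the repository's actual transfer procedure:

```python
import torch
import torch.nn as nn

def transfer_optimizer(model: nn.Module, lr_backbone=1e-4, lr_head=1e-3):
    """Discriminative learning rates for fine-tuning a pre-trained model.

    Pre-trained feature layers are updated gently; the (typically
    re-initialized) classifier head is updated with a larger step size.
    """
    return torch.optim.Adam([
        {"params": model.features.parameters(), "lr": lr_backbone},
        {"params": model.classifier.parameters(), "lr": lr_head},
    ])
```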


💻 Installation

Prerequisites

  • Python 3.7+
  • PyTorch 1.7+
  • NumPy, SciPy, scikit-learn
  • MNE (for EEG processing)

Setup

```shell
# Clone the repository
git clone https://github.com/mariamonzon/EEG-Grasp-DL-Decoding.git
cd EEG-Grasp-DL-Decoding

# Create a virtual environment (optional)
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt
```

🚀 Usage

Training a Model

```shell
# Train the Vanilla 1D CNN on gel electrode data
python train.py --model vanilla1d --modality gel --subject 1

# Train with inter-subject cross-validation
python train.py --model eegnet --modality water --cross_validation inter_subject

# Train with transfer learning
python train.py --model htnet --pretrain_modality gel --target_modality dry
```

Evaluating a Model

```shell
# Evaluate a trained model
python main_bci.py --model_path checkpoints/best_model.pth --modality gel
```

Data Preprocessing

```python
from braindecode.datasets import load_eeg_data
from braindecode.datautil import preprocess_eeg

# Load and preprocess data
raw_data = load_eeg_data('path/to/dataset')
processed_data = preprocess_eeg(raw_data,
                                low_cutoff=0.3,
                                resample_freq=128,
                                window=[-2, 3])
```

πŸ“ Repository Structure

```
EEG-Grasp-DL-Decoding/
│
├── braindecode/
│   ├── datasets/          # Dataset loading and preprocessing
│   ├── datautil/          # Data utilities and augmentation
│   ├── models/            # Neural network architectures
│   │   ├── vanilla1d.py   # Vanilla 1D CNN
│   │   ├── eegnet.py      # EEGNet implementation
│   │   └── htnet.py       # HTNet implementation
│   ├── training/          # Training loops and strategies
│   ├── samplers/          # Data samplers
│   ├── visualization/     # Plotting and visualization tools
│   ├── classifier.py      # Main classifier wrapper
│   └── util.py            # Utility functions
│
├── main_bci.py            # Main training script
├── train.py               # Training pipeline
├── LICENSE                # MIT License
└── README.md              # This file
```

📚 Citation

If you use this code in your research, please cite:

  1. Schwarz et al. (2018): Decoding natural reach-and-grasp actions from human EEG
  2. Lawhern et al. (2018): EEGNet: A compact convolutional neural network for EEG-based BCIs

πŸ™ Acknowledgments


πŸ“ License

This project is licensed under the MIT License - see the LICENSE file for details.
