👋 Hands-Free Accessibility Through Head Movement & Voice

People with quadriplegia often face barriers when interacting with computers. Inspired by our passion for accessibility and Computer Vision, we created SightSync, a system-level accessibility tool that allows users to control their computer using only head movements and voice commands.

🧠 What It Does

SightSync uses real-time head tracking through a webcam and natural voice commands to drive the cursor and keyboard at the OS level. With no need for physical input devices, users can:

  • Navigate applications
  • Interact with files
  • Control the entire system environment
  • Do it all hands-free, with just head movements and voice commands

🔧 How We Built It

We built a desktop application that runs locally and interprets head movement using computer vision techniques (a MediaPipe face mesh plus cursor-motion optimization). The face mesh follows the user's nose area, translating horizontal and vertical head movements into on-screen pointer motion.
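The core mapping step can be sketched as a small pure function. This is a minimal illustration, not SightSync's actual code: it assumes the face mesh yields a nose landmark in normalized `[0, 1]` coordinates (as MediaPipe provides) and applies a hypothetical `gain` factor so small head movements reach the screen edges.

```python
def nose_to_cursor(nx, ny, screen_w=1920, screen_h=1080, gain=2.0):
    """Map a normalized nose landmark (0..1) to screen pixel coordinates.

    gain > 1 amplifies small head movements so the user does not have
    to turn their head all the way to reach screen edges; the result
    is clamped to stay on screen.
    """
    # Re-center around the middle of the frame, amplify, shift back.
    x = (nx - 0.5) * gain + 0.5
    y = (ny - 0.5) * gain + 0.5
    px = min(max(int(x * screen_w), 0), screen_w - 1)
    py = min(max(int(y * screen_h), 0), screen_h - 1)
    return px, py
```

In a real pipeline, the returned pixel coordinates would be handed to an OS-level mouse hook (e.g. a library like `pyautogui`) each frame.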

On the voice side, we integrated a speech recognition engine that listens for custom commands such as:

  • "click"
  • "scroll down"
  • "open [application]"
  • "type [text]"

and translates them into the corresponding OS-level actions.
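A command grammar like the one above can be parsed with simple pattern matching before dispatch. This is a hedged sketch built only from the example commands listed here; the action names and the `parse_command` helper are hypothetical, and a real dispatcher would route the results to mouse/keyboard hooks.

```python
import re

def parse_command(text):
    """Parse a recognized utterance into an (action, argument) pair.

    Supports the fixed commands "click" and "scroll down", plus the
    parameterized forms "open <application>" and "type <text>".
    """
    text = text.strip().lower()
    if text == "click":
        return ("click", None)
    if text == "scroll down":
        return ("scroll", "down")
    m = re.match(r"open (.+)", text)
    if m:
        return ("open", m.group(1))
    m = re.match(r"type (.+)", text)
    if m:
        return ("type", m.group(1))
    # Unrecognized speech falls through without triggering any action.
    return ("unknown", text)
```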

Key Components:

  • Webcam-Based Facial Tracking
  • Real-Time Voice Commands
  • OS-Level Input Injection (Mouse/Keyboard Hooks)
  • Noise-Resistant ML Command Parsing

💪 Challenges We Ran Into

  • Computer Vision Pupil Tracking: We experimented with pupil tracking but faced consistency issues and out-of-bounds tracker errors.

  • Voice Recognition Accuracy: Ensuring accurate voice command recognition in noisy environments pushed us to explore hybrid offline/online models.

  • Latency: Low-latency interaction was critical. We worked hard to minimize lag between head movement and pointer response for a seamless experience.
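One common way to trade jitter against lag in a tracking pipeline like this is a one-pole exponential moving average over the raw cursor coordinates. The snippet below is an illustrative sketch of that general technique, not SightSync's actual smoothing code; the class name and `alpha` default are assumptions.

```python
class EMASmoother:
    """One-pole exponential smoother for cursor coordinates.

    alpha near 1 tracks head motion quickly (low perceived lag);
    alpha near 0 suppresses landmark jitter at the cost of
    responsiveness.
    """

    def __init__(self, alpha=0.4):
        self.alpha = alpha
        self.state = None  # last smoothed (x, y), None before first frame

    def update(self, x, y):
        if self.state is None:
            # First frame: no history yet, pass through unchanged.
            self.state = (float(x), float(y))
        else:
            sx, sy = self.state
            self.state = (sx + self.alpha * (x - sx),
                          sy + self.alpha * (y - sy))
        return self.state
```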

🙌 Accomplishments We're Proud Of

  • Enabled fully hands-free desktop control for users with limited mobility.
  • Achieved sub-100ms latency in real-time head tracking using optimized computer vision pipelines.
  • Built a modular command system for flexible application control.
  • Created a product that’s genuinely useful for anyone, not just quadriplegic users.

🧠 What We Learned

  • Accessibility isn't just a feature; it's a fundamental design priority.
  • Real-time applications depend heavily on performance optimization and input smoothing.
  • Users with physical disabilities need intuitive systems that reduce mental, not just physical, effort.
  • Designing for accessibility means going beyond "what works" to "what feels empowering."

🌱 What’s Next for SightSync

We're just scratching the surface. Upcoming goals include:

  • Adding eye tracking support for more precise control
  • Developing a mobile version for tablets and smartphones
  • Packaging SightSync as a standalone executable for easier cross-platform installation
  • Integrating AI macros to automate tasks like:
    • Opening emails
    • Joining meetings
    • Even giving voice-driven presentations

We believe SightSync can be more than just an accessibility tool; it can be a path toward digital freedom for anyone with limited mobility.
