Inspiration

What if you had telekinesis — but for technology?

In Stranger Things, Eleven moves objects without touching them. We asked: what if humans could interact with computers the same way? No mouse. No keyboard. Just intent and motion.

AirClick was inspired by sci-fi interfaces where technology responds naturally to human movement. We wanted to rethink how people interact with devices and explore what computing feels like when touch is optional, even across multiple devices.


What It Does

AirClick is a gesture-based control system that lets you control any device using just your hands.

With simple, intuitive gestures, you can:

  • Move the cursor
  • Click and scroll
  • Take screenshots
  • Perform system actions
  • Copy and paste
  • Copy content on one laptop and paste it instantly on another, with no physical connection and no reliance on the Apple ecosystem

AirClick runs in one of two ways: on a Raspberry Pi that connects to your computer over Bluetooth, or as a lightweight software install. Pair or install, and you’re ready to go. No keyboard. No mouse. No remote. Just motion.


Why This Matters

AirClick isn’t just a flashy demo — it solves real problems.

This technology can help:

  • Elderly users who struggle with traditional mice or trackpads
  • People with disabilities that limit fine motor control
  • Anyone controlling a TV or shared screen when the remote is missing
  • Hands-busy situations where touch isn’t ideal, like cooking, snacking, or working hands-free

By removing the need for physical input devices, AirClick makes computing more accessible, flexible, and natural.


How We Built It

AirClick is built using a combination of computer vision, hardware, and real-time networking:

  • Computer vision to detect and classify hand gestures in real time
  • Python for gesture logic, device control, and networking
  • WebSockets to synchronize actions like copy and paste across devices instantly
  • Raspberry Pi as a plug-and-play hardware module that enables gesture control on any laptop
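
The cross-device copy-and-paste piece can be pictured as a small relay server: one device publishes a copy event, and the server rebroadcasts it to every other connected device. The sketch below uses the third-party `websockets` library; the JSON message shape and port are illustrative assumptions, not AirClick’s actual protocol.

```python
# Sketch of a clipboard-sync relay over WebSockets.
# The "clipboard" message format below is an illustrative assumption,
# not AirClick's real wire protocol.
import asyncio
import json

CLIENTS = set()  # all currently connected devices

def encode_clip(text: str, sender: str) -> str:
    """Serialize a clipboard update as a JSON message."""
    return json.dumps({"type": "clipboard", "sender": sender, "text": text})

def decode_clip(raw: str) -> dict:
    """Parse and minimally validate an incoming message."""
    msg = json.loads(raw)
    if msg.get("type") != "clipboard":
        raise ValueError("unexpected message type")
    return msg

async def handler(ws):
    CLIENTS.add(ws)
    try:
        async for raw in ws:
            decode_clip(raw)  # validate before fanning out
            # Rebroadcast the copy event to every *other* device.
            await asyncio.gather(
                *(peer.send(raw) for peer in CLIENTS if peer is not ws)
            )
    finally:
        CLIENTS.discard(ws)

async def main():
    import websockets  # third-party: pip install websockets
    async with websockets.serve(handler, "0.0.0.0", 8765):
        await asyncio.Future()  # run forever

if __name__ == "__main__":
    asyncio.run(main())
```

Each laptop keeps one WebSocket connection open; a paste gesture simply reads the last clipboard message received, which is what makes the sync feel instant.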

The Raspberry Pi runs gesture recognition locally and sends input commands over Bluetooth to the device it’s connected to.
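
As a rough illustration of the recognition step, here is a sketch of classifying a pinch “click” from hand landmarks. It assumes MediaPipe-Hands-style input (21 points with coordinates normalized to [0, 1]); the landmark indices follow MediaPipe’s layout, but the threshold and gesture names are illustrative, not AirClick’s tuned values.

```python
# Sketch: classify a gesture from 21 normalized (x, y) hand landmarks.
# Indices 4 (thumb tip) and 8 (index fingertip) follow MediaPipe's layout;
# the 0.05 pinch threshold is an illustrative value, not AirClick's tuned one.
import math

THUMB_TIP, INDEX_TIP = 4, 8
PINCH_THRESHOLD = 0.05  # normalized-distance cutoff for a "click" pinch

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def classify(landmarks):
    """Return 'click' on a thumb-index pinch, else 'move' (cursor tracks the index tip)."""
    if len(landmarks) != 21:
        raise ValueError("expected 21 hand landmarks")
    if distance(landmarks[THUMB_TIP], landmarks[INDEX_TIP]) < PINCH_THRESHOLD:
        return "click"
    return "move"
```

In the real loop, each camera frame would go through the hand tracker, then a classifier like this, and the resulting action would be executed locally or forwarded over Bluetooth.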


Challenges We Ran Into

  • Gesture recognition was extremely sensitive, requiring extensive tuning to reduce false positives
  • First-time experience working with hardware introduced a steep learning curve
  • Achieving reliable, low-latency communication across multiple devices required careful coordination between client and server
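
The false-positive problem above is commonly tamed by smoothing predictions over a short window instead of acting on every frame. A sketch of a majority-vote debouncer (the window size and gesture names are illustrative assumptions):

```python
# Sketch: suppress flickery single-frame detections by requiring a gesture
# to win a strict majority vote over the last N frames before it fires.
from collections import Counter, deque
from typing import Optional

class GestureDebouncer:
    def __init__(self, window: int = 7):
        self.history = deque(maxlen=window)

    def update(self, raw_gesture: str) -> Optional[str]:
        """Feed one per-frame prediction; return the stable gesture, or None."""
        self.history.append(raw_gesture)
        gesture, count = Counter(self.history).most_common(1)[0]
        if count > self.history.maxlen // 2:
            return gesture
        return None
```

A single stray “click” in a run of “move” frames is outvoted and never fires, at the cost of a few frames of added latency.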

Accomplishments We’re Proud Of

  • Built a fully functional hardware + software system under hackathon constraints
  • Achieved real-time, cross-device copy and paste
  • Made gesture control feel natural, responsive, and intuitive
  • Learned and integrated entirely new tools under time pressure

What We Learned

  • Real-time hand tracking and gesture recognition
  • Hardware-software integration using Raspberry Pi
  • Real-time communication using WebSockets
  • Debugging complex systems involving vision, networking, and hardware

What’s Next for AirClick

  • Expand the gesture set for deeper system control
  • Improve accuracy, customization, and user calibration
  • Support more devices and operating systems
  • Explore accessibility, smart TV, and AR/VR use cases
