Inspiration

Science-fiction movies often show people typing on holographic keyboards floating in mid-air. AirBoard's experience of typing on any surface is inspired by those scenes, along with the virtual keyboard of Apple's Vision Pro.

Problem Statement

In today's fast-paced digital world, traditional keyboards can be limiting, especially for individuals with physical disabilities or those seeking more ergonomic solutions. The need for innovative input mechanisms that offer flexibility, accessibility, and efficiency is more pressing than ever. AirBoard addresses this challenge by transforming hand gestures into keystrokes, offering a seamless and intuitive typing experience without the need for physical keys.

What it does

AirBoard allows typing without a physical keyboard or even a flat surface. It requires only a webcam pointed at the user's hands and a desktop app, making it easy to set up and use.

How we built it

Backend

  • Python & MediaPipe: Utilizes MediaPipe Hand Landmarker for precise real-time hand tracking, capturing 21 key points per hand.
  • Gesture Recognition: Processes hand landmarks to differentiate key presses using finger height and movement patterns.
  • Keystroke Simulation: Uses PyAutoGUI to simulate keyboard input based on the interpreted gestures.
  • WebSocket with FastAPI: Ensures low-latency communication between backend and frontend for real-time data transmission.
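The gesture-recognition step above can be sketched in plain Python. This is a minimal illustration under assumptions, not our exact implementation: the landmark indices follow MediaPipe's hand model (index fingertip = 8, middle joint = 6), and the `margin` threshold is an invented placeholder rather than our tuned value.

```python
# Sketch of detecting a "key press" from MediaPipe-style hand landmarks.
# Assumes normalized image coordinates where y grows downward, so a
# fingertip moving down toward the surface has an increasing y value.
# Indices follow MediaPipe's hand model; the margin is illustrative.

INDEX_TIP = 8   # MediaPipe Hand Landmarker: index fingertip
INDEX_PIP = 6   # MediaPipe Hand Landmarker: index middle joint

def is_pressed(landmarks, margin=0.04):
    """Return True when the fingertip dips clearly below its middle joint."""
    tip_y = landmarks[INDEX_TIP][1]
    pip_y = landmarks[INDEX_PIP][1]
    return tip_y - pip_y > margin

class PressDetector:
    """Fires once per press using a simple rising-edge debounce,
    so a held finger does not repeat the keystroke."""
    def __init__(self):
        self.down = False

    def update(self, landmarks):
        pressed = is_pressed(landmarks)
        fire = pressed and not self.down   # edge: up -> down
        self.down = pressed
        return fire
```

In the real pipeline, a `True` from the detector would trigger the PyAutoGUI keystroke simulation described above.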

Frontend

  • SvelteKit: Powers a dynamic, responsive UI that visualizes the live video feed, hand data, and keystrokes.
  • Real-Time Updates: Integrates WebSocket for instant data display, with robust error handling and reconnection logic.
  • User-Friendly Design: Provides intuitive feedback and interaction, enhancing the virtual typing experience.

By integrating these technologies, AirBoard offers a seamless, innovative solution for virtual typing.

Challenges we ran into

Initially, we tried to fully support touch typing. However, our first approach of training AI models on sample images did not yield satisfactory results, correctly predicting only one character out of five.

Accomplishments that we're proud of

One of our biggest concerns was not being able to type in mid-air. However, after tuning the height thresholds at which key presses are detected, we could type in mid-air! This works because we track no environmental data from the captured frames; we only look at the hands, independently of the surface they rest on.
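The tuned height thresholds decide when a press happens; a separate mapping decides which key was hit. A minimal sketch of such a position-to-key mapping is below; the grid layout and band boundaries are invented placeholders, not AirBoard's actual calibration.

```python
# Sketch of mapping a normalized fingertip position to a virtual key.
# Only the hand's position in the camera frame matters, so the mapping
# is surface-independent. Rows and band splits are illustrative.

ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

def key_at(x, y):
    """Map a fingertip at normalized (x, y) in [0, 1) to a key label.

    y selects the keyboard row (three equal bands); x selects the key
    within that row.
    """
    row = ROWS[min(int(y * len(ROWS)), len(ROWS) - 1)]
    col = min(int(x * len(row)), len(row) - 1)
    return row[col]
```

In the full pipeline, the resulting label could be handed to PyAutoGUI (e.g. `pyautogui.press`) to emit the keystroke.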

What we learned

A good dataset for AI training must have at least some variety. Without variations of each key press, the model never trains on data that resembles real-world use, which inevitably contains some randomness.

Why AirBoard is Exceptional

  • Accessibility: By eliminating the need for physical keys, AirBoard opens up new possibilities for individuals with mobility challenges, making digital interaction more inclusive.
  • Ergonomics: Reducing the strain associated with traditional typing, AirBoard offers a more natural and comfortable way to interact with digital devices.
  • Innovation: Combining state-of-the-art technologies, AirBoard represents a significant leap forward in human-computer interaction, paving the way for future innovations in virtual input methods.

What's next for AirBoard

AirBoard could be integrated into augmented reality products, such as glasses, making it possible to type as well in AR as on a real keyboard. AirBoard can also make typing more accessible: it works in a variety of positions and on many surfaces, or none at all, making it more versatile than a normal keyboard.

Built With

  • fastapi
  • mediapipe
  • pyautogui
  • python
  • sveltekit
