Inspiration
We were inspired to make technology more accessible for everyone, especially people with disabilities. Many individuals, such as those who are paralyzed, struggle to use traditional input devices like a mouse or keyboard. We wanted to build a solution that empowers them while also making everyday computing easier for students, professionals, and coders.
What it does
Our project is a hands-free mouse and voice assistant designed with accessibility and innovation in mind. The eye-tracking system replaces a physical mouse: users move the cursor by looking at the screen and click by blinking. For people with limited mobility, such as those who are paralyzed, this enables independent computer use. For everyday users, it frees up a hand and supports multitasking. Our voice assistant, Hoya, can open websites like YouTube, Google, or Amazon with simple spoken commands, adding another layer of hands-free convenience.
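The command handling above can be sketched as a simple keyword match that launches a browser. This is an illustrative sketch only, not the project's code: the `SITES` table and the `parse_command`/`handle_command` names are hypothetical, and a real assistant would feed `parse_command` the text returned by a speech recognizer.

```python
# Hypothetical sketch: turning a recognized spoken phrase into a website launch.
# SITES, parse_command, and handle_command are illustrative names, not Hoya's API.
import webbrowser

SITES = {
    "youtube": "https://www.youtube.com",
    "google": "https://www.google.com",
    "amazon": "https://www.amazon.com",
}

def parse_command(text):
    """Return the URL for a spoken command like "open YouTube", or None."""
    words = text.lower().split()
    if "open" in words:
        for name, url in SITES.items():
            if name in words:
                return url
    return None

def handle_command(text):
    """Open the matched site in the default browser; report whether we acted."""
    url = parse_command(text)
    if url is not None:
        webbrowser.open(url)
        return True
    return False
```

Keeping the parsing separate from the browser call makes the matching logic easy to test and to extend with new commands.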
How we built it
We built the eye-tracking system with OpenCV and MediaPipe, using facial landmarks to track eye movement and detect blinks. The voice assistant combines Python speech-recognition and automation libraries to handle spoken commands, and PyAutoGUI simulates mouse movement and clicks. To keep the system user-friendly, we first designed a GUI in Tkinter and later refined it for a smoother experience.
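Blink detection from facial landmarks is commonly done with the eye aspect ratio (EAR), which drops sharply when the eye closes. A minimal sketch, assuming six (x, y) landmarks per eye in the usual EAR ordering; the 0.21 threshold is an illustrative value, not the project's tuned constant:

```python
# Hypothetical sketch of EAR-based blink detection over eye landmarks
# (e.g. from MediaPipe Face Mesh). The threshold here is illustrative.
import math

def _dist(a, b):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks (p1..p6) around one eye.
    EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|): vertical openings over width."""
    p1, p2, p3, p4, p5, p6 = eye
    return (_dist(p2, p6) + _dist(p3, p5)) / (2.0 * _dist(p1, p4))

EAR_THRESHOLD = 0.21  # below this, treat the eye as closed

def is_blinking(eye):
    return eye_aspect_ratio(eye) < EAR_THRESHOLD
```

In the full pipeline, a gaze estimate from the landmarks would drive the cursor (e.g. via `pyautogui.moveTo`) and a detected blink would trigger `pyautogui.click()`.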
Challenges we ran into
- Accuracy in eye-tracking: calibrating the system to detect where users are looking was tricky and required extensive testing.
- Blink detection: differentiating intentional blinks for clicks from natural blinking needed careful fine-tuning.
- Voice command integration: making the voice assistant reliable while minimizing false triggers took debugging and optimization.
- Balancing accessibility and functionality: the system had to work seamlessly for both people with disabilities and everyday users.
Accomplishments that we're proud of
- Successfully creating a fully functional hands-free mouse that serves both accessibility and productivity.
- Building a voice assistant that can execute commands like opening websites effortlessly.
- Developing a project that can genuinely empower people with disabilities, such as those who are paralyzed, by giving them independence.
- Combining two advanced technologies, eye-tracking and voice control, into one cohesive, user-friendly system.
What we learned
- How to integrate multiple technologies, like OpenCV, MediaPipe, and speech recognition, into one project.
- The importance of user testing, especially when designing for accessibility.
- How to balance technical complexity with user-friendly functionality.
- The challenges and possibilities of creating technology that makes a real difference in people's lives.
What's next for Hoya Helper
- Improved eye-tracking accuracy: calibration settings to personalize the system for individual users.
- Expanded voice commands: more advanced features like controlling media playback, sending emails, or navigating documents.
- Mobile compatibility: adapting the system for smartphones and tablets to make it even more accessible.
- Accessibility upgrades: additional features for users with other disabilities, such as customizable click sensitivity or eye-control typing.
- Open-source release: sharing the project with the developer community to inspire others to build on it.
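One common way to handle the blink-differentiation challenge above is to require the eye to stay closed for a minimum number of consecutive frames before counting a click. A hypothetical sketch (the `BlinkClicker` class and its frame thresholds are illustrative, not the project's implementation):

```python
# Hypothetical sketch: separating deliberate "click" blinks from natural ones
# by how many consecutive frames the eye stays closed. Thresholds are illustrative.
class BlinkClicker:
    def __init__(self, min_frames=3, max_frames=15):
        self.min_frames = min_frames  # shorter closures are natural blinks
        self.max_frames = max_frames  # much longer closures are rest, not clicks
        self._closed = 0              # consecutive closed-eye frames so far

    def update(self, eye_closed):
        """Feed one frame's closed/open state; return True when a click completes."""
        if eye_closed:
            self._closed += 1
            return False
        click = self.min_frames <= self._closed <= self.max_frames
        self._closed = 0  # eye reopened: reset the streak
        return click
```

Per-user tuning of `min_frames`/`max_frames` is one way the calibration settings mentioned under "What's next" could be exposed.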
Built With
- cv
- mediapipe
- pyaudio
- python
- speech-recognition
- tkinter