Inspiration

Our goal was to explore new interactive uses for computer vision, with a focus on helping people who face limitations in accessing technology. We found that while there are apps that use voice recognition or motion tracking, few cater to users who cannot use all their limbs. Hence, we created MindMouse.

What it does

Our project tracks the user's face in real time and converts horizontal head tilt and vertical head motion into cursor movement on the computer. Users click by closing their eyes until a confirmation sound plays.
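The click mechanic above needs to distinguish a deliberate, sustained eye closure from an ordinary blink. A minimal sketch of that debounce logic, assuming a hypothetical `BlinkClicker` class fed one boolean per frame (the name, frame count, and threshold are illustrative, not the project's actual code):

```python
class BlinkClicker:
    """Debounce eye closure so only a sustained close fires a click.

    A click fires exactly once, after the eyes have stayed closed for
    `hold_frames` consecutive frames; brief blinks are ignored.
    """

    def __init__(self, hold_frames=15):
        self.hold_frames = hold_frames
        self.closed_count = 0  # consecutive closed-eye frames seen so far

    def update(self, eyes_closed):
        """Feed one frame's state; return True when a click should fire."""
        if eyes_closed:
            self.closed_count += 1
            if self.closed_count == self.hold_frames:
                return True  # fire once; further closed frames return False
        else:
            self.closed_count = 0  # eyes opened: reset the counter
        return False
```

In a real loop, a `True` return would trigger `pyautogui.click()` and the confirmation sound.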

How we built it

MindMouse was built in Python using OpenCV, PyAutoGUI, NumPy, Playsound, and the standard-library math module. The program uses OpenCV to detect the user's face and calculates the difference in height between the eyes to determine the tilt direction. PyAutoGUI moves the cursor, and sustained eye closure registers a click.
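The tilt calculation described above boils down to comparing the vertical positions of the two eyes. A minimal sketch, assuming eye centers as `(x, y)` pixel coordinates from the detector (the function name and deadzone value are hypothetical):

```python
import math

def tilt_direction(left_eye, right_eye, deadzone_deg=5.0):
    """Classify head tilt from two eye centers in image coordinates.

    Uses the angle of the line joining the eyes; since y grows downward
    in images, a positive angle means the right eye sits lower, i.e. a
    rightward tilt. A small deadzone ignores natural head jitter.
    """
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    angle = math.degrees(math.atan2(dy, dx))
    if angle > deadzone_deg:
        return "right"
    if angle < -deadzone_deg:
        return "left"
    return "center"
```

The returned direction would then drive a relative cursor move such as `pyautogui.moveRel(step, 0)`.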

Challenges

Initially, we planned to use head location, but this required moving the whole upper body, defeating the purpose. We then tried face orientation, but this was not effective. Our final solution was to use face tilt while keeping the head otherwise steady.

Proud Accomplishments

We're proud of solving a complex task with a small number of libraries, and of working out the angle and threshold math for cursor control. Our project detects head motion accurately and efficiently, with minimal delay in cursor movement. It runs on any computer with a webcam and could be integrated into the OS as well, ensuring interoperability. Because it requires no expensive eye-tracking hardware, just a webcam, the model stays accessible and inexpensive. We implemented audio feedback to let users know when they have clicked a button, and no speech input is required, making it accessible to a larger target audience.

What we learned

We gained a deeper understanding of the OpenCV library, Haar cascade classifiers, and programmatic UI control, such as moving the cursor and triggering clicks.

Future Plans

Our future plans include training a custom cascade classifier for more accurate detection of motion and presence. We also plan to integrate a virtual keyboard and mouse acceleration based on the degree of head tilt or displacement.
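The planned acceleration could map tilt magnitude to cursor speed. A minimal sketch, assuming a deadzone plus a capped linear curve (the function name, gain, and limits are illustrative assumptions, not a committed design):

```python
def cursor_speed(tilt_deg, deadzone=5.0, gain=1.5, max_speed=40.0):
    """Map head-tilt magnitude (degrees) to cursor speed (pixels/frame).

    Zero inside the deadzone, then linear in the excess tilt, capped at
    max_speed so extreme tilts stay controllable.
    """
    excess = abs(tilt_deg) - deadzone
    if excess <= 0:
        return 0.0
    return min(gain * excess, max_speed)
```

A gentle tilt would then nudge the cursor slowly, while a pronounced tilt moves it quickly up to the cap.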

Built With

Python, OpenCV, PyAutoGUI, NumPy, Playsound