Inspiration

The inspiration for Touchboard came from the shortcomings of existing virtual keyboards: their keys sit in fixed positions, and the key sizes often don't match the user's fingers. While looking over the available hardware, we decided that a relative keyboard would be an interesting hack: one that changes size based on the distances between fingertips and the size of the hand.

What it does

Touchboard lets users settle their fingers and hands into their most relaxed positions and builds a virtual keyboard that conforms to them. It recognizes left and right hands in any orientation, including slanted or upside-down, and maps the keys accordingly.

How we built it

We built it in Python on a Synaptics touchpad, computing hand angles and forming the virtual keyboard from the zones where the sensors are activated.

Challenges we ran into

Figuring out the angle of movement when hand positions change, and getting the touchpad to work at all after downloading the driver.

Accomplishments that we're proud of

We're proud of working with a new piece of hardware none of us had used before, and of understanding the hardware's backend well enough to learn how it works with Python to create a server port.

What we learned

We learned how the touchpad reports fingertip touches as a 2D array, and how to convert those pixel locations into keyboard characters.

What's next for Touchboard

We aim to create a Touchboard version 2.0 that rescales the character sizes on the board as the hands rotate, instead of a static board that only changes direction.
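The pipeline described above — resting fingertips defining the key zones, hand angle recovered from their geometry, and raw touch pixels resolved to characters — could be sketched roughly like this. All names, the four-finger home row, and the nearest-zone rule are illustrative assumptions, not the actual Touchboard code:

```python
import math

# Hypothetical calibration: home-row keys anchored at each resting
# fingertip's (x, y) position (left hand shown for brevity).
HOME_KEYS = ["a", "s", "d", "f"]

def calibrate(fingertips):
    """Pair each resting fingertip (x, y) with a home-row key,
    ordered left to right so a slanted hand still maps sensibly."""
    ordered = sorted(fingertips)  # sort by x, then y
    return dict(zip(HOME_KEYS, ordered))

def hand_angle(fingertips):
    """Rough hand orientation: angle of the line from the leftmost
    to the rightmost resting fingertip, in degrees."""
    (x0, y0), (x1, y1) = min(fingertips), max(fingertips)
    return math.degrees(math.atan2(y1 - y0, x1 - x0))

def key_for_touch(point, layout):
    """Map a raw touch (x, y) from the sensor's 2D array to the
    key whose calibrated zone center is nearest."""
    def dist2(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    return min(layout, key=lambda k: dist2(point, layout[k]))

# Usage: calibrate from four resting fingertips, then resolve a touch.
rest = [(10, 40), (30, 42), (50, 44), (70, 46)]
layout = calibrate(rest)
print(key_for_touch((32, 41), layout))  # nearest home position -> "s"
print(round(hand_angle(rest), 1))       # slight slant, ~5.7 degrees
```

A nearest-zone lookup like this naturally adapts to hand size: spreading the fingers apart during calibration enlarges every key zone at once.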

Built With

python, synaptics touchpad
