I-M Possible 2021

Inspiration

Online classes inspired this project the most. So here goes the story!

Everybody is locked in. Online classes are the norm. Everybody is bored. It's time to stream some movies and chill out.

But of course, how do you chill out? Sitting or lazing around in awkward positions.

Sorry to be a spoilsport, but that affects your spine and muscles!!

So what did we do about it? Find out!!

What it does

It's software that uses existing tech (webcams) on laptops and pairs it with hand-gesture recognition. So now you can control media using only your hands.

And not just media: we also made one for the internet.

So now you can swish around as much as you like and feel like Professor X from X-Men (the guy who can move stuff using his finger alone).

How we built it

We chose Python for its diversity of library options. We mainly used MediaPipe: its pre-trained hand and body recognizers handled detection, and we used a bit of math to figure out the proper coordinate mappings.
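The "bit of math" is essentially mapping MediaPipe's normalized landmark coordinates (each in [0, 1]) to screen pixels. Here is a minimal sketch of that idea, with an exponential-smoothing step to reduce jitter; the function names and the smoothing factor are our illustration, not the project's exact code:

```python
# Map a MediaPipe-style normalized landmark (x, y in [0, 1]) to
# screen-pixel coordinates, plus simple exponential smoothing to
# reduce cursor jitter between frames.

def landmark_to_screen(norm_x, norm_y, screen_w, screen_h):
    """Scale normalized [0, 1] coordinates to pixels, clamping so the
    result always stays on-screen."""
    x = min(max(norm_x, 0.0), 1.0) * (screen_w - 1)
    y = min(max(norm_y, 0.0), 1.0) * (screen_h - 1)
    return int(round(x)), int(round(y))

def smooth(prev, new, alpha=0.3):
    """Exponential moving average: higher alpha = more responsive,
    lower alpha = steadier cursor."""
    return prev + alpha * (new - prev)

# Example: index fingertip reported at (0.5, 0.25) on a 1920x1080 screen.
print(landmark_to_screen(0.5, 0.25, 1920, 1080))  # → (960, 270)
```

In practice the output pixel would then be handed to a cursor- or scroll-control call, with `smooth` applied per axis across frames.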

Challenges we ran into

We ran into a few problems on our way.

  • Configuring the coordinates for web scrolling (a bit of math)
  • Media player adaptation (different gestures behave differently across players: VLC, QuickTime, etc.)
  • Getting things to work the way we want them to
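Turning continuous hand landmarks into discrete media commands is one place where those adaptation problems show up. A typical approach, sketched here purely as an illustration (the pinch threshold and the gesture-to-action mapping are assumptions, not the project's actual configuration):

```python
import math

# Illustrative sketch: turn two MediaPipe-style landmarks (thumb tip and
# index fingertip, normalized coordinates) into a discrete "pinch"
# gesture that a controller could map to a media action.
# The 0.05 threshold is an assumption, tuned per camera in practice.

PINCH_THRESHOLD = 0.05

def distance(a, b):
    """Euclidean distance between two (x, y) normalized landmarks."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def detect_gesture(thumb_tip, index_tip):
    """Return 'pinch' when the fingertips nearly touch, else 'open'."""
    if distance(thumb_tip, index_tip) < PINCH_THRESHOLD:
        return "pinch"
    return "open"

# Hypothetical mapping from gestures to media-player actions; this is
# exactly the part that has to be adapted per player (VLC, QuickTime, ...).
GESTURE_ACTIONS = {"pinch": "play_pause", "open": None}

print(detect_gesture((0.40, 0.50), (0.42, 0.51)))  # fingertips close → pinch
```

Because each player responds to different keys or shortcuts, only the `GESTURE_ACTIONS` table (and how actions get dispatched) needs to change per player, while the detection logic stays the same.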

Accomplishments that we're proud of

We were able to browse the web and consume media more comfortably using our program. We were also able to work through our differences and our code errors. We worked as a team, and here is the finished product. We hope it can bring a little change to the world.

What we learned

We learnt a lot about MediaPipe and how computer vision works. The mapping of our hands to different coordinates, and the precision of the system, really amazed us.

What's next for UseAbylity

We aim to package this software into an easy-to-install, working product that can hopefully correct people's posture while also being convenient to use.

Built With

  • python
  • mediapipe
