Inspiration
All of us love playing volleyball, and setting was something that took a really long time for us to figure out. There are so many factors to take into consideration, and practicing on your own doesn't cut it because things have to work seamlessly with the attacker as well. We wanted to make it easier to practice setting by providing an app that evaluates how good a set is and points out what would make it better.
What it does
VBall Trainer uses object and skeleton detection to determine how "good" your set is—whether your form needs tweaking, if the ball is too high, or if the path of the ball is off. VBall Trainer runs a setting drill, and analyzes your form and the curve of the ball to help you improve!
How we built it
We used Mediapipe for object and skeleton detection, which locates and tracks key points on your body (e.g., thumbs, elbows, torso, legs) as well as the ball. We trig bashed the coordinates of the elbows, shoulders, and thumbs to check setting form, and tested all kinds of gestures to make sure we could reliably recognize a proper set. To model the 3D trajectory, we took pictures of the volleyball at different distances, then fit a linear regression so we could estimate where the ball would be at a given time.
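To give an idea of the form check, here is a minimal sketch (not our exact code) of computing a joint angle from Mediapipe pose landmarks with basic trig; the choice of left-arm landmarks and the printout are illustrative only:

```python
import math

import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose


def joint_angle(a, b, c):
    """Angle (degrees) at landmark b formed by segments b->a and b->c."""
    v1 = (a.x - b.x, a.y - b.y)
    v2 = (c.x - b.x, c.y - b.y)
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))


cap = cv2.VideoCapture(0)
with mp_pose.Pose() as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # Mediapipe expects RGB; OpenCV captures BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            lm = results.pose_landmarks.landmark
            elbow = joint_angle(
                lm[mp_pose.PoseLandmark.LEFT_SHOULDER],
                lm[mp_pose.PoseLandmark.LEFT_ELBOW],
                lm[mp_pose.PoseLandmark.LEFT_WRIST],
            )
            # A real form check would compare this against tuned thresholds.
            print(f"Left elbow angle: {elbow:.0f} deg")
cap.release()
```

For the trajectory side, one plausible reading of the calibration step (a sketch under assumptions, with placeholder numbers rather than our measured data) is regressing known distances against the inverse of the ball's apparent radius, which is roughly linear under a pinhole camera model:

```python
import numpy as np

# Placeholder calibration data: known distances and measured pixel radii.
distances_m = np.array([1.0, 1.5, 2.0, 3.0, 4.0])
radii_px = np.array([120.0, 80.0, 60.0, 40.0, 30.0])

# Fit distance ~ slope * (1 / radius) + intercept.
slope, intercept = np.polyfit(1.0 / radii_px, distances_m, 1)


def estimate_depth(radius_px: float) -> float:
    """Estimate ball distance (m) from its apparent radius in pixels."""
    return slope / radius_px + intercept


print(estimate_depth(50.0))  # estimated depth for a 50 px radius detection
```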
Challenges we ran into
We went through some turbulence when training the model: the datasets we were using weren't the most accurate, so the model wasn't as clean as we wanted it to be. On top of that, the Wi-Fi speed slowed down upload times, which wasn't convenient given the time crunch we were in. The set detection also required a lot of 3D math that was a pain to work out. We ended up having to retrain the model multiple times, and sometimes it got so frustrating that we wanted to change direction altogether. Another obstacle that really set us back was getting a phone to work as a webcam: Apple's Continuity Camera never worked for us, and we ended up switching to a third-party app that finally did.
Accomplishments that we're proud of
We were really happy when the set detection finally came together. We needed to make sure the measurements were scaled so our code would work for people of different body shapes standing at different distances from the camera, and that it would work no matter which direction the setter was facing. And since the phone-webcam nightmare took so long to fix, we were so relieved when it finally started working.
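One way to picture the scaling idea (a rough sketch, not our actual implementation) is to express every measurement in shoulder-width units before comparing it to a threshold, so the check is independent of body size and camera distance; the landmark names below are Mediapipe pose landmarks:

```python
import math

import mediapipe as mp

mp_pose = mp.solutions.pose


def dist(a, b):
    """2D distance between two normalized landmarks."""
    return math.hypot(a.x - b.x, a.y - b.y)


def normalized_thumb_gap(lm):
    """Thumb-to-thumb gap measured in shoulder widths (scale-invariant)."""
    shoulder_width = dist(lm[mp_pose.PoseLandmark.LEFT_SHOULDER],
                          lm[mp_pose.PoseLandmark.RIGHT_SHOULDER])
    thumb_gap = dist(lm[mp_pose.PoseLandmark.LEFT_THUMB],
                     lm[mp_pose.PoseLandmark.RIGHT_THUMB])
    return thumb_gap / shoulder_width
```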
What we learned
On the frontend, we became more familiar with PyQt5 and the resources available to us. We had to brush up on trig and Mediapipe to detect the sets, and integrating the frontend with the backend was a learning experience as well. We learned the importance of finding a balance between working together and dividing and conquering so we could work as efficiently as possible.
What's next for VBall Trainer
Currently, VBall Trainer only trains setting and bumping. We'd like to expand it to train spiking, digging, and more!
Built With
- cv2
- mediapipe
- pyqt5
- python
- qtdesigner