💡 Inspiration

Trash to the bin is the old days; bin to the trash is the new mindset. You toss a paper ball across the room. It misses the bin completely. Normally, you’d sigh, get up, and pick it up… but not today. The bin suddenly locks eyes (well, camera) on your trash, revs its little wheels, and races across the floor faster than Verstappen chasing pole. Before you know it, the trash is caught, and the bin is already on to its next mission. That silly moment sparked our idea: what if cleanup could be as exciting as an F1 race? What if all the trash yeeters actually had good aim? What if I hadn't missed all my trash-throwing shots at my last work term? I wonder...

🏎️ What it does

Meet Max Ver-Trash-Bin, the Formula 1–inspired smart trashcan. Powered by computer vision and Arduino-controlled wheels, Max detects falling or thrown trash and races toward it at lightning speed. Whether it’s a crumpled paper ball or an empty can, this bin won’t let your aim (or lack of it) go to waste.

🛠️ How we built it

We started by building the custom drivetrain of the trashcan. It consists of:

  • two DC motors with custom encoders to track the position of each motor
  • an Arduino to run the PID control loop
  • a Raspberry Pi for onboard computer vision
  • a wide-angle camera

We then used Python and OpenCV to track incoming trash and produce a trajectory vector. The Arduino takes in the vector and moves the trash can in that direction.

To be inclusive of those who love driving manual cars, we also added a manual drive feature so you can experience the fun of collecting trash yourself.

👾 Challenges we ran into

A major challenge was achieving precise control of the bin’s position. It was not enough for the bin to simply move in the right direction; it needed to arrive at an exact location to reliably intercept trash. This required working with degree-level accuracy for both rotation and linear movement, which introduced significant complexity in calibration and tuning. Moreover, the speed of the trash can was an issue, as aggressive and competitive trash yeeters would be unable to experience the joy of actually having good aim. These were all hardware-constrained issues, since the motor had a high-torque, low-speed gearbox and a low-resolution optical encoder.

To help with this, we implemented a closed-loop PID feedback system that continuously monitored the bin’s position and adjusted motor outputs in real time. By integrating encoder data with control algorithms, we were able to correct for errors such as wheel slippage or overshooting. To solve the speed issue, we decided to gamble our life savings (hackathon hard work) and over-volt the motor and its driver by 5 volts. This gave us an astounding 50 RPM!
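The closed-loop idea is standard PID: compare the encoder-measured position to the target, and combine proportional, integral, and derivative terms of the error into a motor command. Here is a Python sketch of that loop (the real firmware runs on the Arduino in C++; gains and names here are illustrative, not our tuned values).

```python
class PID:
    """Textbook PID controller: output = kp*e + ki*∫e dt + kd*de/dt."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured, dt):
        """Return a motor output that drives `measured` toward `setpoint`."""
        error = setpoint - measured
        self.integral += error * dt                      # accumulate steady-state error
        derivative = (error - self.prev_error) / dt      # damp overshoot
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Each control-loop iteration, the encoder counts are converted to a position estimate for `measured`, and the returned value is clamped to the motor driver's PWM range.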

Being a hardware project, we did not have a debug button IRL; instead we had to move the robot from the testing area to a monitor, plug it in, and see what it sees. Any changes then had to be made directly on the robot. Cursor? Not a chance, unless you want to spend an hour opening a website and logging in on a slow single-board computer.

Finally, the slow Raspberry Pi could only process frames at 10 FPS, which was nowhere near enough to recognize trash, track it, predict its velocity, and collect it. We partly skirted this issue by implementing a guessing feature into the model. It isn't fully robust, but it's good for now!
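The "guessing" between sparse frames amounts to dead reckoning: assume roughly constant velocity over the short gap between 10 FPS frames and extrapolate where the trash will cross the floor line. A hedged sketch of that idea (hypothetical names; a real version would also account for gravity and drag):

```python
def predict_landing_x(x, y, vx, vy, floor_y):
    """Linearly extrapolate where the trash crosses the floor line.

    Constant-velocity guess between camera frames: positions in pixels,
    velocities in pixels per frame, y increasing downward toward floor_y.
    """
    if vy <= 0:                   # not falling toward the floor; no guess
        return None
    t = (floor_y - y) / vy        # frames until the floor line is reached
    return x + vx * t
```

At 10 FPS this guess only has to hold up for a few frames at a time, which is why a crude linear model was good enough for the demo.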

🤖 Accomplishments that we're proud of

One accomplishment we are particularly proud of is successfully building an end-to-end system that connects computer vision, control algorithms, and hardware into a working prototype. Seeing the bin move in response to detected trash confirmed that our design decisions were effective and that the feedback loop approach worked in practice.

Beyond the technical aspects, we are proud that our project demonstrates how engineering can make everyday tasks both more efficient and engaging. Transforming a simple idea into a functional prototype under time constraints was a rewarding experience that highlighted the value of collaboration and creativity.

📈 What's next for Max Ver-Trash-Bin

Our next steps focus on two main areas of improvement:

  • Improved precision and projectile prediction: Refine motor control and calibration for smoother, more reliable positioning, while also developing trajectory estimation models so the bin can anticipate where trash will land and move proactively.
  • Enhanced detection: Train a more advanced model to distinguish between trash and non-trash objects, while also handling multiple items simultaneously.
  • Upgraded hardware (motors and onboard compute): This was our most restrictive factor; fixing it would give our software a huge tolerance and our cherished trash yeeters a better experience.
