Inspiration

The inspiration started with a pun and spiraled out of control. We realized that most line-following robots are boring perfectionists: they either follow the line perfectly or fail miserably. We wanted a robot with personality. Named after the galaxy’s most reliable co-pilot (R2-D2), R2Detour was born from our desire to build a bot that might take the scenic route but always gets the job done. Plus, we really just wanted an excuse to make a robot that screams when it loses the line.

What it does

R2Detour is an autonomous line-following robot that navigates high-contrast tracks (coloured tape on white floors) using infrared sensors. But it’s not just a blind follower; it uses a PID (Proportional-Integral-Derivative) control loop to adjust its motor speed dynamically, ensuring smooth turns rather than jerky "bang-bang" movements.

When it detects an intersection or a gap in the line, it makes "decisions" (mostly correct ones) to find its way back. If it gets completely lost, it spins in a circle and flashes a "Help Me Obi-Wan" LED sequence.
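A minimal sketch of how that decision-making might look, assuming a 5-channel array where each channel reports whether it sees the dark line (the enum and function names here are illustrative, not our actual firmware):

```cpp
// Possible reactions to one reading of the 5-channel IR array.
// (Illustrative; the real sketch also drives the "Help Me Obi-Wan" LEDs.)
enum Action { FOLLOW, INTERSECTION, SEARCH_SPIN };

// sensors[i] is true when channel i sees the dark line.
Action decide(const bool sensors[5]) {
    int active = 0;
    for (int i = 0; i < 5; ++i) active += sensors[i];
    if (active == 0) return SEARCH_SPIN;   // line lost: spin and flash for help
    if (active >= 4) return INTERSECTION;  // wide dark band: likely a crossing
    return FOLLOW;                         // normal line tracking
}
```

The thresholds (0 active channels for "lost", 4+ for "intersection") are placeholders; on the real track they would need tuning against tape width and sensor spacing.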

How we built it

We Frankenstein-ed this droid together using:

  • The Brains: An Arduino Uno running our custom C++ navigation code.
  • The Eyes: A 5-channel IR Sensor Array (TCRT5000) to detect track reflectivity.
  • The Muscle: Two high-torque DC motors driven by an L298N H-Bridge motor driver.
  • The Power: A rechargeable 9V Li-Po battery pack.
  • The Chassis: Laser-cut acrylic that we definitely measured twice and cut once (okay, maybe we cut three times).

We wrote a custom PID algorithm to calculate the error value based on which sensors were triggering, allowing the robot to correct its course in microseconds.
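The idea above, as a hedged sketch: a weighted-position error derived from which sensors are triggering, fed through one PID step whose output is added to one motor's PWM and subtracted from the other's. Function names, weights, and gains are illustrative, not our tuned values:

```cpp
// Weighted-position error from a 5-channel IR array: each triggered
// sensor pulls the error toward its offset from centre (-2 .. +2).
float lineError(const bool sensors[5]) {
    static const float weights[5] = {-2, -1, 0, 1, 2};
    float sum = 0;
    int count = 0;
    for (int i = 0; i < 5; ++i)
        if (sensors[i]) { sum += weights[i]; ++count; }
    return count ? sum / count : 0;  // 0 when no sensor sees the line
}

// One PID step: proportional + accumulated integral + rate of change.
// The caller keeps `integral` and `prevError` alive between loop() calls.
float pidStep(float error, float &integral, float &prevError,
              float kp, float ki, float kd) {
    integral += error;
    float derivative = error - prevError;
    prevError = error;
    return kp * error + ki * integral + kd * derivative;
}
```

In the main loop, the returned correction would steer the robot by biasing the two motor PWM values in opposite directions.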

Data Analysis with MongoDB Atlas

We didn't just want a robot; we wanted a data-driven droid. To debug our PID logic remotely, we piped real-time telemetry data (motor PWM values, sensor states, and error coefficients) directly into MongoDB Atlas.

  • Time Series Collections: We utilized Atlas's native Time Series collections to efficiently store high-frequency sensor data. This allowed us to ingest thousands of data points per run without worrying about schema rigidity or write-latency bottlenecks.
  • Remote Visualization: Using MongoDB Charts, we built a dashboard to visualize the "wobble" (oscillation) of the robot. This allowed us to tune our Kp and Kd values based on historical run data rather than guessing.
  • Aggregation Pipelines: We wrote custom aggregation pipelines to analyze run efficiency, identifying exactly which track segments caused the highest battery drain and calculating the standard deviation of our line-hold accuracy.
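As a rough sketch of the shape this took (collection, field, and run names here are illustrative, not our actual schema), the time series setup and one such pipeline in mongosh look roughly like:

```javascript
// Create a Time Series collection keyed on a timestamp field,
// with the run ID as metadata (illustrative names).
db.createCollection("telemetry", {
  timeseries: { timeField: "ts", metaField: "runId", granularity: "seconds" }
});

// Per-segment wobble analysis for one run: average motor PWM and the
// standard deviation of the line-hold error, worst segments first.
db.telemetry.aggregate([
  { $match: { runId: "run-042" } },
  { $group: {
      _id: "$segment",
      avgPwm: { $avg: "$motorPwm" },
      errStdDev: { $stdDevSamp: "$error" }
  } },
  { $sort: { errStdDev: -1 } }
]);
```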

On-Chain Verification with Solana

To prevent "modification" of race times (cheating), we implemented a "Proof of Run" system using the Solana Blockchain.

  • NFT Minting: Upon completing the course, R2Detour triggers a camera to snap a "finish line" photo. We use the Metaplex standard to mint this image as an NFT directly on the Solana mainnet.
  • Immutable Metadata: Crucially, the run's metadata (total time in milliseconds, average velocity, and the PID configuration hash) is injected into the NFT's JSON metadata schema.
  • Decentralized Storage: The actual image asset is stored on Arweave for permanent, censorship-resistant hosting, while the hash is committed to Solana. This ensures that our high score is cryptographically verifiable and immutable. We chose Solana for its high throughput and low gas fees, ensuring we don't go bankrupt minting test runs.
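For flavour, the injected metadata follows the usual Metaplex off-chain JSON shape; a sketch with placeholder values (the attribute names and numbers below are illustrative, not a real run):

```json
{
  "name": "R2Detour Proof of Run",
  "symbol": "R2D",
  "description": "Finish-line photo and telemetry for one verified course completion.",
  "image": "https://arweave.net/<asset-hash>",
  "attributes": [
    { "trait_type": "total_time_ms", "value": "<run-time>" },
    { "trait_type": "avg_velocity", "value": "<metres-per-second>" },
    { "trait_type": "pid_config_hash", "value": "<hash-of-kp-ki-kd>" }
  ]
}
```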

Challenges we ran into

  • Lighting Woes: We learned the hard way that IR sensors hate natural and reflective light. In bright rooms, R2Detour thought the entire room was a black line. We had to build a custom "skirt" out of electrical tape to shield the sensors.
  • PID Tuning Hell: Tuning the PID constants (Kp, Ki, Kd) was like trying to balance a pencil on its tip during an earthquake. For hours, the bot either oscillated like it had had too much coffee or reacted so slowly it drove straight off the table.
  • Cable Management: At one point, it looked less like a robot and more like a spaghetti monster. A liberal application of zip ties saved the day.

Accomplishments that we're proud of

  • We achieved a stable PID loop! Watching it hug a curve without wobbling for the first time felt better than winning the lottery.
  • We managed to overclock the PWM signals to get 15% more speed out of the motors without frying the L298N driver.
  • Best of all, it didn't explode: always a win in hardware hacking.

What we learned

  • Hardware is hard. Unlike code, you can't Ctrl+Z a burnt motor driver.
  • Noise is everywhere. We learned to implement a running-average filter to smooth out jittery IR sensor readings.
  • Physics doesn't care about your code. Friction and momentum are real, and no amount of elegant software can fix a wheel that keeps falling off.
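A running-average filter like the one above can be sketched as a small ring buffer; the window size and class name here are illustrative (note the window warms up from zeros for the first N samples):

```cpp
// Fixed-window running average: each new raw reading replaces the
// oldest sample, and the reported value is the mean of the window.
template <int N>
class RunningAverage {
    float samples[N] = {0};
    float sum = 0;
    int   idx = 0;
public:
    float add(float raw) {
        sum += raw - samples[idx];   // drop the oldest sample, add the newest
        samples[idx] = raw;
        idx = (idx + 1) % N;
        return sum / N;              // smoothed value
    }
};
```

Each IR channel would get its own filter instance, called once per control-loop iteration before computing the line error.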

What's next for R2Detour

  • Obstacle Avoidance: Adding an ultrasonic sensor so R2Detour can stop or navigate around physical objects (Imperial Stormtroopers).
  • Machine Learning: We want to upgrade to a Raspberry Pi Pico to implement TinyML, allowing the bot to recognize traffic signs or colour codes on the track.
  • Holograms: Okay, maybe just an OLED display to show sassy error messages.
