Inspiration
FPV experiences are becoming increasingly popular — they let you see through the eyes of a remote machine and interact with the real world from anywhere. But whether it's drones or RC vehicles, the barrier to entry remains high. Setups cost hundreds of pounds, require technical expertise to build and operate, and often can't be used in everyday spaces like a university campus or a living room.
We asked ourselves: what if we could take that same immersive, first-person experience and make it affordable, accessible, and genuinely fun — something you can pick up and play with friends, no expertise required?
That's how ROBO was born.
What it does
ROBO is a physical FPV combat experience that blends real-world driving with on-screen gameplay. The system has two parts: a gun controller you hold, and a tank that drives around in the real world.
The gun controller uses a built-in gyroscope to track your hand movements. Rotate the gun left or right, tilt it up or down — and a crosshair on your screen moves to match. You aim at targets on-screen by physically moving the controller, just like pointing a real weapon. A joystick on the controller drives the tank forward, backward, and steers it left and right. A second joystick acts as the trigger — pull it to shoot.
The tank is your eyes on the ground. A phone mounted on it streams live video to your laptop, giving you a first-person view of the environment. You drive the tank to reposition yourself in the real world, then aim and shoot at targets using the gyro-controlled crosshair overlaid on the video feed.
The result feels like playing a first-person shooter — except the world is real, the vehicle is real, and your controller is a physical gun you aim with your hands.
Every session also generates synchronized training data — camera frames paired with the operator's control inputs — creating a ready-made dataset for imitation learning. The system captures how humans perceive their environment, plan movements, and make split-second aiming decisions in real-world spaces, turning every player into a teacher for future AI models.
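One way such synchronized (observation, action) pairs could be logged is as JSON lines, one record per camera frame. This is an illustrative sketch only; the field names and file layout are assumptions, not the project's actual schema.

```python
import json
import time

def log_record(frame_id, yaw, tilt, throttle, steer, fire, log_file=None):
    """Build one synchronized (observation, action) training record.

    The camera frame is referenced by path; the operator's control inputs
    at the moment the frame was captured form the action label.
    """
    record = {
        "t": time.time(),                        # wall-clock timestamp for alignment
        "frame": f"frames/{frame_id:06d}.jpg",   # path of the saved camera frame
        "action": {
            "yaw": yaw, "tilt": tilt,            # crosshair aiming angles (deg)
            "throttle": throttle, "steer": steer,  # drive joystick, each in [-1, 1]
            "fire": fire,                        # trigger state, 0 or 1
        },
    }
    if log_file is not None:
        log_file.write(json.dumps(record) + "\n")
    return record
```

A JSON-lines log like this can be replayed frame-by-frame for imitation learning without any custom parsing.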
How we built it
The system is split across three nodes that communicate in real time:
Gun Controller (Pico + BMI160 IMU + 2 Joysticks)
The operator holds a physical gun with a gyroscope/accelerometer sensor that tracks yaw (left/right rotation) and roll (up/down tilt) using a complementary filter. These angles control the crosshair position on the laptop screen. One joystick handles tank driving — forward, backward, and steering. The second joystick acts as the fire trigger. The Pico reads all inputs at 50Hz, packages them into a JSON data packet, and transmits wirelessly via the nRF24L01+ transceiver.
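The core of a complementary filter is a single weighted update: integrate the gyro rate for short-term responsiveness, and blend in the accelerometer's tilt estimate to correct long-term drift. A minimal sketch of the math (the `ALPHA` value is an assumption, tuned in practice, not the firmware's actual constant):

```python
import math

ALPHA = 0.98      # weight on the gyro path; assumed value, tuned empirically
DT = 1.0 / 50.0   # 50 Hz sample rate, matching the Pico's read loop

def complementary_filter(angle, gyro_rate, accel_angle, alpha=ALPHA, dt=DT):
    """Fuse a gyro rate (deg/s) with an accelerometer-derived angle (deg).

    The gyro term tracks fast motion; the small accelerometer term slowly
    pulls the estimate back toward gravity, cancelling gyro drift.
    """
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

def accel_tilt(ax, az):
    """Tilt angle (deg) from two accelerometer axes while roughly static."""
    return math.degrees(math.atan2(ax, az))

# Simulate holding the gun still at a 10-degree tilt with a biased gyro
# (0.5 deg/s of drift): the filter converges near the true angle anyway.
angle = 0.0
for _ in range(500):  # 10 seconds at 50 Hz
    angle = complementary_filter(angle, gyro_rate=0.5, accel_angle=10.0)
```

Pure-gyro integration of that same 0.5 deg/s bias would have drifted 5 degrees over the 10-second run; the accelerometer term holds the estimate near 10 degrees instead.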
Tank (Pico + L298N + DC Motors)
The second Pico receives wireless commands and drives two DC motors through an L298N H-bridge for skid steering. The tank moves through the real environment based on the joystick input from the gun controller. A phone mounted on the tank streams live video over WiFi to the laptop, providing the FPV feed.
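Skid steering boils down to mixing one throttle/steer pair into two track speeds: steering adds to one side and subtracts from the other. A minimal sketch under assumed joystick and PWM ranges (not the tank's actual firmware):

```python
def skid_steer_mix(throttle, steer):
    """Map joystick throttle/steer in [-1, 1] to left/right track duty in [-1, 1].

    Positive duty means forward. Sums are clamped so that full throttle plus
    full steer saturates a track instead of overflowing the PWM range.
    """
    left = throttle + steer
    right = throttle - steer
    clamp = lambda v: max(-1.0, min(1.0, v))
    return clamp(left), clamp(right)

def duty_u16(duty):
    """Signed duty -> 16-bit magnitude, the range MicroPython's PWM expects;
    the sign would select the L298N direction pins separately."""
    return int(abs(duty) * 65535)
```

With this mixing, pushing the drive stick sideways with no throttle makes the tracks counter-rotate, spinning the tank in place: `skid_steer_mix(0.0, 1.0)` gives `(1.0, -1.0)`.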
Laptop (Python + PyGame + PyOpenGL)
The laptop ties everything together. It receives the live video feed from the phone on the tank (via DroidCam over WiFi), overlays the gyro-controlled crosshair for aiming, and handles the game logic — target detection, hit registration, and scoring. It also hosts a 3D visualiser built with PyGame and PyOpenGL for debugging the gun controller orientation in real time. The laptop communicates with the gun controller Pico over USB serial.
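Mapping the controller's angles onto the overlay is a linear scale from an angular window to pixels. A sketch under assumed screen dimensions and aiming ranges (the 45/30-degree half-ranges are illustrative, not the project's tuned values):

```python
def angles_to_crosshair(yaw, tilt, width=1280, height=720,
                        yaw_range=45.0, tilt_range=30.0):
    """Map controller angles (deg) to crosshair pixel coordinates.

    yaw_range / tilt_range are half-ranges: aiming 45 degrees left puts the
    crosshair at the left screen edge. Output is clamped to stay on-screen,
    and tilt is inverted because screen y grows downward.
    """
    x = (0.5 + yaw / (2 * yaw_range)) * width
    y = (0.5 - tilt / (2 * tilt_range)) * height
    clamp = lambda v, hi: max(0.0, min(hi, v))
    return clamp(x, width), clamp(y, height)
```

The clamping means overshooting the aiming window pins the crosshair to the screen edge rather than losing it, which keeps aiming predictable during fast swings.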
The wireless link between the two Picos uses nRF24L01+ transceivers over SPI, with the gun controller Pico also connected to the laptop via USB serial for the game overlay and visualiser.
Challenges we ran into
Yaw drift: The BMI160 doesn't have a magnetometer, so yaw tracking relies on gyroscope integration which drifts over time. We implemented a dead zone filter and calibration routine, but long sessions still accumulate error. A re-center button on the controller was our practical workaround.
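The dead-zone idea is simple: gyro readings below the noise floor are zeroed before integration, so a stationary controller accumulates no yaw error. A sketch with an assumed threshold (the real value would be calibrated per sensor):

```python
def dead_zone(rate, threshold=0.3):
    """Zero gyro readings (deg/s) below the noise floor so that sensor
    noise on a stationary controller does not integrate into yaw drift.
    The 0.3 deg/s threshold is an assumed, calibration-dependent value."""
    return 0.0 if abs(rate) < threshold else rate

def integrate_yaw(samples, dt=1.0 / 50.0, threshold=0.3):
    """Integrate filtered gyro rates at 50 Hz into a yaw angle (deg)."""
    yaw = 0.0
    for rate in samples:
        yaw += dead_zone(rate, threshold) * dt
    return yaw

# 10 seconds of 0.2 deg/s sensor noise while the gun sits still:
noise = [0.2] * 500
filtered_drift = integrate_yaw(noise)              # 0.0 -- noise suppressed
raw_drift = sum(r * (1.0 / 50.0) for r in noise)   # ~2.0 deg of drift without it
```

The trade-off is that genuine rotations slower than the threshold are also lost, which is why the filter only mitigates drift during long sessions and a re-center button is still needed.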
Wireless reliability: Getting the nRF24L01+ link stable between two Picos took more debugging than expected. Packet loss during the demo was our biggest fear, so we kept the data packet small and the transmission rate at 50Hz with error handling for dropped frames.
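One common way to detect dropped frames on a lossy link like this is a wrapping sequence counter in each packet; the receiver compares consecutive sequence numbers to count what went missing. A sketch of that bookkeeping (an assumed approach, not necessarily the error handling the firmware used):

```python
def count_drops(received_seqs, modulus=256):
    """Count packets lost between consecutively received sequence numbers.

    Each packet carries an incrementing counter modulo `modulus` (an 8-bit
    counter is assumed here); the gap between two received values, modulo
    the wrap, is the number of packets dropped in between.
    """
    drops = 0
    for prev, cur in zip(received_seqs, received_seqs[1:]):
        drops += (cur - prev - 1) % modulus
    return drops
```

On the receiving side, a nonzero gap can trigger a fail-safe such as holding the last good command or stopping the motors if too many frames in a row go missing.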
Motor driver compatibility: We originally planned to use an HW-130 (L293D shield with shift register), but it was designed for Arduino headers. We switched to the HW-095 (L298N module) which connects directly to the Pico via GPIO — a much cleaner solution that only needs 6 wires.
Power management: Running motors, radio, and the Pico from a single battery pack required careful voltage regulation and keeping the motor power isolated from the logic power to prevent brownouts.
Accomplishments that we're proud of
- Built a complete FPV game system with a custom physical controller from scratch in 24 hours
- Achieved smooth, responsive gyro-to-crosshair mapping that feels natural and intuitive to aim with
- Reliable wireless communication between two Picos at 50Hz with minimal packet loss
- Seamless integration of real-world driving with on-screen aiming — blending physical and digital gameplay
- Won first place
What we learned
- Prove the wireless link first — everything depends on it, and it's the hardest thing to debug under time pressure
- Keep the first prototype small enough to demo the core loop, then add features
- Direct GPIO motor drivers (L298N) are far simpler than shift-register-based shields when working with non-Arduino microcontrollers
- Complementary filters are surprisingly effective for IMU fusion on resource-constrained hardware
- A clear problem statement matters as much as the technical build
What's next for ROBO
- Physical turret — add pan/tilt servos to the tank so the camera follows where you aim, making the FPV view fully responsive to the gun controller
- Autonomous mode — dual-brain obstacle avoidance using an ultrasonic sensor for close-range reflexes and computer vision for strategic path planning
- VR headset integration — replace the laptop screen with FPV goggles for full immersive telepresence
- Multi-tank arena — multiple tanks in the same space, each controlled by a different player, competing in real-time combat
- Imitation learning pipeline — train a neural network on the collected human demonstration data to create a fully autonomous opponent that plays like a human
- Miniaturisation — shrink the entire system into a consumer-ready kit that costs under £50