Blast-OIS: The AI-Powered Fire Sentinel
Inspiration
We are living at the intersection of digital perception and physical reality. While AI has become incredible at "seeing" the world, it is often a passive observer. In a fire, every second of passivity costs lives. We wanted to move beyond mere detection; we wanted to create Embodied Intelligence that directly augments human capability by serving as the first line of defense.
Firefighting is one of the most dangerous professions on Earth. Every year, thousands of first responders face extreme heat, toxic fumes, and collapsing structures. We saw an opportunity to bridge this gap by building a robot that doesn't just call for help, but actually stays in the room to fight the flames when it's too dangerous for a human to do so.
What it does
Blast-OIS (Object Intervention System) is an autonomous firefighting turret. It transforms a standard webcam and a water pump into an intelligent responder.
The system operates in a constant loop:
- Detection: A custom YOLO model identifies fire on the edge device with millisecond-scale latency.
- Tracking: It calculates the fire's position in 3D space and maps it to physical motor coordinates.
- Actuation: A high-torque servo steers the nozzle, and a water pump engages once a "lock-on" is confirmed.
By acting as an always-on "co-pilot" for safety, it can suppress small blazes (like a forgotten candle) before they escalate into a full-scale structural fire, potentially saving billions in property damage and preventing tragedies.
Social Impact: Protecting Those Who Protect Us
The core value of Blast-OIS isn't just technology—it's safety.
- Risk Mitigation: Robots don't breathe in toxic smoke or suffer from heat exhaustion. By deploying automated sentinels, we can keep human firefighters out of "Flashover" zones where survival is measured in seconds.
- Community Resilience: In remote or under-resourced communities, a fast-acting AI turret can provide the critical "first strike" against a fire while human responders are still in transit.
- The Courage of Silicon: Robots are willing to take risks humans shouldn't have to. They can remain inside a burning chemical plant or an unstable building, providing continuous suppression and real-time data back to the human commanders outside.
How we built it
Blast-OIS is a distributed system consisting of three main components:
1. The Brain (NVIDIA Jetson Orin Nano)
We built a custom dataset of candle flames, labeled it in Roboflow, and trained a YOLO model. On the Jetson, we used a Python script to process the video feed. To translate the fire's x-pixel coordinate into a physical servo angle, we used linear interpolation.
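A minimal sketch of that mapping (the function name, frame width, and servo range here are our illustrative assumptions, not the team's exact calibration values):

```python
def pixel_to_degrees(x_pixel, frame_width=640, servo_min=0, servo_max=180):
    """Linearly interpolate a bounding-box centre's x coordinate
    into a servo angle spanning the camera's field of view."""
    # Clamp so an out-of-frame detection never commands an invalid angle.
    x_pixel = max(0, min(x_pixel, frame_width))
    fraction = x_pixel / frame_width
    return servo_min + fraction * (servo_max - servo_min)
```

With a 640-pixel-wide frame, a flame detected dead centre (x = 320) maps to 90 degrees, pointing the nozzle straight ahead.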
2. The Bridge (Tailscale VPN & UDP)
Because we wanted a wireless system, the Jetson communicates with a MacBook (our hardware hub) over a Tailscale private network using a low-latency UDP protocol. This allowed us to send serialized commands (like D:90:P:1) instantly across the building.
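The wire format above (D:90:P:1) can be serialised and fired over UDP in a few lines; the Tailscale IP and port below are hypothetical placeholders, since Tailscale presents the remote machine as an ordinary private address:

```python
import socket

MAC_TAILSCALE_IP = "100.64.0.2"   # hypothetical Tailscale address of the MacBook
COMMAND_PORT = 5005               # hypothetical UDP port

def encode_command(degrees, pump_on):
    """Serialise a turret command as b'D:<angle>:P:<0|1>'."""
    return f"D:{int(degrees)}:P:{1 if pump_on else 0}".encode("ascii")

def send_command(sock, degrees, pump_on):
    # UDP is connectionless, so each frame's command is a single datagram.
    sock.sendto(encode_command(degrees, pump_on), (MAC_TAILSCALE_IP, COMMAND_PORT))

# Usage: sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#        send_command(sock, 90, True)
```

UDP suits this loop because a dropped command is immediately superseded by the next frame's command, so retransmission (as TCP would do) only adds latency.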
3. The Muscle (Arduino Uno & Servo)
The MacBook relays the commands via Serial to an Arduino Uno. The Arduino manages a high-pressure water pump and a servo motor. The nozzle is mounted directly on the servo, which sits on top of the camera, creating a perfectly aligned "point-and-shoot" mechanism.
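On the receiving side, the relay has to validate each datagram before forwarding it to the Arduino. A sketch of that parsing step (the command grammar is taken from the D:90:P:1 example above; the range check and error handling are our assumptions):

```python
import re

def parse_command(payload: bytes):
    """Validate a b'D:<angle>:P:<0|1>' payload; return (angle, pump_on)."""
    match = re.fullmatch(rb"D:(\d{1,3}):P:([01])", payload)
    if match is None:
        raise ValueError(f"malformed command: {payload!r}")
    angle = int(match.group(1))
    if not 0 <= angle <= 180:
        raise ValueError(f"servo angle out of range: {angle}")
    return angle, match.group(2) == b"1"

# The validated payload would then be written to the Arduino over USB,
# e.g. with the pyserial library's Serial(...).write(payload + b"\n").
```

Validating on the Mac rather than the Arduino keeps the microcontroller's loop simple: it only ever sees well-formed commands.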
| Component | Role | Technology Used |
|---|---|---|
| Vision | Object Detection | YOLO, OpenCV, Roboflow |
| Inference | Edge Computing | NVIDIA Jetson Orin Nano |
| Network | Remote Comms | Tailscale VPN, Python Sockets |
| Actuation | Physical Response | Arduino, C++, Servo, Water Pump |
Challenges we ran into
The "Jitter" Problem: Raw AI coordinates are often "noisy." If the bounding box flickered, the servo would vibrate. We solved this with a Low-Pass Filter for smoothing:
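One common way to implement such a filter is an exponential moving average; this sketch assumes that approach (the class name and alpha value are ours, not necessarily the team's exact implementation):

```python
class LowPassFilter:
    """Exponential moving average: the servo tracks the trend of the
    bounding-box centre instead of its frame-to-frame jitter."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha   # smaller alpha = heavier smoothing, more lag
        self.value = None

    def update(self, raw):
        if self.value is None:
            self.value = raw  # seed with the first measurement
        else:
            self.value = self.alpha * raw + (1 - self.alpha) * self.value
        return self.value
```

Tuning alpha is a trade-off: too low and the turret lags a moving flame; too high and the jitter comes back.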
Calibration: Mapping a 2D digital image to a 3D physical space required meticulous measurement. We had to determine the exact degree offsets to ensure the water stream hit the center of the flame, not just the general area.
The "Grace Period": We realized that turning the pump off the instant the fire disappeared wasn't effective—embers could reignite. We engineered a "Grace Counter" to keep spraying for 15 frames after the fire was gone to ensure complete suppression.
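The grace counter described above can be sketched as a tiny state machine (class and method names are our illustrative choices; the 15-frame window is from the writeup):

```python
class PumpController:
    """Keeps the pump engaged for a grace period after the last
    detection so lingering embers are fully doused."""

    def __init__(self, grace_frames=15):
        self.grace_frames = grace_frames
        self.counter = 0

    def update(self, fire_detected: bool) -> bool:
        """Call once per frame; returns True while the pump should run."""
        if fire_detected:
            self.counter = self.grace_frames  # re-arm the grace window
            return True
        if self.counter > 0:
            self.counter -= 1                 # count down smoke-free frames
            return True
        return False
```

Because every positive detection re-arms the window, a flickering flame that drops out for a few frames never interrupts the spray.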
Accomplishments that we're proud of
We are incredibly proud of successfully building a real-time, multi-platform control loop. Getting a Jetson to "talk" to a MacBook via a VPN, which then "talks" to an Arduino to move a physical object, all within milliseconds, was a massive networking win. Seeing the turret "lock on" and douse a flame for the first time was an unforgettable moment.
What we learned
The biggest lesson was that Embodied AI is about more than just models. You can have the best YOLO model in the world, but if your networking latency is high or your servo smoothing is poor, the system fails. We learned that the "Network is the bottleneck"—optimizing the data pipeline between the Jetson and the Arduino was just as important as the AI training itself.
What's next for Blast-OIS
- Thermal Fusion: Adding an IR camera to see through smoke, which is the #1 challenge for human firefighters.
- Swarm Defense: Creating multiple turrets that can "hand off" a target to one another as it moves through a building.
- Fire-Class Awareness: Training the AI to recognize the difference between a grease fire (don't use water!) and a wood fire, allowing the robot to choose the correct extinguishing agent.
Built With
- arduino
- camera
- cv
- jetson-nano
- ml
- python
- relay
- servo