Enhancing Safety, Communication, and Navigation for Firefighters
Fire Guard is a smart firefighter helmet that fuses LiDAR, radar, IMU, RGB camera, and microphone data with AI-powered perception and mapping. It generates real-time SLAM and semantic maps, penetrates smoke where cameras fail, and transcribes speech to text for clear communication.
- SLAM Mapping → LiDAR + radar fusion for navigation in smoke-filled environments
- Semantic Mapping → Object detection (YOLOv5) + LLM (DistilGPT-2) to classify rooms
- Communication → Whisper for real-time speech-to-text transcription
- Sensor Fusion → LiDAR (geometry), Radar (penetrative sensing), IMU (localization), Camera (CV)
- HUD Display → Real-time awareness inside the helmet
One of our teammates, a former volunteer firefighter, experienced firsthand the challenges of limited visibility, disorientation, and noisy radio communication during real emergencies. Fire Guard was inspired by the idea that better technology can save lives.
Sensors
- LiDAR (Slamtec C1) → 2D mapping
- Radar (24 GHz FMCW) → depth through smoke
- IMU (BNO055) → drift correction & localization
- RGB Camera → YOLOv5 object detection
- Microphone → Whisper speech-to-text
AI Components
- YOLOv5 → Object detection
- DistilGPT-2 → Room labeling via semantic mapping
- Whisper → Speech-to-text
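The handoff between YOLOv5 and the language model can be sketched as a prompt built from detected object labels. The template and the keyword fallback below are hypothetical stand-ins so the sketch runs without model weights; the actual system generates the label with DistilGPT-2.

```python
def room_prompt(labels):
    """Turn YOLOv5 class labels into a completion prompt for the LLM."""
    return ("Objects detected: " + ", ".join(sorted(set(labels))) +
            ". This room is most likely a")

# Tiny rule-based stand-in for the DistilGPT-2 completion step,
# so this sketch runs without downloading model weights.
_HINTS = {"oven": "kitchen", "refrigerator": "kitchen",
          "bed": "bedroom", "toilet": "bathroom", "couch": "living room"}

def label_room(labels):
    """Map detected objects to a room label (illustrative heuristic)."""
    for obj in labels:
        if obj in _HINTS:
            return _HINTS[obj]
    return "unknown room"
```

The key design point is that the detector's class names, not raw pixels, cross the boundary into the language model, which keeps the semantic-mapping step lightweight.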
Networking
- Custom ad-hoc router system over 802.11 Wi-Fi
- Multi-device data transfer via sockets and ports
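The multi-device transfer above can be sketched as length-prefixed message framing over a socket. This is a minimal hypothetical version; the actual wire format and ports used on the ad-hoc network are not documented here.

```python
import json
import socket
import struct

def send_msg(sock: socket.socket, obj: dict) -> None:
    """Serialize a dict and send it with a 4-byte big-endian length prefix."""
    payload = json.dumps(obj).encode("utf-8")
    sock.sendall(struct.pack("!I", len(payload)) + payload)

def recv_msg(sock: socket.socket) -> dict:
    """Read exactly one length-prefixed message from the socket."""
    (length,) = struct.unpack("!I", _recv_exact(sock, 4))
    return json.loads(_recv_exact(sock, length))

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-message")
        buf += chunk
    return buf
```

Length-prefixing keeps map updates and transcriptions from interleaving when several helmet processes share one TCP stream.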
We used the Slamtec C1 LiDAR as the core of our SLAM system:
- 360° scanning range for building accurate 2D maps
- High sampling rate for smooth point cloud generation
- Compact + lightweight design, making it helmet-friendly
- Works with the BNO055 IMU to correct drift and improve localization
- Provides precise obstacle detection for navigation and mapping
Key Features:
- Builds detailed geometric maps of walls and obstacles
- Serves as the base layer for our fused SLAM pipeline (LiDAR + radar)
- Essential for generating the semantic room maps used for communication
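As a rough illustration of that geometric base layer, a single 2D scan can be rasterized into an occupancy grid. This is a simplified sketch: grid size and resolution are illustrative, and the real pipeline also fuses radar returns and IMU pose rather than assuming a fixed sensor at the grid center.

```python
import numpy as np

def scan_to_grid(ranges_m, angles_rad, size=200, res_m=0.05):
    """Rasterize one 2D LiDAR scan into a square occupancy grid.

    ranges_m, angles_rad: per-beam range (meters) and bearing (radians).
    size: grid side length in cells; res_m: meters per cell.
    The sensor sits at the grid center; cells hit by a beam are marked 1.
    """
    grid = np.zeros((size, size), dtype=np.uint8)
    r = np.asarray(ranges_m, dtype=float)
    a = np.asarray(angles_rad, dtype=float)
    xs = (size // 2 + np.cos(a) * r / res_m).astype(int)
    ys = (size // 2 + np.sin(a) * r / res_m).astype(int)
    ok = (xs >= 0) & (xs < size) & (ys >= 0) & (ys < size) & (r > 0)
    grid[ys[ok], xs[ok]] = 1
    return grid
```

Accumulating grids like this across poses (with drift corrected by the IMU) is what produces the persistent wall-and-obstacle map.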
We integrated a commercial-grade 24 GHz FMCW radar module to enhance perception in low-visibility environments:
- ADF5901 – 24 GHz transmitter (1 TX)
- ADF5904 – 24 GHz receiver (4 RX) enabling MIMO radar
- ADAR7251 – 4-channel, 16-bit ADC digitizing radar returns
- ADF4149 – PLL + VCO generating precise chirps
- ADSP-BF700/BF701 – Blackfin DSP for real-time processing
Key Features:
- Provides depth perception through dense smoke
- Supports range + velocity detection
- Feeds radar heatmaps into our LiDAR + IMU SLAM pipeline
- Required studying 300+ pages of documentation to configure and operate
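The range-processing step behind those heatmaps can be sketched with a synthetic dechirped signal: in FMCW radar, an FFT of the beat signal produces a peak whose frequency is proportional to target range. The chirp parameters below are illustrative, not the module's actual ADF4149/ADF5901 configuration.

```python
import numpy as np

# Illustrative chirp parameters (not the real module configuration)
c = 3.0e8        # speed of light (m/s)
B = 250e6        # sweep bandwidth (Hz)
T = 1e-3         # chirp duration (s)
fs = 1e6         # ADC sample rate (Hz)
S = B / T        # sweep slope (Hz/s)

# Simulate the dechirped (beat) signal for a single target at 6 m
R_true = 6.0
f_beat = 2 * R_true * S / c          # beat frequency grows with range
t = np.arange(int(fs * T)) / fs
beat = np.cos(2 * np.pi * f_beat * t)

# Range FFT: the peak bin maps back to target range
spectrum = np.abs(np.fft.rfft(beat * np.hanning(beat.size)))
peak_bin = int(np.argmax(spectrum))
R_est = peak_bin * (fs / beat.size) * c / (2 * S)
```

Repeating this per receive channel (the ADF5904's 4 RX) and per chirp is what yields the range and velocity heatmaps fed into the SLAM pipeline.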
The BNO055 IMU provided orientation and motion tracking for localization:
- 9-axis sensor fusion combining accelerometer, gyroscope, and magnetometer
- Onboard sensor fusion processor for drift-free orientation
- Provides real-time motion data to correct LiDAR drift
- Lightweight and designed for embedded systems
- Critical for stabilizing the mapping pipeline and ensuring accurate firefighter tracking
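One common way to use an IMU for drift correction is a complementary filter: integrate the gyro for smooth short-term heading, and continuously blend in an absolute reference such as a LiDAR scan-match yaw. This is a hypothetical sketch; the fusion actually used in Fire Guard is not specified here.

```python
def fuse_yaw(yaw, gyro_z, dt, yaw_ref, alpha=0.98):
    """Blend integrated gyro heading with an absolute heading reference.

    yaw:     current heading estimate (rad)
    gyro_z:  angular rate about the vertical axis (rad/s)
    yaw_ref: slower absolute heading, e.g. from LiDAR scan matching
    alpha:   trust in the gyro over one step (0..1)
    """
    yaw_gyro = yaw + gyro_z * dt                       # fast, but drifts
    return alpha * yaw_gyro + (1.0 - alpha) * yaw_ref  # pulled toward ref

# Example: a gyro with a 0.01 rad/s bias (true rotation is zero) would
# drift 0.1 rad over 10 s alone; the reference keeps the error bounded.
yaw = 0.0
for _ in range(1000):
    yaw = fuse_yaw(yaw, gyro_z=0.01, dt=0.01, yaw_ref=0.0)
```

The BNO055's onboard fusion already handles the accelerometer/gyro/magnetometer blend; a step like this closes the loop between the IMU and the LiDAR map.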
Lessons Learned
- Multi-sensor fusion is powerful but challenging
- Radar requires significant documentation and setup to configure properly
- Lightweight LLMs can improve communication by labeling rooms
- Real-time pipelines under hardware constraints push you to think creatively
Challenges
- Radar setup: only worked on Windows due to driver limitations
- Embedded constraints: Jetson TX2 Nano lacked proper support
- Real-time fusion: combining multiple data streams under time pressure
- Speech transcription in noisy environments
What's Next
- Port to a newer Jetson Nano for tighter integration
- Optimize networking for real-time field deployment
- Extend semantic mapping with more attributes (e.g., color, material)
- Improve HUD for clearer real-time awareness
Team
- Bao Doung
- Jacob Diep
- Jeremy Ky
- John Rankin