Project Phoenix 🔥

Emergency Beacon System for Disaster Relief

EquiTech Submission


Inspiration

Natural disasters don't discriminate, but their impact does. When Hurricane Maria devastated Puerto Rico in 2017, cellular networks went dark for months, leaving first responders unable to locate survivors trapped in collapsed buildings. When earthquakes strike, the elderly and disabled are disproportionately affected—unable to call for help when infrastructure fails. Equal access to emergency services shouldn't depend on working cell towers.

Project Phoenix was inspired by a simple but powerful question: What if your phone could still call for help when everything else fails?

We envisioned a system where:

  • A person trapped under rubble could broadcast their location without cell service
  • Rescue teams could locate survivors using just Bluetooth—technology present in every smartphone
  • The most vulnerable populations (elderly, disabled, those without smartphones) could still be found
  • Language barriers disappear—location data is universal

This is technology addressing inequality at its most fundamental level: equal access to life-saving rescue operations.


What it does

Project Phoenix is a dual-app Bluetooth Low Energy (BLE) beacon system that creates a mesh network for emergency location tracking when traditional infrastructure fails.

Emitter App (For Survivors/Victims)

The emitter transforms any smartphone into an emergency beacon that broadcasts:

  • Precise GPS coordinates (latitude, longitude, altitude)
  • Real-time sensor data (motion, fall detection, battery level)
  • Emergency flags (SOS activation, unstable environment detection)
  • Adaptive power management to preserve battery based on conditions

Binary Protocol: The beacon encodes sensor data into a compact 20-byte binary packet with the following structure:

Field Bytes Type Description
Device ID 4 uint32 Unique identifier
Latitude 4 float32 GPS latitude
Longitude 4 float32 GPS longitude
Altitude MSL 2 int16 Altitude above sea level (meters)
Relative Altitude 2 int16 Relative altitude (centimeters)
Battery 1 uint8 Battery percentage (0-100)
Timestamp 1 uint8 Seconds mod 256
Flags 1 uint8 Emergency state bitfield
Reserved 1 uint8 Reserved for future use

Receiver App (For Rescue Teams)

The receiver app provides an Apple AirTag-style precision finding interface that:

  1. Scans for nearby beacons using BLE with optimized settings:

    • Scan mode: LOW_LATENCY (fastest detection)
    • Match mode: AGGRESSIVE (maximum sensitivity)
    • Report delay: 0ms (immediate updates)
  2. Calculates real-time distance using RSSI-based path loss formula:

    distance = 10^((P₀ - RSSI) / (10 × n))
    
    • P₀ = measured power at 1 meter (-59 dBm)
    • RSSI = received signal strength
    • n = path loss exponent (2.0 for free space)
  3. Smooths RSSI readings using a weighted moving average with outlier rejection:

    • 10-sample moving average
    • IQR (Interquartile Range) outlier rejection
    • Linear weighting favoring recent values (wᵢ = i)
  4. Computes bearing to target using GPS coordinates and spherical trigonometry

  5. GPS fallback mode: When the BLE signal is lost (>3 seconds), switches to GPS-based distance using the Haversine formula (Earth's radius: 6371 km)


How we built it

Architecture

We built a monorepo containing two React Native apps with shared packages:

phoenix/
├── apps/
│   ├── emitter/          # Beacon transmitter
│   └── receiver/         # Beacon scanner with precision finding
├── packages/
│   ├── beacon-protocol/  # Binary encoding/decoding (shared)
│   ├── ui/              # Shared components
│   └── utils/           # Constants and helpers

Technology Stack

Frontend:

  • React Native 0.81 with Expo SDK 54
  • TypeScript for type safety
  • Custom UI with dark theme and glassmorphism design

Native Modules (Cross-Platform): We wrote custom native modules to access low-level Bluetooth and sensors:

Android (Kotlin):

  • BLEPeripheralManager.kt: BLE advertising with manufacturer data
  • BLEBeaconScanner.kt: BLE scanning with optimized settings
  • SensorDataModule.kt: GPS, accelerometer, gyroscope, altimeter, battery
  • NativeLogger.kt: Real-time logging from native → JavaScript

iOS (Swift + Objective-C):

  • BLEPeripheralManager.swift: Core Bluetooth peripheral mode
  • BLEBeaconScanner.swift: Core Bluetooth central mode
  • SensorDataModule.swift: CoreLocation, CoreMotion, CoreBluetooth integration
  • NativeLogger.swift: Native event bridge

Binary Protocol Design

We designed a custom 20-byte binary protocol for maximum efficiency over BLE:

Encoding (encoder.ts):

const buffer = new ArrayBuffer(20);
const view = new DataView(buffer);

// Device ID (4 bytes)
view.setUint32(0, deviceId, true);

// GPS coordinates (8 bytes total - float32)
view.setFloat32(4, latitude, true);
view.setFloat32(8, longitude, true);

// Altitude (4 bytes - int16)
view.setInt16(12, altitudeMSL, true);
view.setInt16(14, relativeAltitude, true);

// Battery & timestamp (2 bytes)
view.setUint8(16, battery);
view.setUint8(17, timestamp % 256);

// Flags (1 byte - bitfield)
view.setUint8(18, flags);

Manufacturer Data Format: To ensure cross-platform compatibility:

  • iOS emitter uses Company ID 0x004C (Apple)
  • Android emitter uses Company ID 0x0075 (Samsung)
  • Both receivers accept either format
  • Magic number 0x5048 ("PH") identifies Phoenix beacons

Power Optimization

We implemented adaptive transmission intervals to maximize battery life:

Condition Transmission Interval Reason
SOS or fall detected 1 second Emergency - fastest updates
Battery < 10% 15 seconds Critical - preserve remaining power
Battery < 20% 10 seconds Low battery - conserve energy
Motion detected 3 seconds Active - more frequent updates
Normal operation 5 seconds Balanced - standard rate

UI/UX Design

Precision Finding Interface:

  • Circular compass with N/E/S/W markers
  • Directional arrow that rotates to point at the target
  • Real-time distance in feet/inches (imperial units)
  • Color-coded proximity levels:
    • Blue (far: >5m)
    • Orange (medium: 1.5-5m)
    • Light green (near: 0.5-1.5m)
    • Green (here: <0.5m)

Dark theme throughout:

  • Background: #000 (true black)
  • Cards: rgba(255, 255, 255, 0.05) (glassmorphism)
  • Text: #FFF primary, #999 secondary
  • Subtle borders and shadows for depth

Challenges we ran into

1. Cross-Platform BLE Protocol Incompatibility

Problem: iOS and Android handle BLE manufacturer data differently. iOS requires the company ID to be included in the data payload, whereas Android automatically adds it.

Solution: We wrote platform-specific encoding:

  • iOS: [CompanyID: 2] [Magic: 2] [Data: 20] = 24 bytes
  • Android: [Magic: 2] [Data: 20] = 22 bytes (CompanyID added by OS)

Both receivers now parse either format.

2. RSSI Instability

Problem: Raw RSSI values fluctuate wildly (±20 dBm), making distance calculations unusable.

Challenge encountered: The simple moving average wasn't enough. Outliers caused significant jumps.

Solution: Multi-layer smoothing:

  • Moving average (10 samples)
  • Statistical outlier rejection (IQR method)
  • Weighted averaging (recent values prioritized)
  • Result: Stable distance readings with <10cm jitter

3. Arrow Direction Accuracy

Problem: Arrow kept pointing north instead of toward the beacon.

Debug process:

  • Added detailed logging showing bearing calculations
  • Discovered we were setting rotation directly in styles while also trying to animate
  • The animation value wasn't being used!

Solution: Use Animated.Value with interpolation for smooth rotation:

transform: [{
  rotate: rotateAnim.interpolate({
    inputRange: [0, 360],
    outputRange: ['0deg', '360deg']
  })
}]

4. Precision Finding "Kick Out" Issue

Problem: Users were kicked back to the list when the BLE signal was temporarily lost.

Solution: GPS fallback mode—after 3 seconds without BLE, switch to GPS-based distance tracking. Users remain in precision finding mode, and the distance continues to update.

5. State Transition Jitter

Problem: Proximity levels (far/medium/near/here) flickered as the user approached the beacon.

Solution: Hysteresis with smart behavior:

  • Instant transition when getting closer (no delay)
  • 15cm hysteresis when moving away (prevents flickering)
  • State changes only when the distance difference exceeds the threshold

6. Haptic Feedback Complexity

Problem: Constant vibration was annoying; the lack of vibration made it hard to use eyes-free.

Solution: Progressive haptic feedback based on distance:

Distance Range Vibration Interval Description
< 0.5m (here) No vibration Target reached - stop
0.5m - 1.5m (near) 0.7s Fast pulses
1.5m - 3m (medium) Progressive (1-2.5s) Slows as distance increases
> 3m (far) Silent Too far for haptics

Vibration stops completely when within 0.5m, so rescuers know they've arrived.


Accomplishments that we're proud of

1. Infrastructure-Independent Emergency System

We built a system that works when nothing else does. No cell towers, no WiFi, no internet—just Bluetooth, which is on every phone. This levels the playing field for disaster victims regardless of location or resources.

2. Production-Ready Native Code

We wrote 5,713 lines of production-quality native code (Kotlin + Swift) with:

  • Robust error handling
  • Memory leak prevention
  • Proper lifecycle management
  • Real-time logging for debugging

3. Apple AirTag-Level UX

Our precision finding interface rivals Apple's proprietary U1 chip technology, but works on any Bluetooth-enabled device:

  • Smooth 250ms updates (4 fps)
  • Sub-meter accuracy in ideal conditions
  • Intuitive compass-based navigation
  • Progressive haptic feedback

4. Mathematical Rigor

Every algorithm is grounded in mathematics:

  • Path loss formula for RSSI-to-distance
  • Haversine formula for GPS fallback
  • IQR statistical outlier rejection
  • Weighted moving averages
  • Spherical trigonometry for bearing calculations

5. Cross-Platform Compatibility

  • iOS → iOS ✅
  • iOS → Android ✅
  • Android → iOS ✅
  • Android → Android ✅

Any Phoenix device can find any other Phoenix device.

6. Accessibility-First Design

  • Haptic feedback: Progressive vibration pulses guide visually impaired users (every 2s at 3m → every 0.3s at 0.5m)
  • GPS fallback: Continuous tracking even when the BLE signal is lost
  • Offline-first: Works completely without internet or cell service
  • Language-independent: Location data transcends language barriers
  • Low battery mode: Adaptive transmission intervals extend operation time
  • Universal compatibility: Works on any BLE-capable device (smartphones, smartwatches, fitness trackers)

What we learned

1. Bluetooth Low Energy is More Complex Than We Thought

We initially thought BLE would be straightforward, but learned:

  • Manufacturer data format differs between iOS and Android
  • RSSI values are unreliable without heavy filtering
  • Scan settings dramatically affect power consumption and detection speed
  • Background BLE requires careful permission handling

Key insight: The "low energy" in BLE isn't automatic—you need adaptive algorithms to truly preserve battery.

2. Signal Processing is Critical for Real-World Use

Raw sensor data is noisy. We learned:

  • Simple averaging isn't enough
  • Statistical methods (IQR) are essential
  • Weighted averages balance responsiveness and stability
  • Outliers can destroy user experience

Key insight: The math between sensor and screen is as important as the sensor itself.

3. Native Modules Bridge Two Worlds

Writing native modules taught us:

  • How React Native's bridge works
  • Memory management differences (ARC vs. manual)
  • Thread safety in async operations
  • Event emitters for native → JS communication

Key insight: Native code is unavoidable for cutting-edge features, but TypeScript provides safety at the boundary.

4. UX Testing Reveals Hidden Problems

Testing with phones side-by-side revealed:

  • Arrow pointing wrong direction (style vs. animation conflict)
  • Jittery state transitions (needed hysteresis)
  • Annoying constant vibration (needed progressive feedback)
  • Confusion when kicked out of precision finding (needed GPS fallback)

Key insight: Real-world testing catches what simulators miss.

5. Accessibility Requires Intentional Design

Equal access doesn't happen by accident. We learned:

  • Haptic feedback needs tuning (not just on/off)
  • Visual and tactile feedback should complement each other
  • Offline-first is essential for true accessibility
  • Simple UI serves everyone better

Key insight: Designing for edge cases (visually impaired, elderly, low-resource) makes the product better for everyone.

6. Documentation is Code

We learned that good documentation is as important as good code:

  • README for users
  • SETUP.md for developers
  • Inline comments for maintainability
  • Mathematical notation for algorithms

Key insight: If you can't explain it, you don't understand it well enough.


What's next for Project Phoenix

Short-Term Enhancements

1. Mesh Networking Currently, receivers find emitters directly. Next: emitters relay each other's signals, creating a mesh network that extends range exponentially (Range = BLE_range × number_of_hops).

2. Multi-Device Triangulation Use multiple receivers to triangulate the exact position using least-squares optimization, combining distance measurements from multiple receivers to pinpoint the exact location.

3. Machine Learning for RSSI Calibration Train models to improve distance accuracy based on:

  • Environment (indoor/outdoor)
  • Obstacles (walls, debris)
  • Device type (antenna differences)

4. Emergency Services Integration API for 911 dispatch systems to:

  • See all active Phoenix beacons on a map
  • Prioritize based on flags (SOS, fall, battery)
  • Coordinate multi-team search efforts

Medium-Term Vision

5. Smartwatch Support Extend to wearables (Apple Watch, Galaxy Watch) for:

  • Always-on broadcasting (longer than phone)
  • Fall detection with accelerometer
  • Heart rate monitoring in emergencies

6. Drone Integration Autonomous drones that:

  • Scan large areas for Phoenix beacons
  • Map survivor locations from the air
  • Drop supplies to confirmed positions

7. Building Information Modeling (BIM) Import building blueprints to:

  • Provide 3D navigation to survivors
  • Account for walls/floors in RSSI calculations
  • Suggest safe evacuation routes

Long-Term Impact

8. Open Standard Make Phoenix protocol an open standard for emergency beacons:

  • Work with FEMA, Red Cross, UN
  • Standardize manufacturer data format
  • Pre-install on smartphones (like emergency SOS)

9. Global Deployment Partner with relief organizations to deploy in:

  • Earthquake-prone regions (Japan, California, Nepal)
  • Hurricane zones (Caribbean, Gulf Coast)
  • Conflict areas (where infrastructure is deliberately destroyed)

10. IoT Ecosystem Extend beyond phones:

  • Medical alert devices
  • Child safety trackers
  • Elderly care pendants
  • Vehicle emergency beacons

The Ultimate Goal

Make rescue operations accessible to everyone, everywhere.

In 5 years, we envision:

  • Phoenix is pre-installed on every smartphone
  • International emergency frequency for Phoenix protocol
  • 10M+ saved lives in disasters
  • Equal access to rescue, regardless of infrastructure

Why This Matters for EquiTech

Project Phoenix directly addresses inequality in emergency response:

Access Inequality

  • Rich areas: Working cell towers, well-funded emergency services
  • Poor areas: Collapsed infrastructure, delayed response times
  • Phoenix: Equal access—Bluetooth works everywhere

Disability Inequality

  • Able-bodied: Can call 911, wave for help, navigate to safety
  • Disabled/elderly: May be immobile, non-verbal, or disoriented
  • Phoenix: Automatic broadcasting, fall detection, passive rescue

Technology Inequality

  • Latest phones: GPS, cellular, satellite SOS (iPhone 14+)
  • Older phones: Just Bluetooth
  • Phoenix: Works on any BLE device since 2011

Language Inequality

  • English speakers: Can communicate with rescuers
  • Non-English: Language barriers delay rescue
  • Phoenix: Location data is universal

Equal access to rescue isn't a luxury—it's a human right. Phoenix makes it reality.


Technical Appendix

Codebase Statistics

$ ./count-lines.sh

TypeScript           2,516 lines
TypeScript (JSX)     2,303 lines
JavaScript             746 lines
Kotlin               2,952 lines
Swift                2,473 lines
Objective-C            288 lines
JSON                   556 lines
Markdown               524 lines

Total Source Code:  11,278 lines

Repository Structure

phoenix/
├── apps/
│   ├── emitter/              # Beacon transmitter app
│   │   ├── native-modules/   # Kotlin & Swift modules
│   │   └── src/             # TypeScript app code
│   └── receiver/            # Beacon scanner app
│       ├── native-modules/  # Kotlin & Swift modules
│       └── src/
│           └── components/
│               └── PrecisionFindingView.tsx  # Main UI
├── packages/
│   ├── beacon-protocol/     # Binary encoding/decoding
│   ├── ui/                 # Shared components
│   └── utils/              # Constants
└── count-lines.sh          # LOC counter

Key Files

  • packages/beacon-protocol/src/encoder.ts - Binary packet encoding
  • packages/beacon-protocol/src/decoder.ts - Binary packet decoding
  • apps/receiver/src/components/PrecisionFindingView.tsx - Precision finding UI (511 lines)
  • apps/receiver/src/services/BeaconScanner.ts - RSSI smoothing & GPS fallback (433 lines)
  • apps/emitter/src/services/BeaconTransmitter.ts - Adaptive transmission (303 lines)

Dependencies

  • React Native 0.81
  • Expo SDK 54
  • TypeScript 5.9
  • expo-location (GPS)
  • react-native-safe-area-context (UI)

License

MIT License - Open source for humanitarian use


Project Phoenix: Because equal access to rescue is a human right, not a privilege.

Built with ❤️ for emergency response and disaster relief operations

Share this project:

Updates