Inspiration
Natural disasters kill over 60,000 people annually and cause $280 billion in economic losses worldwide. During the critical 72-hour rescue window after a flood, earthquake, or hurricane, the biggest killer isn't always the disaster itself — it's the chaos of coordination. Emergency managers are drowning in fragmented data across 6+ agencies while simultaneously trying to direct resources, and civilians on the ground have no reliable way to find the nearest open shelter, hospital, or food point in real time.

We were inspired by a simple but devastating observation: the information needed to save lives during a disaster already exists — river gauge readings, hospital capacity, shelter availability, rescue team locations — but it lives in silos that no human can process fast enough under pressure. We asked: what if AI could hold all of that simultaneously and act on it in seconds?

What it does

CrisisIQ is a dual-module AI-powered disaster response platform built for both emergency coordinators and civilians.

SafeReach (Civilian Module) — A mobile-first interface that guides affected civilians to the nearest open shelter, hospital, or food distribution point in real time. It detects the user's location, filters by need (shelter, medical, food), shows live capacity bars from real data, and generates AI-powered safe route guidance avoiding flooded or dangerous roads. Users can send an SOS with a single tap and receive an immediate AI-dispatched emergency response. It supports any location across the USA and works for people with limited connectivity via a simplified interface.

CrisisIQ Coordinator (Command Module) — A real-time dashboard for emergency managers that pulls live flood alerts from the National Weather Service API and river gauge readings from the USGS Water Services API for any location in the USA. It tracks hospital capacity, shelter availability, and active SOS events across all resources.
Coordinators can generate an AI-powered SITREP (Situation Report) in seconds that synthesizes live flood data, resource status, and alert severity into a professional briefing — the kind that normally takes a human team 45 minutes to produce. The two modules are connected — SOS events from civilians appear instantly as critical alerts in the coordinator dashboard, creating a live feedback loop between the people in danger and the people directing the response.
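The nearest-facility lookup in SafeReach described above boils down to a great-circle distance sort over open facilities. A minimal sketch, assuming plain `{lat, lon, type, isOpen}` facility objects (the function and field names are ours for illustration, not taken from the CrisisIQ source):

```javascript
// Haversine great-circle distance in miles between two lat/lon points.
function haversineMiles(lat1, lon1, lat2, lon2) {
  const R = 3958.8; // Earth radius in miles
  const toRad = (deg) => (deg * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Filter facilities by need (shelter / medical / food) and open status,
// then sort by distance to the user and keep the closest few.
function nearestFacilities(user, facilities, need, limit = 3) {
  return facilities
    .filter((f) => f.type === need && f.isOpen)
    .map((f) => ({
      ...f,
      distance: haversineMiles(user.lat, user.lon, f.lat, f.lon),
    }))
    .sort((a, b) => a.distance - b.distance)
    .slice(0, limit);
}
```

Sorting on a precomputed distance keeps the list stable as the map re-centers; only the user coordinates need to change to refresh the ranking.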
How we built it

We built CrisisIQ entirely in React with Vite as the build tool and no backend server. The tech stack is intentionally lean to maximize development speed during the hackathon.

Frontend — React with inline styles for full design control, organized into a three-panel layout: shelter list, interactive map, and AI response panel. All location data lives in a separate shelters.csv file loaded at runtime, so the data is completely decoupled from the component logic.

Map — Leaflet.js with react-leaflet renders a CARTO dark tile map with custom color-coded markers for hospitals, shelters, and food points. The map re-centers dynamically as users search locations or select shelters.

Geocoding — Nominatim (OpenStreetMap's free geocoding API) converts any typed address worldwide into coordinates with no API key required.

Live Flood Data — Two free government APIs with no authentication needed. The National Weather Service API (api.weather.gov) provides real-time flood alerts filtered to the user's state using coordinate-based zone detection. The USGS Water Services API provides river gauge readings within a 140-mile radius of the user's location, color-coded by flood stage severity.

AI Layer — Claude API (claude-sonnet-4-5) powers three distinct workflows: civilian route guidance with turn-by-turn directions and safety tips, emergency SOS dispatch with survival instructions, and coordinator SITREP generation that incorporates live NWS and USGS data into a professional situation report.
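To illustrate the two government data sources above, the request URLs can be assembled from user coordinates roughly like this. This is a sketch, not our exact code: the NWS active-alerts endpoint and the USGS `bBox`/`parameterCd` query parameters are as we understand the public docs, and the degrees-per-mile conversion is an approximation:

```javascript
// NWS active flood alerts for a U.S. state (two-letter code).
function nwsAlertsUrl(stateCode) {
  return `https://api.weather.gov/alerts/active?area=${stateCode}`;
}

// USGS Water Services instantaneous-values query for gauges inside a
// bounding box of roughly `radiusMiles` around a point.
// parameterCd 00065 = gage height (ft), 00060 = discharge (cfs).
function usgsGaugesUrl(lat, lon, radiusMiles = 140) {
  const dLat = radiusMiles / 69; // ~69 miles per degree of latitude
  const dLon = radiusMiles / (69 * Math.cos((lat * Math.PI) / 180));
  // bBox order is west,south,east,north
  const bBox = [lon - dLon, lat - dLat, lon + dLon, lat + dLat]
    .map((v) => v.toFixed(6))
    .join(",");
  return (
    "https://waterservices.usgs.gov/nwis/iv/?format=json" +
    `&bBox=${bBox}&parameterCd=00065,00060&siteStatus=active`
  );
}
```

Keeping these as pure URL builders makes them trivially unit-testable and keeps the `fetch` calls, error handling, and fallbacks in one place.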
Challenges we ran into

CORS and browser-side API calls — Calling the Anthropic API directly from the browser required the anthropic-dangerous-direct-browser-access header, which isn't obvious from the documentation. We spent time debugging silent failures before discovering this requirement.

Leaflet map re-centering — React-Leaflet's flyTo method was racing against state updates, so the map wasn't responding to location searches. We switched to map.setView() with manual dependency control in the useEffect to make it reliable.

CSV parsing edge cases — Boolean values (true/false) in the CSV were being parsed as strings, silently breaking distance calculations and route safety filters. We added explicit type coercion in the parser.

NWS API zone detection — The National Weather Service API doesn't accept raw coordinates for alerts — it requires a zone code. We solved this by chaining two API calls: first fetching the grid point for the user's coordinates to extract the state code, then fetching alerts for that state. We added a national fallback for locations that don't resolve cleanly.

USGS bounding box queries — Querying USGS gauge data near a location requires a geographic bounding box, and the response mixes water level and flow rate readings together. We filtered and sorted by water level descending to surface the highest-risk gauges first.
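The type-coercion fix from the CSV challenge above can be sketched as follows. This is a minimal illustration, assuming simple comma-separated rows with no quoted fields; the column names are hypothetical:

```javascript
// Coerce a raw CSV cell into a typed value. A naive split() yields
// everything as strings, so "true"/"false" and numbers silently break
// comparisons like `row.isOpen === true` and distance arithmetic.
function coerceCell(value) {
  const v = value.trim();
  if (v === "true") return true;
  if (v === "false") return false;
  if (v !== "" && !Number.isNaN(Number(v))) return Number(v);
  return v;
}

// Minimal CSV-to-objects parser (no quoted-field or escape handling).
function parseCsv(text) {
  const [headerLine, ...rows] = text.trim().split("\n");
  const headers = headerLine.split(",").map((h) => h.trim());
  return rows.map((row) => {
    const cells = row.split(",");
    return Object.fromEntries(
      headers.map((h, i) => [h, coerceCell(cells[i] ?? "")])
    );
  });
}
```

Doing the coercion once at parse time means every downstream filter and distance calculation can trust the field types instead of re-checking strings.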
Accomplishments that we're proud of

We're proud that CrisisIQ actually works end-to-end with real live data. The NWS and USGS integrations pull genuine government sensor readings in real time — the flood alerts and river gauge levels you see in the coordinator dashboard are not mocked.

The dual-stakeholder architecture is something we're especially proud of. Most disaster tools serve either coordinators or civilians — rarely both. The feedback loop where a civilian's SOS automatically surfaces as a critical alert in the coordinator view makes the system feel alive and genuinely useful.

We're also proud of the AI SITREP feature. Feeding live NWS alert data and USGS gauge readings directly into the Claude prompt means the situation report references actual current conditions — not generic templates. A real emergency manager could read that output and act on it.

Building a production-feeling application with a full map, live APIs, AI integration, and two complete user flows in under 21 days as a solo/small team submission is an accomplishment we're genuinely proud of.
What we learned

We learned that the hardest part of building for emergencies isn't the AI — it's the data plumbing. Getting real-time government APIs to cooperate, handling CORS restrictions in a browser environment, and making sure location data flows correctly through geocoding → coordinates → map → AI prompt taught us more about systems thinking than any tutorial.

We also learned that Claude's context window is a superpower in disaster scenarios. The ability to pass live flood alert text, gauge readings, resource counts, and user location into a single prompt and get a coherent, actionable SITREP back simply wasn't possible before large language models.

Most importantly, we learned that good UX under stress looks very different from normal UX. Every interaction in SafeReach had to be reduced to the minimum possible steps — someone in a flooded house doesn't have time to navigate menus. That constraint made us better designers.
What's next for CrisisIQ

Real shelter data integration — Partner with FEMA's API and the American Red Cross shelter registry to replace the static CSV with live shelter availability data that updates automatically during declared disasters.

Offline / SMS mode — Build a Twilio integration so civilians without internet can text their zip code to a shortcode and receive the nearest 3 open locations as a plain-text reply. Critical for areas where cell towers are overloaded.

Multi-agency dispatch — Give coordinators the ability to assign rescue teams to specific SOS events directly from the dashboard, with Claude generating the dispatch order and tracking ETA.

Predictive surge modeling — Use Claude to project hospital capacity saturation 2-4 hours ahead based on current intake rates and incoming SOS density, so coordinators can pre-route patients before hospitals hit capacity.

Mobile app — Convert SafeReach into a native React Native app with push notifications for flood alerts in the user's area, even when the app is closed.

Multilingual support — Claude already handles 40+ languages in its responses. Wrapping the full UI in i18n and auto-detecting browser language would make SafeReach accessible to non-English speakers — a critical gap in current disaster tools.

Government pilot — Apply for FEMA's BRIC grant program and reach out to Indianapolis Emergency Management for a tabletop exercise pilot, using CrisisIQ as the coordination layer for a simulated flood scenario.
Built With
- api.weather.gov/alerts
- react
- react-leaflet
- vite