🔗 View on Devpost
AidLink is a real-time crisis coordination platform that monitors X (Twitter) for structural collapse reports in active conflict zones, verifies them using a multi-agent AI pipeline, and surfaces actionable incident data to local response coordinators. Built at ProduHacks 2026.
🥇 1st Place – Best Use of Fetch AI
$400 CAD prize + Guaranteed Internship Interviews
Guaranteed Admission to Spring VC's Build Accelerator Cohort + LMS Access
🔎 Scraper – Playwright scrapes X every hour using Gemini-generated search queries in Arabic, Ukrainian, Farsi, and English. Gemini filters noise, clusters tweets into discrete incidents, and extracts location, casualty estimates, and criticality.
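In the real pipeline Gemini performs the clustering itself, but the grouping logic is roughly this shape: a pure-Python sketch that buckets tweets by location and splits a bucket whenever the time gap between posts exceeds a window. The tweet fields (`location`, `posted_at`) are hypothetical, not the project's actual schema.

```python
from collections import defaultdict
from datetime import timedelta

def cluster_tweets(tweets, window=timedelta(hours=2)):
    """Group tweets into candidate incidents by location and time proximity."""
    by_location = defaultdict(list)
    for t in sorted(tweets, key=lambda t: t["posted_at"]):
        by_location[t["location"]].append(t)

    incidents = []
    for location, group in by_location.items():
        current = [group[0]]
        for t in group[1:]:
            # A long silence at the same location starts a new incident
            if t["posted_at"] - current[-1]["posted_at"] > window:
                incidents.append({"location": location, "tweets": current})
                current = []
            current.append(t)
        incidents.append({"location": location, "tweets": current})
    return incidents
```

A same-location tweet posted hours after the last one opens a fresh incident instead of extending the old cluster.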
📤 Upload – Fresh incidents are upserted to Supabase (PostgreSQL), preserving full timestamped history.
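The "preserve full history" part of the upsert can be sketched as a merge step that stamps each scrape and appends it to the stored row before the upsert call. Field names here are illustrative, not the project's actual Supabase schema.

```python
from datetime import datetime, timezone

def merge_incident(existing, fresh):
    """Merge a freshly scraped incident into its stored row, keeping history.

    `existing` is the current Supabase row (or None for a new incident);
    `fresh` is the latest scrape. The top-level fields always reflect the
    newest scrape, while `history` accumulates timestamped snapshots.
    """
    snapshot = dict(fresh, scraped_at=datetime.now(timezone.utc).isoformat())
    history = (existing or {}).get("history", [])
    return {**fresh, "history": history + [snapshot]}
```

The merged record would then go to the client's `upsert` call, keyed on the incident's identifier.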
🤖 Fetch.ai Agent Pipeline – Three uAgents registered on Agentverse run automatically after each scrape:
- Analyst Agent – fetches post content and scores each incident for reliability using ASI:One
- Critic Agent – independently challenges the analyst's verdict; produces a `confirmed`, `disputed`, or `unreliable` final verdict
- Coordinator Agent – synthesises all verdicts into a per-region resource allocation brief
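The analyst/critic reconciliation can be illustrated with a small pure function. The cutoff values below are hypothetical; in the real pipeline the agents reach a verdict via ASI:One reasoning rather than fixed thresholds.

```python
def final_verdict(analyst_score, critic_score, threshold=0.6, gap=0.3):
    """Combine analyst and critic reliability scores (0.0-1.0) into a verdict.

    Thresholds are illustrative placeholders, not the project's actual logic.
    """
    if abs(analyst_score - critic_score) > gap:
        return "disputed"      # the critic materially disagrees with the analyst
    if min(analyst_score, critic_score) >= threshold:
        return "confirmed"     # both agents rate the incident reliable
    return "unreliable"        # the agents agree, but neither trusts the report
```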
🗺️ Frontend – Coordinators see a live map of verified incidents with criticality tiers, casualty estimates, manpower needs, and AI-generated deployment recommendations.
- 🗺️ Live crisis map – Leaflet map with Gaza and Ukraine views, incident markers color-coded by criticality, and an open-incidents panel
- 📋 Incident drawer – Full incident detail: reliability verdict, analyst/critic scores, casualty and manpower estimates, source posts, and media
- 🧑‍💼 Organizer dashboard – Verify incidents, assign volunteers, advance assignment statuses, open the check-in modal, and resolve incidents
- 🧠 AI reliability scoring – Every incident gets an independent analyst + critic score; disputed or unreliable incidents are flagged, not hidden
- 📄 Regional allocation briefs – The coordinator agent produces a per-region summary: overall state, priority incident ordering, concrete resource recommendations, and external support needs
- ⏰ Hourly automation – The scheduler runs the scrape → upload → analysis pipeline automatically
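The "priority incident ordering" in a regional brief amounts to a ranking step. A minimal sketch, assuming illustrative criticality tiers and field names (the coordinator agent's actual ordering comes from ASI:One):

```python
CRITICALITY_RANK = {"critical": 0, "high": 1, "moderate": 2, "low": 3}

def prioritise(incidents):
    """Order a region's open incidents for the allocation brief.

    Sorts by criticality tier first, then by estimated casualties
    (descending) within a tier. Field names are hypothetical.
    """
    return sorted(
        incidents,
        key=lambda i: (CRITICALITY_RANK[i["criticality"]], -i["casualties"]),
    )
```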
Frontend
- Next.js 14 (App Router)
- TypeScript
- Tailwind CSS
- shadcn/ui components
- React Hook Form + Zod
- Leaflet / react-leaflet
- Zustand
- Prisma + SQLite
Backend
- Python
- Playwright (X scraping)
- Google Gemini 2.5 Flash (query generation, tweet clustering, incident extraction)
- Fetch.ai uAgents + Agentverse (multi-agent reliability pipeline)
- ASI:One / asi1-mini (analyst and critic LLM)
- Supabase (PostgreSQL – incidents, analyses, region reports)
npm install
# Configure database
# Ensure .env contains: DATABASE_URL="file:./dev.db"
npx prisma generate
npx prisma db push
npm run db:seed # seeds Gaza-area incidents, volunteers, sample reports
npm run dev

Open http://localhost:3000.
pip install playwright uagents uagents-core openai supabase google-genai python-dotenv httpx
playwright install chromium

The scraper uses your X session cookies to access search results. Run the cookie saver once before your first scrape:
python save_cookies.py

This opens a Chromium browser window. Log into X manually, then close the browser. Your session is saved to x_cookies.json and reused automatically by the scraper.
If you have a second X account added to the same browser session, the scraper automatically attempts to switch to it when a page fails to load, reducing the chance of rate limiting mid-scrape.
⚠️ Cookies expire periodically – if the scraper logs show `Cookies expired`, log out of X and re-run `save_cookies.py`.
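The account-fallback behavior described above boils down to trying each logged-in account in order until one succeeds. A hedged sketch with hypothetical names (`load_page` stands in for the scraper's page-load attempt, which the real code performs via Playwright):

```python
def load_with_fallback(accounts, load_page):
    """Try each logged-in X account until the search page loads.

    `load_page(account)` returns True on success; a failed load (often a
    rate limit) triggers a switch to the next account. Both names are
    illustrative, not the scraper's actual API.
    """
    for account in accounts:
        if load_page(account):
            return account
    raise RuntimeError("All accounts failed to load the page")
```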
Create a .env file:
# Gemini
GEMINI_API_KEY=...
# ASI:One β https://asi1.ai/dashboard/api-keys
ASI_ONE_API_KEY=...
# Supabase
SUPABASE_URL=https://xxxx.supabase.co
SUPABASE_SERVICE_KEY=eyJ...
# Agent seeds (set once, never change)
ANALYST_SEED=...
CRITIC_SEED=...
COORDINATOR_SEED=...
# Agent addresses (fill after first run)
ANALYST_ADDRESS=agent1q...
CRITIC_ADDRESS=agent1q...
COORDINATOR_ADDRESS=agent1q...
# Scheduler interval (default 60)
INTERVAL_MINUTES=60
Run the Supabase schema files in order via the SQL editor:
1. supabase_schema.sql
2. supabase_agents_schema.sql
Start the two persistent agents:
# Terminal 1
python critic.py
# Terminal 2
python coordinator.py

Start the scheduler (manages the analyst automatically):
# Terminal 3
python scheduler.py

The scheduler runs main.py (scrape) → upload_to_supabase.py (sync) → restarts analyst.py (analysis pipeline) every hour.
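The scheduler's cycle can be sketched as a loop that runs each stage to completion before the next. This simplification runs the stages sequentially and skips downstream stages on failure; the real scheduler additionally restarts the analyst as a long-running agent rather than invoking it once per cycle.

```python
import subprocess
import sys
import time

def run_pipeline(stages):
    """Run each stage (an argv list) to completion, in order.

    Stops early and returns False if a stage exits non-zero, so a failed
    scrape never triggers an upload of stale data.
    """
    for argv in stages:
        if subprocess.run(argv).returncode != 0:
            return False
    return True

def main(interval_minutes=60):
    while True:
        run_pipeline([
            [sys.executable, "main.py"],                # scrape
            [sys.executable, "upload_to_supabase.py"],  # sync
            [sys.executable, "analyst.py"],             # analysis pipeline
        ])
        time.sleep(interval_minutes * 60)
```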
On first run, each agent prints its address:
INFO: Analyst agent address: agent1q...
Copy each address into .env, then click the inspector link printed in each terminal → Connect → Mailbox to register on Agentverse. Restart all agents once addresses are set.
- Landing → View Crisis Map or Organizer Dashboard
- Public map → Switch between Gaza and Ukraine views → Click a marker → Read reliability verdict, casualty estimates, source posts
- Organizer → Log in as Organizer (demo) → Select incident → Verify, edit, assign volunteers, check in, or resolve
- Incident detail → /map/incident/[id] → Time urgency tier, situation summary, inbound reports
app/
page.tsx # Landing
map/page.tsx # Public crisis map
map/incident/[id]/page.tsx # Incident detail
dashboard/page.tsx # Organizer map
api/ # REST handlers
components/
GazaCrisisMap.tsx
OrganizerMap.tsx
MapIncidentDrawer.tsx
OrganizerIncidentDrawer.tsx
...
data/
incidents.json # Optional JSON override for map
backend/
main.py # Scraper (Gemini + Playwright)
upload_to_supabase.py # Supabase sync
analyst.py # Fetch.ai analyst agent
critic.py # Fetch.ai critic agent
coordinator.py # Fetch.ai coordinator agent
scheduler.py # Hourly automation
supabase_schema.sql
supabase_agents_schema.sql
prisma/
schema.prisma
seed.ts
Assignments should be reviewed by qualified coordinators. Do not enter unsafe zones without authorization or proper training.
Proprietary – All Rights Reserved. See LICENSE for details.
Built by Jasper He, Leo Wu, Ethan Hoang, Daniel Zou – ProduHacks 2026
