msradam/our-era


OUR-ERA Emergency Routing Assistant - Deployment Package

Climate-aware pedestrian routing for Brownsville, Brooklyn.

Quick Start

1. Install Dependencies

pip install -r requirements.txt

2. Start Ollama (for LLM support)

The app uses a local LLM via Ollama. Install Ollama and pull a model:

# Install Ollama: https://ollama.com/download
ollama pull qwen2.5:3b

3. Run the App

streamlit run app.py

The app will open at http://localhost:8501.

Contents

deployment_package/
├── app.py                          # Main Streamlit application
├── requirements.txt                # Python dependencies
├── README.md                       # This file
├── core/
│   ├── __init__.py                # Module exports
│   ├── config.py                  # UI configuration
│   ├── engine.py                  # Routing engine (climate-aware weights)
│   └── tools.py                   # Routing tools and geocoding
└── data/
    ├── tool_embeddings.npz        # Pre-computed embeddings for tool selection
    └── brownsville/
        ├── graph_cache.pkl         # Cached graph (fast loading)
        ├── walking_network_final.graphml  # Full graph with climate attributes
        ├── all_resources.csv       # POI database (cooling centers, hospitals, etc.)
        ├── places.csv              # Geocoding database for place names
        └── pois_metadata.json     # POI type definitions

Features

  • Climate-Aware Routing: Routes avoid flood zones and high heat areas
  • Accessibility Support: Grade factor penalizes steep hills for mobility-impaired users
  • Natural Language Interface: Ask questions like "Find the nearest pharmacy" or "Route to Brookdale Hospital avoiding flooding"
  • Multiple Route Alternatives: Shows shortest, flattest, and safest routes
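The route alternatives come down to running a shortest-path search with different edge-cost functions. The sketch below illustrates the idea with plain Dijkstra on a made-up four-node graph; the attribute names (`length`, `flood_mult`) mirror the weight formula in this README, but the graph and function names are illustrative, not taken from `core/engine.py` (the real app searches the cached Brownsville walking network).

```python
import heapq

# Toy graph: node -> list of (neighbor, edge_attrs). Illustrative only --
# the real app loads walking_network_final.graphml.
GRAPH = {
    "A": [("B", {"length": 100, "flood_mult": 1.0}),
          ("C", {"length": 60,  "flood_mult": 10.0})],   # short but flooded
    "B": [("D", {"length": 100, "flood_mult": 1.0})],
    "C": [("D", {"length": 60,  "flood_mult": 1.0})],
    "D": [],
}

def shortest_path(graph, start, goal, weight_fn):
    """Plain Dijkstra; weight_fn maps edge attributes to a cost."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue
        for nbr, attrs in graph[node]:
            nd = d + weight_fn(attrs)
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return list(reversed(path))

# "Shortest" ignores hazards; "safest" multiplies in the flood penalty.
shortest = shortest_path(GRAPH, "A", "D", lambda a: a["length"])
safest = shortest_path(GRAPH, "A", "D", lambda a: a["length"] * a["flood_mult"])
print(shortest)  # ['A', 'C', 'D'] -- 120 m, but through a flood zone
print(safest)    # ['A', 'B', 'D'] -- 200 m, dry
```

Running each cost function on the same graph is what makes the alternatives comparable: they differ only in how edges are priced, not in how the search works.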

Climate Weight Formula

weight = length * flood_mult * heat_mult * shade_mult * aqi_mult * grade_mult

The LLM adjusts these parameters based on user context:

  • Flooding mentioned: flood_penalty_deep=10.0
  • Heat/shade needed: heat_factor=0.5, shade_factor=0.5
  • Elderly/wheelchair/mobility: grade_factor=0.5
  • Respiratory concerns: aqi_factor=0.5
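One plausible way to combine the formula with these LLM-set parameters is to derive each multiplier from a raw per-edge hazard score scaled by its factor. The mapping below is a guess for illustration (the actual formula lives in `core/engine.py`); the `edge_weight` function, the dictionary keys, and the hazard-score ranges are all assumptions.

```python
def edge_weight(edge, params):
    """Sketch of a climate-aware edge weight.

    Assumes `edge` holds raw hazard scores in [0, 1] plus a deep-flood
    flag, and `params` holds the knobs the LLM sets. The mapping from
    scores to multipliers here is illustrative, not the real engine's.
    """
    flood_mult = params.get("flood_penalty_deep", 1.0) if edge.get("deep_flood") else 1.0
    heat_mult  = 1.0 + params.get("heat_factor", 0.0) * edge.get("heat", 0.0)
    shade_mult = 1.0 + params.get("shade_factor", 0.0) * (1.0 - edge.get("shade", 1.0))
    aqi_mult   = 1.0 + params.get("aqi_factor", 0.0) * edge.get("aqi", 0.0)
    grade_mult = 1.0 + params.get("grade_factor", 0.0) * edge.get("grade", 0.0)
    return edge["length"] * flood_mult * heat_mult * shade_mult * aqi_mult * grade_mult

# A sunny, deeply flooded 100 m segment, user mentioned flooding and heat:
w = edge_weight(
    {"length": 100, "deep_flood": True, "heat": 0.8, "shade": 0.2},
    {"flood_penalty_deep": 10.0, "heat_factor": 0.5, "shade_factor": 0.5},
)
print(round(w, 1))  # 1960.0 -- 100 * 10.0 * 1.4 * 1.4
```

The key property is that penalties are multiplicative: a deeply flooded segment costs 10x its length regardless of the other factors, so the router strongly prefers detours over hazardous edges without ever forbidding them outright.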

Requirements

  • Python 3.10+
  • Ollama running locally with a compatible model
  • ~15MB disk space for data files

Troubleshooting

  • "Could not connect to Ollama": Make sure Ollama is running (ollama serve)
  • Model not found: Pull the model first (ollama pull qwen2.5:3b)
  • Import errors: Ensure all dependencies are installed (pip install -r requirements.txt)
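To check the Ollama connection before launching the app, you can probe the server's model-listing endpoint (GET /api/tags) from Python. The helper name and default URL below are assumptions; port 11434 is Ollama's default.

```python
import urllib.error
import urllib.request

def ollama_reachable(base_url="http://localhost:11434", timeout=2.0):
    """Return True if an Ollama server answers at base_url.

    /api/tags lists locally pulled models; any HTTP response at all
    (even an error status) means the server is up.
    """
    try:
        urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # server responded, just not with 200
    except (urllib.error.URLError, OSError):
        return False  # connection refused or timed out -> Ollama not running

if __name__ == "__main__":
    print("Ollama is running" if ollama_reachable()
          else "Start it with `ollama serve`")
```

The same endpoint's JSON response lists pulled models, so it can also confirm that qwen2.5:3b is actually available before the app tries to use it.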

About

🌍 LEAP Climate Hackathon 2026 - Team Megalodons
