# Climate-Aware Pedestrian Routing for Brownsville, Brooklyn

## Installation

Install the Python dependencies:

```bash
pip install -r requirements.txt
```

The app uses a local LLM via Ollama. Install Ollama and pull a model:

```bash
# Install Ollama: https://ollama.com/download
ollama pull qwen2.5:3b
```

## Running

```bash
streamlit run app.py
```

The app will open at http://localhost:8501.
## Project Structure

```
deployment_package/
├── app.py                             # Main Streamlit application
├── requirements.txt                   # Python dependencies
├── README.md                          # This file
├── core/
│   ├── __init__.py                    # Module exports
│   ├── config.py                      # UI configuration
│   ├── engine.py                      # Routing engine (climate-aware weights)
│   └── tools.py                       # Routing tools and geocoding
└── data/
    ├── tool_embeddings.npz            # Pre-computed embeddings for tool selection
    └── brownsville/
        ├── graph_cache.pkl            # Cached graph (fast loading)
        ├── walking_network_final.graphml  # Full graph with climate attributes
        ├── all_resources.csv          # POI database (cooling centers, hospitals, etc.)
        ├── places.csv                 # Geocoding database for place names
        └── pois_metadata.json         # POI type definitions
```
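The pickle cache exists to avoid re-parsing the larger GraphML file on every startup. Loading might look like the sketch below (an illustration only: it assumes `graph_cache.pkl` is a pickled NetworkX graph mirroring `walking_network_final.graphml`, which this README does not confirm).

```python
import pickle
from pathlib import Path

CACHE = Path("data/brownsville/graph_cache.pkl")
GRAPHML = Path("data/brownsville/walking_network_final.graphml")

def load_graph():
    """Prefer the fast pickle cache; fall back to parsing the GraphML file."""
    if CACHE.exists():
        with CACHE.open("rb") as f:
            return pickle.load(f)
    # Slow path: parse GraphML once, then cache it for subsequent runs.
    import networkx as nx
    graph = nx.read_graphml(GRAPHML)
    with CACHE.open("wb") as f:
        pickle.dump(graph, f)
    return graph
```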
## Features

- **Climate-Aware Routing**: Routes avoid flood zones and high-heat areas
- **Accessibility Support**: Grade factor penalizes steep hills for mobility-impaired users
- **Natural Language Interface**: Ask questions like "Find the nearest pharmacy" or "Route to Brookdale Hospital avoiding flooding"
- **Multiple Route Alternatives**: Shows shortest, flattest, and safest routes
## How Routing Works

Each edge weight combines length with climate multipliers:

```
weight = length * flood_mult * heat_mult * shade_mult * aqi_mult * grade_mult
```

The LLM adjusts these parameters based on user context:

- Flooding mentioned: `flood_penalty_deep=10.0`
- Heat/shade needed: `heat_factor=0.5`, `shade_factor=0.5`
- Elderly/wheelchair/mobility: `grade_factor=0.5`
- Respiratory concerns: `aqi_factor=0.5`
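Putting the formula and the context-driven parameters together, the weighting can be sketched as below. The function and edge-attribute names (`flood_depth`, `heat_index`, `shade`, `aqi`, `grade`) are assumptions for illustration, not necessarily those used in `core/engine.py`.

```python
# Illustrative sketch of climate-aware edge weighting. Attribute and
# parameter names are assumed, not taken from core/engine.py.

def edge_weight(edge, flood_penalty_deep=1.0, heat_factor=0.0,
                shade_factor=0.0, aqi_factor=0.0, grade_factor=0.0):
    """Multiply edge length by one penalty multiplier per climate attribute."""
    # Flooded edges get a flat penalty; 0.3 m is an assumed depth threshold.
    flood_mult = flood_penalty_deep if edge.get("flood_depth", 0.0) > 0.3 else 1.0
    heat_mult = 1.0 + heat_factor * edge.get("heat_index", 0.0)
    shade_mult = 1.0 + shade_factor * (1.0 - edge.get("shade", 0.0))  # less shade, more penalty
    aqi_mult = 1.0 + aqi_factor * edge.get("aqi", 0.0)
    grade_mult = 1.0 + grade_factor * abs(edge.get("grade", 0.0))
    return edge["length"] * flood_mult * heat_mult * shade_mult * aqi_mult * grade_mult

# Example: a mobility-impaired user on a steep, mostly unshaded 100 m block.
w = edge_weight({"length": 100.0, "grade": 0.08, "shade": 0.2},
                grade_factor=0.5, shade_factor=0.5)
```

With all factors at their defaults the weight reduces to plain length, so the shortest route falls out of the same machinery.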
## Requirements

- Python 3.10+
- Ollama running locally with a compatible model
- ~15 MB disk space for data files
## Troubleshooting

- **"Could not connect to Ollama"**: Make sure Ollama is running (`ollama serve`)
- **Model not found**: Pull the model first (`ollama pull qwen2.5:3b`)
- **Import errors**: Ensure all dependencies are installed (`pip install -r requirements.txt`)
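A quick way to diagnose the first two issues is to query Ollama's local HTTP API directly. The sketch below assumes the default port 11434 and the standard `/api/tags` endpoint; this helper is not part of the app.

```python
import json
import urllib.request

def check_ollama(model="qwen2.5:3b", host="http://localhost:11434"):
    """Return (reachable, model_pulled) by querying Ollama's /api/tags endpoint."""
    try:
        with urllib.request.urlopen(f"{host}/api/tags", timeout=3) as resp:
            tags = json.load(resp)
    except OSError:
        # Connection refused or timed out: Ollama is not running -> `ollama serve`
        return False, False
    names = [m.get("name", "") for m in tags.get("models", [])]
    # Ollama reports tagged names like "qwen2.5:3b", so a prefix match suffices.
    return True, any(n.startswith(model) for n in names)
```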