Inspiration
We were passionate about creating large-scale impact for sustainability. We observed that organizations struggle to allocate their limited funds because climate adaptation is riddled with uncertainty: it is full of hard decisions about where risk is growing, which communities are most exposed, and how scarce resources should be allocated. We saw this as a growing problem worth solving.
Beyond our passion for tackling the crucial issue of climate change, we were also curious about using embedding models for videos, not just text. A lot of climate and simulation work ends up in visual outputs that are hard to search through. That pushed us to build a project where generated simulation videos are part of the intelligence layer. Ultimately, we aimed to reach a lay audience of non-experts, which is exactly what SenTree's decision-support experience and user-focused design allowed us to do.
What it does
SenTree is an end-to-end climate risk intelligence system designed to drive better sustainability investing decisions. It tells environmental and government organizations where to invest and what each investment is worth. It has multiple functions:
- It integrates research papers to detect tipping points and flag the most vulnerable regions. Risk scores are normalized per Koppen-Geiger climate class so a desert node and a rainforest node aren't penalized on the same raw scale.
- It calculates Resilience ROI with uncertainty. Returns are discounted over a 10-year horizon and adjusted for precipitation-ensemble uncertainty.
- It makes simulations searchable in plain English. Outputs are rendered as MP4 heatmaps and embedded into a 768-dimensional vector space using Gemini 1.5 Pro's native video embeddings.
- It propagates risk through a climate graph neural network. Grid cells become graph nodes connected by geographic proximity.
- It implements 26 climate interventions and predicts how each one changes the risk map, with integrated sanity logic: it won't recommend a mangrove project in a desert. Every intervention is filtered by the local climate classification, so you only see options that are physically viable for that location.
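The per-climate-class normalization above can be sketched roughly as follows. This is a minimal illustration, not SenTree's actual code; the function name and the choice of min-max scaling are our assumptions.

```python
import numpy as np

def normalize_by_climate_class(scores, classes):
    """Min-max normalize raw vulnerability scores within each
    Koppen-Geiger class, so arid and tropical cells are ranked
    against their own climate peers rather than on one global scale.
    (Illustrative sketch; the real system may use another scaler.)"""
    scores = np.asarray(scores, dtype=float)
    classes = np.asarray(classes)
    normalized = np.zeros_like(scores)
    for c in np.unique(classes):
        mask = classes == c
        lo, hi = scores[mask].min(), scores[mask].max()
        # Guard against a class where every cell has the same score.
        span = hi - lo if hi > lo else 1.0
        normalized[mask] = (scores[mask] - lo) / span
    return normalized
```

With this scheme, a score of 0.8 in a desert class and 30.0 in a rainforest class can both map to 1.0, since each is the most vulnerable cell within its own climate.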
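The Resilience ROI calculation can be illustrated as a net-present-value computation over the 10-year horizon, with the precipitation ensemble supplying an uncertainty band. The function name, the 5% discount rate, and the mean/std summary are assumptions for illustration.

```python
import numpy as np

def resilience_roi(annual_benefits, cost, rate=0.05):
    """Discounted ROI of an intervention with ensemble uncertainty.
    annual_benefits: array of shape (n_ensemble, n_years) of avoided
    losses per year, one row per precipitation-ensemble member.
    Returns the mean ROI and its spread across the ensemble.
    (Sketch only; rate and aggregation are illustrative choices.)"""
    annual_benefits = np.atleast_2d(np.asarray(annual_benefits, dtype=float))
    years = np.arange(1, annual_benefits.shape[1] + 1)
    discount = (1 + rate) ** -years          # discount factor per year
    npv = (annual_benefits * discount).sum(axis=1)
    roi = npv / cost
    return roi.mean(), roi.std()
```

For example, an intervention costing 500 that avoids 100 in losses every year for 10 years has an NPV of about 772 at a 5% rate, i.e. a mean ROI of roughly 1.54.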
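Connecting grid cells by geographic proximity, as the GNN bullet describes, amounts to building an edge list over cell coordinates. This brute-force sketch uses a simple degree threshold; the function name and threshold are hypothetical, and a real pipeline would likely use a spatial index for large grids.

```python
import numpy as np

def proximity_edges(lats, lons, max_deg=1.5):
    """Connect grid cells that lie within max_deg degrees of each
    other, producing the undirected edge list a GNN can propagate
    risk along. (Illustrative O(n^2) sketch.)"""
    coords = np.column_stack([np.asarray(lats), np.asarray(lons)])
    edges = []
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            if np.linalg.norm(coords[i] - coords[j]) <= max_deg:
                edges.append((i, j))
    return edges
```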
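The climate-class filter that keeps mangroves out of deserts can be expressed as a small viability table keyed by Koppen-Geiger major group. The table entries and names below are hypothetical stand-ins for SenTree's real 26-intervention catalogue.

```python
# Hypothetical viability table: which Koppen-Geiger major climate
# groups (first letter of the code) each intervention makes sense in.
VIABLE_CLIMATES = {
    "mangrove_restoration": {"A"},          # tropical coasts only
    "drought_resistant_crops": {"B", "C"},  # arid / temperate
    "urban_green_roofs": {"A", "B", "C", "D"},
}

def viable_interventions(koppen_code):
    """Filter the intervention catalogue by the cell's climate class,
    so e.g. mangroves are never proposed for a desert (BWh) cell."""
    group = koppen_code[0]  # first letter = major climate group
    return [name for name, groups in VIABLE_CLIMATES.items()
            if group in groups]
```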
How we built it
It ingests real temperature and precipitation climate data from ISIMIP3b, simulated with GFDL-ESM4, and extracts temperature and precipitation features for 2015–2100.
Then, the Koppen-Geiger climate classification is computed per grid cell and, along with other temporal, climatic, and physical features, is embedded as a vector inside the graph neural network. Simulation outputs are rendered as MP4s via matplotlib and ffmpeg, chunked by SentrySearch, and indexed into ChromaDB. The full dashboard is built in Streamlit with sidebar navigation, live ROI recomputation, pydeck interactive maps, and frame-by-frame GNN training playback.
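The retrieval step behind the plain-English search reduces to nearest-neighbor lookup over the 768-dimensional video embeddings. In the real system ChromaDB performs this lookup over Gemini embeddings; the sketch below shows the underlying cosine-similarity search with plain NumPy, and the function name is our own.

```python
import numpy as np

def cosine_search(query_vec, index_vecs, k=1):
    """Return the indices of the k embeddings most similar to the
    query, by cosine similarity. Stands in for the vector lookup
    that ChromaDB does over the 768-d video embeddings."""
    q = query_vec / np.linalg.norm(query_vec)
    m = index_vecs / np.linalg.norm(index_vecs, axis=1, keepdims=True)
    sims = m @ q                      # cosine similarity per row
    return np.argsort(-sims)[:k]      # best matches first
```

A text query goes through the same embedding model as the videos, so the query vector and index vectors live in one shared space.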
26 climate interventions are implemented and, filtering by Koppen-Geiger classification, we built a recommendation system that ranks interventions by an Investor Score computed from Resilience ROI and Loss Avoided.
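A scoring blend like the Investor Score can be sketched as a weighted combination of the ranking signals. The weights and the function name here are assumptions; SenTree's actual weighting may differ.

```python
def investor_score(roi, resilience_gain, loss_avoided,
                   weights=(0.4, 0.3, 0.3)):
    """Hypothetical weighted blend of the three ranking signals,
    each assumed pre-normalized to a comparable scale."""
    w_roi, w_res, w_loss = weights
    return w_roi * roi + w_res * resilience_gain + w_loss * loss_avoided
```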
Challenges we ran into
Early on we ran into issues setting up the project environment. The system spans multiple machine-learning libraries, visualization tools, frontend tooling, and video-processing frameworks, so much of our initial work went into pinning the right dependencies so that everything could work together.
We had issues calculating risks: initially, the tail-risk component malfunctioned and produced unrealistically large swings over time. There were also multiple issues with the RCAC cluster and GPU assignment, since none of us knew Slurm well. Automation made this all easier once we got the hang of it, though.
We also spent a long time stuck in the swamp of poor time management, git merge conflicts, and AI models (and their harnesses) constantly sabotaging us.
Another challenge was performance. Streamlit worked well to build the dashboard, but the interactive GNN playback component was too slow when rendered frame by frame in the app.
Furthermore, the video rendering pipeline took a long time because it effectively ran on the CPU and could not benefit from GPU acceleration (which only the GNN could use) due to library limitations; NumPy, for example, is CPU-only.