I'm an HPC consultant and computational scientist working at the intersection of AI/ML, high-performance computing, and Earth system science 🌎. I specialize in developing, scaling, and optimizing distributed training and inference workflows on GPU-accelerated HPC clusters, with a particular focus on improving weather and climate forecasting models through AI and deep learning.
I currently work at the Computational and Information Systems Laboratory at the National Center for Atmospheric Research (NSF NCAR), where I help build the AI/ML cyberinfrastructure for weather and climate modeling, support researchers in scaling and optimizing scientific AI workloads, and develop scalable data pipelines for training large AI models.
I hold a Ph.D. in Chemical Engineering from the University of Iowa, where my thesis focused on performance analysis and optimization of weather and air quality models. These days, I work on scaling AI/ML workflows on supercomputers for Earth system science, building community-driven infrastructure, and championing open science practices across the geosciences.
I am also an open-source contributor to Xarray, CuPy-Xarray, Zarr-Python, WRF, CESM/CTSM, and Project Pythia.
⚙️ Architecting and optimizing distributed multi-node, multi-GPU AI/ML training workflows on NCAR's supercomputers
📊 Building scalable GPU-native data pipelines for petabyte-scale Earth system datasets
🌱 Contributing to the Pangeo ecosystem and teaching scalable geospatial data analysis at SciPy, ESDS, and NCAR workshops
💬 Ask me about AI/ML for weather and climate, optimizing AI workflows, distributed training on HPC, and scalable geospatial data workflows
📫 Find me on LinkedIn
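The scalable geospatial analysis mentioned above typically builds on Xarray's label-aware operations. Here's a minimal sketch using synthetic data (the variable name `t2m` and dimensions are illustrative, not a real archive); the same pattern scales to petabyte datasets when the arrays are backed by Dask or Zarr chunks.

```python
import numpy as np
import xarray as xr

# Toy stand-in for a gridded Earth system dataset: 2 m temperature
# over (time, lat, lon). Real workflows would open a chunked Zarr
# store instead, e.g. xr.open_zarr(...).
ds = xr.Dataset(
    {"t2m": (("time", "lat", "lon"), np.random.rand(4, 3, 5))},
    coords={"time": np.arange(4), "lat": [-30.0, 0.0, 30.0], "lon": np.arange(5)},
)

# Label-aware reduction: collapse the time dimension to a mean field.
clim = ds["t2m"].mean(dim="time")
print(clim.shape)  # (3, 5)
```

With Dask-backed arrays, the same `.mean(dim="time")` call builds a lazy task graph that a scheduler can execute across an HPC cluster, which is what makes this style of analysis scale.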
