Welcome to my homepage
I’m a Senior Research Scientist on the NVIDIA Nemotron Research team, where I focus on pushing the limits of code generation through fine-tuning and optimization of large language models. Previously, I was part of AWS AI Labs, where I helped build code generation capabilities for Amazon Q Developer.
I obtained my PhD in Computer Science at the University of California, Los Angeles, advised by Dr. Kai-Wei Chang. During my PhD, I was fortunate to work as a research intern at Meta AI, Yahoo Research, Microsoft Research, and Walmart Labs.
News and Announcements
- [12.2025] We released the Nemotron 3 family of models.
- [11.2025] Presented tutorial “NLP+Code: Code Intelligence in Language Models” at EMNLP 2025.
- [10.2025] We introduced GenCluster, achieving IOI Gold with open-weight LLMs.
- [10.2025] We released BigCodeArena; check it out!
- [08.2025] We released Nemotron-Nano-v2.
- [06.2025] Co-organizing the Deep Learning for Code workshop at NeurIPS 2025.
- [04.2025] We released Nemotron-H, a family of Mamba-Transformer models.
- [04.2025] We released OpenCodeInstruct and OpenCodeReasoning.
- [03.2025] I will serve as a senior area chair for EMNLP 2025 and IJCNLP-AACL 2025.
