About Me

I am a first-year Ph.D. student in Computer Science at the University of California, Berkeley, advised by Profs. Jason D. Lee and Song Mei. My interests lie in the mathematical foundations of deep learning, with a focus on nonconvex optimization, dynamical analysis, and computational-statistical guarantees for neural networks. I am also interested in understanding emergent capabilities of foundation models, such as chain-of-thought reasoning.

Previously, I received my B.Sc. in Mathematics and Statistics from Seoul National University as Valedictorian of the Class of ’23, and my M.Sc. in Mathematical Informatics from the University of Tokyo, advised by Prof. Taiji Suzuki, where I received the Dean’s Award for outstanding research.

Publications

Sharp Capacity Scaling of Spectral Optimizers in Learning Associative Memory
Juno Kim*, Eshaan Nichani*, Denny Wu, Alberto Bietti, Jason D. Lee. Under review.
Coverage Improvement and Fast Convergence of On-policy Preference Learning
Juno Kim, Jihun Yun, Jason D. Lee, Kwang-Sung Jun. Under review.
Alignment as Distribution Learning: Your Preference Model is Explicitly a Language Model
Jihun Yun*, Juno Kim*, Jongho Park, Junhyuck Kim, Jongha Jon Ryu, Jaewoong Cho, Kwang-Sung Jun. ICML 2025 MoFA Workshop.
Mirror Mean-Field Langevin Dynamics
Anming Gu*, Juno Kim*. Under review.
Hessian-guided Perturbed Wasserstein Gradient Flows for Escaping Saddle Points
Naoya Yamamoto, Juno Kim, Taiji Suzuki. NeurIPS 2025.
Optimality and Adaptivity of Deep Neural Features for Instrumental Variable Regression
Juno Kim, Dimitri Meunier, Arthur Gretton, Taiji Suzuki, Zhu Li. ICLR 2025.
Transformers are Minimax Optimal Nonparametric In-Context Learners
Juno Kim, Tai Nakamaki, Taiji Suzuki. NeurIPS 2024; ICML 2024 TF2M Workshop (Best Paper Award).
Symmetric Mean-field Langevin Dynamics for Distributional Minimax Problems
Juno Kim, Kakei Yamamoto, Kazusato Oko, Zhuoran Yang, Taiji Suzuki. ICLR 2024 Spotlight.
t³-Variational Autoencoder: Learning Heavy-tailed Data with Student's t and Power Divergence
Juno Kim*, Jaehyuk Kwon*, Mincheol Cho*, Hyunjong Lee, Joong-Ho Won. ICLR 2024.
Hessian Based Smoothing Splines for Manifold Learning
Juno Kim, Otto van Koert. Preprint.
Reeb Flows without Simple Global Surfaces of Section
Juno Kim, Yonghwan Kim, Otto van Koert. Involve, 15(5), pp. 813–842, 2022.

(* equal contribution)

Education

University of California, Berkeley

2025 - present

Ph.D. student in EECS

University of Tokyo

2023 - 2025

M.S. in Mathematical Informatics
Thesis: Statistical and Dynamical Analysis of Transformers: In-Context Learning and Chain-of-Thought Reasoning

Seoul National University

2018 - 2023

B.S. in Statistics
B.S. in Mathematics
Graduated Valedictorian of the College of Natural Sciences (GPA 4.28/4.3)
Theses: Token and Corpus Imputation in Statistical Language Modeling via Semantic Embeddings; Hessian Based Smoothing Splines for Manifold Learning

Awards

Dean's Award for Research Achievement, IST, University of Tokyo

2025

Doctoral Course (DC1) Research Fellowship, JSPS

Declined

Japanese Government Scholarship

2023 - 2025

President Award, Highest Honors, Seoul National University

2023

President Award, Korean Statistical Society

2023

National Scholarship, Kwanjeong Educational Foundation

2020 - 2023

4th Place, Simon Marais Mathematics Competition

2020

Eminence Scholarship, Seoul National University

2018 - 2020

Gold Prize, College Mathematics Competition

2019

Experience

Reviewer

AISTATS'24, ICML'24, NeurIPS'24, ICLR'25 (Notable Reviewer), ICML'25, NeurIPS'25

KRAFTON AI

2025
Research Intern, Deep Learning Division Core Research Team

RIKEN Center for Advanced Intelligence Project

2023 - 2025
Part-time Researcher

Seoul National University

2019 - 2020, 2022 - 2023
Undergraduate Research Intern

Travel

I love attending conferences, traveling to new places, and meeting people who share my interests. My hobbies are weight training, recreational math, and playing the guitar. I am fluent in Korean, English, and Japanese, and am learning German and French.

2025

NeurIPS (San Diego, CA)

Dec 2 - 7

ICML (Vancouver, Canada)

Jul 13 - 19

COLT (Lyon, France)

Jun 30 - Jul 4

Invited talk: NLP Colloquium (Japan; online)

May 21

ICLR (Singapore)

Apr 23 - 29

Invited talk: Flatiron Institute (New York, NY)

Mar 11 - 15

FIMI Workshop (Okinawa, Japan)

Mar 1 - 2

Invited talk: Vector Institute (Toronto, Canada; online)

Jan 24

2024

NeurIPS (Vancouver, Canada)

Dec 9 - 16

Visiting student: Simons Institute (Berkeley, CA)

Nov 11 - Dec 9

IBIS Workshop (Saitama, Japan)

Nov 4 - 7

Japanese Joint Statistical Meeting (Tokyo, Japan)

Sep 1 - 5

Research visit: UCL Gatsby Computational Neuroscience Unit (London, UK)

Aug 5 - 19

ICML (Vienna, Austria)

Jul 21 - 27

ICLR (Vienna, Austria)

May 7 - 11

FIMI Workshop (Bristol, UK)

Mar 25 - 27

Tokyo Deep Learning Workshop (Tokyo, Japan)

Mar 18 - 20

Machine Learning Summer School (Okinawa, Japan)

Mar 4 - 15

2023

Visiting student: NYU Center for Data Science (New York, NY)

Dec 16 - 19

NeurIPS (New Orleans, LA)

Dec 10 - 16

IBIS Workshop (Kyushu, Japan)

Oct 29 - Nov 1

Japanese Joint Statistical Meeting (Kyoto, Japan)

Sep 3 - 7