About Me

Hi, I'm Junhoo Lee. I am a Ph.D. candidate at Seoul National University (MIPAL), advised by Prof. Nojun Kwak.

My research aims to bridge the gap between optimization theory and modern generative AI. Instead of merely scaling up models, I investigate the training dynamics of overparameterized networks and design inductive biases (such as geometric constraints or explicit filtering) to ensure robust in-distribution learning.

Currently, I am exploring the fundamental principles of Diffusion Models and LLMs to make them more efficient, explainable, and controllable.

I am always open to discussing new ideas and potential collaborations. Feel free to reach out to me via email at mrjunoo@snu.ac.kr.

Education

Ph.D. Candidate in Intelligence and Information, Seoul National University
Sep 2021 – Aug 2026 (Expected)
B.Sc. in Electrical and Computer Engineering, Seoul National University
Mar 2017 – Sep 2021

Publications

Preprint

Unlocking the Potential of Diffusion Language Models through Template Infilling

Preprint · Large Language Models · Diffusion Language Models
Junhoo Lee, Seungyeon Kim, Nojun Kwak

Unlike autoregressive LMs, diffusion LMs work better with template-then-fill prompting than with sequential prompting.

Main Conference

Deep Edge Filter †

NeurIPS 2025 · Learning Theory · Representation Analysis
Dongkwan Lee†, Junhoo Lee†, Nojun Kwak

Just as humans perceive edges (high-frequency components) as core information, deep features in neural networks exhibit the same tendency.

What's Making That Sound Right Now?

ICCV 2025 · Generative Models · Audio-Visual Localization
Hahyeon Choi, Junhoo Lee, Nojun Kwak

A video-centric audio-visual localization benchmark (AVATAR) that accounts for temporal dynamics.

Deep Support Vectors

NeurIPS 2024 · Learning Theory · Isometry
Junhoo Lee, Hyunho Lee, Kyomin Hwang, Nojun Kwak

Deep networks have support vectors, just like SVMs.

Any-Way Meta Learning

AAAI 2024 · Meta-Learning · Few-Shot Learning
Junhoo Lee, Yearim Kim, Hyunho Lee, Nojun Kwak

Breaking the fixed N-way constraint in meta-learning by exploiting label equivalence from episodic task sampling.

SHOT: Suppressing the Hessian along the Optimization Trajectory

NeurIPS 2023 · Meta-Learning · Optimization
Junhoo Lee, Jayeon Yoo, Nojun Kwak

The key to meta-learning adaptation is flattening the learning trajectory.

Workshop

Do Not Think About Pink Elephant! †

CVPR 2024 Workshop (Responsible Generative AI) · Generative Models · Safety
Kyomin Hwang†, Suyoung Kim†, Junhoo Lee†, Nojun Kwak

First discovery that negation does not work in large models: telling them not to generate something makes them generate it.

Coreset Selection for Object Detection

CVPR 2024 Workshop (Dataset Distillation) · Data Efficiency · Dataset Pruning
Hojun Lee, Suyoung Kim, Junhoo Lee, Jaeyoung Yoo, Nojun Kwak

Efficient coreset selection method specifically designed for object detection tasks.

Practical Dataset Distillation Based on Deep Support Vectors

ECCV 2024 Workshop (Dataset Distillation Challenge) · Data Efficiency · Dataset Distillation
Hyunho Lee, Junhoo Lee, Nojun Kwak

Applying the DeepKKT loss to dataset distillation when only partial data is accessible.

Journal

The Role of Teacher Calibration in Knowledge Distillation

IEEE Access · Knowledge Distillation · Calibration
Suyoung Kim, Seonguk Park, Junhoo Lee, Nojun Kwak

A teacher's calibration error strongly correlates with student accuracy: well-calibrated teachers transfer knowledge better.

Awards & Honors

2023
BK21 Future Innovation Talent Bronze Prize (KRW 1,000,000)
2023
BK21 Outstanding Research Talent Fellowship (KRW 3,500,000)
2022
Yulchon AI Star Scholarship (KRW 8,000,000)
2021
3rd Place, SNU FastMRI Challenge (out of 107 teams) (KRW 3,000,000)
2021
Kwanak Scholarship (KRW 4,000,000)
2017
National Science and Engineering Scholarship (KRW 3,000,000 per semester)