About Me

Hi, I'm Junhoo Lee. I am a PhD student at Seoul National University (MIPAL), advised by Prof. Nojun Kwak.

My primary research interests lie at the intersection of Diffusion Models, Large Language Models (LLMs), Machine Learning Theory, and Lifelong Learning. I am passionate about understanding the fundamental principles of generative models and applying them to solve complex problems.

I am always open to discussing new ideas and potential collaborations. Feel free to reach out to me via email at mrjunoo@snu.ac.kr.

Preprint

Unlocking the Potential of Diffusion Language Models through Template Infilling

Junhoo Lee, Seungyeon Kim, Nojun Kwak
Preprint · Diffusion Language Model

Unlike autoregressive LMs, diffusion LMs perform better with a template-then-fill approach than with sequential prompting.

Main Conference (First Author / Co-first †)

Deep Edge Filter †

Dongkwan Lee†, Junhoo Lee†, Nojun Kwak
NeurIPS 2025 · Deep Learning Theory

Just as humans perceive edges (high-frequency components) as the core of a scene, deep features in neural networks exhibit the same tendency.

What's Making That Sound Right Now?

Hahyeon Choi, Junhoo Lee, Nojun Kwak
ICCV 2025 · Audio-Visual Localization

A video-centric audio-visual localization benchmark (AVATAR) that accounts for temporal dynamics.

Deep Support Vectors

Junhoo Lee, Hyunho Lee, Kyomin Hwang, Nojun Kwak
NeurIPS 2024 · Deep Learning Theory

Deep networks have support vectors, just like SVMs.

Any-Way Meta Learning

Junhoo Lee, Yearim Kim, Hyunho Lee, Nojun Kwak
AAAI 2024 · Meta-Learning

Breaking the fixed N-way constraint in meta-learning by exploiting label equivalence that arises from episodic task sampling.

SHOT: Suppressing the Hessian along the Optimization Trajectory

Junhoo Lee, Jayeon Yoo, Nojun Kwak
NeurIPS 2023 · Meta-Learning

The key to meta-learning adaptation is flattening the optimization trajectory.

Workshop

Do Not Think About Pink Elephant! †

Kyomin Hwang†, Suyoung Kim†, Junhoo Lee†, Nojun Kwak
CVPR 2024 Workshop (Responsible Generative AI) · Text-to-Image Generation

First discovery that negation fails in large models: telling them not to generate something makes them generate it.

Coreset Selection for Object Detection

Hojun Lee, Suyoung Kim, Junhoo Lee, Jaeyoung Yoo, Nojun Kwak
CVPR 2024 Workshop (Dataset Distillation) · Coreset Selection

An efficient coreset selection method designed specifically for object detection tasks.

Practical Dataset Distillation Based on Deep Support Vectors

Hyunho Lee, Junhoo Lee, Nojun Kwak
ECCV 2024 Workshop (Dataset Distillation Challenge) · Dataset Distillation

Applying the DeepKKT loss to dataset distillation when only partial data is accessible.

Journal

The Role of Teacher Calibration in Knowledge Distillation

Suyoung Kim, Seonguk Park, Junhoo Lee, Nojun Kwak
IEEE Access · Knowledge Distillation

A teacher's calibration error strongly correlates with student accuracy: well-calibrated teachers transfer knowledge better.