About
Hi, I’m a graduate student at KAIST, specializing in natural language processing (NLP) and large language models (LLMs), advised by Prof. Minjoon Seo. I am also a research intern at Microsoft Research, where I work with Yeyun Gong, Lei Ji, and Qi Chen. I was honored with the Stars of Tomorrow award at Microsoft Research Asia.
Previously, I worked as a research scientist at Samsung Research and ESTsoft, where I contributed to applied LLM systems for Galaxy AI and conducted research on foundation models, focusing on LLM pre-/post-training, information retrieval, and semantic parsing.
My research interests lie in making large language models more efficient and practical:
- Efficient training and inference methods for LLMs, including cost-efficient inference (NAACL 2025), distillation (ICML 2025, NAACL 2025), and data mixture optimization (DynamixSFT).
- Real-world impact, including agent adaptability (NAACL 2025), scientific discovery (GenBio 2025, BioNLP 2025), and real-time information access (NAACL 2024).
Work Experience
Selected Publications
Overcoming Vocabulary Mismatch: Vocabulary-agnostic Teacher Guided Language Modeling
ICML 2025
The BiGGen Bench: A Principled Benchmark for Fine-grained Evaluation of Language Models with Language Models
NAACL 2025 Best Paper
InstructIR: A Benchmark for Instruction Following of Information Retrieval Models
ACL 2024 KnowledgeNLP workshop