Summary
I'm a fourth-year PhD candidate in CSE at the University of Michigan. I build efficient software systems for deep learning, with a recent focus on managing not only time but also energy.
I view energy as a new first-class systems resource. I am particularly interested in understanding how energy differs from other resources and in building software systems that reduce energy consumption in ways orthogonal to hardware advances.
I lead the ML.ENERGY initiative. I am fortunate to be advised by Professor Mosharaf Chowdhury and to be part of SymbioticLab.
Publications
Perseus: Reducing Energy Bloat in Large Model Training
SOSP, 2024 (Acceptance rate = 17.34%)
Toward Cross-Layer Energy Optimizations in AI Systems
DOE ASCR Energy-Efficient Computing for Science Workshop, 2024
Andes: Defining and Enhancing Quality-of-Experience in LLM-Based Text Streaming Services
Preprint, 2024
Chasing Low-Carbon Electricity for Practical and Sustainable DNN Training
ICLR Workshop (Tackling Climate Change with Machine Learning), 2023
Zeus: Understanding and Optimizing GPU Energy Consumption of DNN Training
USENIX NSDI, 2023 (Acceptance rate = 18.38%)
ShadowTutor: Distributed Partial Distillation for Mobile Video DNN Inference
ACM ICPP, 2020 (Acceptance rate = 28.99%)
Experience
Graduate Student Research Assistant
Advisor: Prof. Mosharaf Chowdhury
Building energy-efficient software systems for machine learning. I created Zeus, the first energy optimization system for DNN training on GPUs and now a PyTorch ecosystem project. Zeus serves as the bedrock for Chase, a carbon-efficient DNN training solution; the ML.ENERGY Leaderboard, the first energy benchmark for LLM inference; the ML.ENERGY Colosseum, an interactive service that lets users compare LLM responses in terms of both quality and energy consumption; and Perseus, a large-model training energy optimizer that reduces per-iteration energy consumption by up to 30% without slowing down training.
Keywords:
- MLSys
- Energy
- LLM
- Training
- Inference
- Open-Source
Research Intern
Advisor: Prof. Byung-Gon Chun
Developed Crane, a GPU cluster manager for elastic AutoML jobs. Wrote components for automatic cluster bootstrapping on Docker Swarm and enabled full operation on top of Kubernetes. Worked on efficient AutoML scheduling policies on GPU clusters.
Keywords:
- MLSys
- AutoML
- Training
- Cluster Management
- Scheduling
- Open-Source
Research Intern
Advisor: Prof. Soo-Mook Moon
Created ShadowTutor, a server-client collaborative DNN inference system that distills knowledge from a server-side large DNN to a small DNN on the client in an online fashion.
Keywords:
- MLSys
- Inference
- Knowledge Distillation
Research Intern
Advisor: Prof. Kyoung Mu Lee
Worked on finding better meta-initialization points for Model-Agnostic Meta-Learning (MAML) using LSTM-based neural memory modules. Also worked on embedding images of the same class into a single class embedding vector and augmenting MAML with self-attention scores derived from class embeddings.
Keywords:
- ML
- Computer Vision
- Meta-Learning
- Few-Shot Classification
- Optimization
Research Intern
Advisor: Prof. Jongho Lee
Designed and implemented CAD-QSMNet, a full deep learning pipeline for Quantitative Susceptibility Mapping (QSM) for brain MRI images, including a new U-Net variant model.
Keywords:
- ML
- Computer Vision
- Medical Imaging
- Data Engineering
Open-Source Projects
Talks
Education
- PhD, Computer Science and Engineering (in progress), University of Michigan, Sep 2021 - Present
- MS, Computer Science and Engineering, University of Michigan, Sep 2021 - Apr 2023
- BS, Electrical and Computer Engineering (Summa cum laude), Seoul National University, South Korea, Mar 2015 - Aug 2021
Proficiency
Languages
- Python
- Rust
- Go, C++, CUDA, Verilog
- Zig, JavaScript
Tools and Frameworks
- FastAPI, Mkdocs, Pandas, NumPy
- PyTorch, Kubernetes, LaTeX
Others
- Command line
- Neovim
- GitHub
- Open-Source
- Documentation
Honors & Awards
- Second Best Solution in Carbon Hack '22: $25,000 prize with Chase.
- Kwanjeong Overseas Scholarship: $25,000 awarded.
- Best Tutor Award: SNU computer architecture, Fall 2020.
- Kwanjeong Undergraduate Scholarship: $20,000 awarded over two years.
Teaching
- Undergrad Operating Systems, Spring 2021: As lead TA, provided Linux kernel lectures, four Linux-based term projects, and team design reviews.
- Undergrad Computer Architecture, Fall 2020: Gave 30 hours of online lectures as peer tutor. Best Tutor Award!
Community Service
- Organizer, Systems Reading Group in UMich CSE, Sep 2022 - Present
- Language Coordinator, Coursera Global Translator Community, Oct 2018 - Mar 2021: Translated ML lectures into Korean and coordinated translator interactions.
English Proficiency
- TOEFL 120/120 (2020)
- GRE 167/170/4.5 (2018)
Interests
- Software Systems
- Deep Learning
- Fingerstyle Guitar