Summary

Last updated: 2022-08-29

Hi! I'm a second-year PhD student in CSE at the University of Michigan, Ann Arbor. My research interest lies at the intersection of software systems and deep learning, with a recent focus on energy consumption. I lead the ML Energy initiative. I am fortunate to have Professor Mosharaf Chowdhury as my advisor.

Find the PDF version of my CV here.

Publications

Zeus: Understanding and Optimizing GPU Energy Consumption of DNN Training

Jie You*, Jae-Won Chung*, Mosharaf Chowdhury (* equal contribution)
20th USENIX Symposium on Networked Systems Design and Implementation (NSDI), 2023
Website, DOI, arXiv, PDF, YouTube, Slides

ShadowTutor: Distributed Partial Distillation for Mobile Video DNN Inference

Jae-Won Chung, Jae-Yun Kim, and Soo-Mook Moon
49th International Conference on Parallel Processing (ICPP), Edmonton, Canada, 2020 (acceptance rate 29% = 78/269)
DOI, arXiv, PDF, YouTube, Slides

Experience

Zeus: Energy-Efficient DNN training on GPUs

Sep 2021 - Apr 2022

• Promoted energy as a first-class resource in DNN training on GPUs.
• Measured and optimized the energy consumption of GPU DNN training.

Crane: A GPU Cluster Resource Manager for Elastic AutoML

Mar 2020 - Jul 2021

• Worked with Professor Byung-Gon Chun.
• Developed Crane, a GPU cluster resource manager for elastic AutoML workloads.
• Built Crane's Kubernetes backend and implemented efficient AutoML scheduling for GPU clusters.
• Took charge of bootstrapping and mentoring newer Crane team members (interns and graduate students).

ShadowTutor: Server-client collaborative DNN inference

Dec 2019 - Jun 2020

• Worked with Professor Soo-Mook Moon.
• Designed a novel server-client collaborative video DNN inference method that drastically reduces network traffic via intermittent knowledge distillation.
• Implemented with PyTorch & OpenMPI, evaluated using an NVIDIA Jetson Nano board as a client.

Meta-learning, Few-shot Classification

Jun 2019 - Dec 2019

• Worked with Professor Kyoung Mu Lee.
• Explored better meta-initialization points for Model-Agnostic Meta-Learning (MAML) using an LSTM-based neural memory.
• Augmented the feature maps of MAML with task-aware class embeddings generated with a convex program (DPP).

Deep Learning for Quantitative Susceptibility Mapping (QSM)

Jun 2019 - Aug 2019

• Worked with Professor Jongho Lee.
• Designed a U-Net variant and trained it on augmented MRI data.
• Participated in the QSM challenge held by the 5th International Workshop on MRI Phase Contrast and QSM.

Honors and Awards

Kwanjeong Overseas Scholarship

Jul 2021

Kwanjeong Educational Foundation

Four years, $25,000 per year

Kwanjeong Undergraduate Scholarship

Mar 2019

Kwanjeong Educational Foundation

Two years, $10,000 per year

Extracurricular Activity

Member

Dec 2018 - Present

A free research group on all domains of deep learning

• Gained extensive experience in computer vision and meta-learning, and attended talks on computer vision, natural language processing, reinforcement learning, and speech recognition.
• Gave a talk titled "Memory plus Meta-Learning".

Language Coordinator

Oct 2018 - Mar 2021

An official Coursera community that translates Coursera lecture subtitles

• Served as Language Coordinator, a selected position that reviews and approves work by other translators.
• Created Korean subtitles for Coursera lectures initially offered only in English, focusing on courses related to machine learning.

Teaching

Operating Systems

Spring 2021

Main TA
Lectured on the Linux kernel, managed term projects, and led group design reviews.

Computer Organization (Undergraduate architecture)

Fall 2020

Peer tutor
Provided 30 hours of online lectures and received the Best Tutor Award.

Skills & Proficiency

Python, PyTorch

Rust, CUDA, Verilog

C++, MATLAB

Go, JavaScript

Open Source Projects

Pegasus: A Lightweight Multi-Node Parametrized Command Runner

"I really don't want to run all these experiment commands on 20 nodes manually."

Reason: A Shell for Research Papers

"Did I ever read this paper? Which papers have the word 'training' in their titles?"