I am an Assistant Professor in the Department of Electrical Engineering and Computer Science at DGIST. Previously, I was a postdoctoral researcher at the Siebel School of Computing and Data Science at the University of Illinois Urbana-Champaign, where I worked with Prof. Hanghang Tong. I received my Ph.D. in Computer Science and Engineering from Seoul National University, where I was advised by Prof. U Kang. I also received my B.S. in Mechanical and Aerospace Engineering from Seoul National University, with a double major in Computer Science and Engineering. My research focuses on data mining, query-aware data systems, streaming and adaptive computation, and applied AI. My work has been recognized with best paper awards, including the Best Research Paper Award at KDD 2021 and a Best Paper Award Honorable Mention at ICDE 2022.

Research Interests

Data Mining  ·  Query-Aware Data Systems  ·  Streaming & Adaptive Computation  ·  Applied AI

Selected Publications

Data Mining
DuET: Dual-View Tensor-to-Topology Spectral Adapter for Enhancing Sparse Tensor Factorization
Jun-Gi Jang, Jingrui He, Andrew J Margenot, Hanghang Tong
KDD 2026, Jeju, South Korea  ·  (To Appear)

Data Mining
ProgNet: Program-Grounded Evidence Composition for Interpretable Graph Classification
Minseok Jeon, Seunghyun Park, Jun-Gi Jang (corresponding author)
KDD 2026, Jeju, South Korea  ·  (To Appear)

Query-Aware Data Systems
TUCKET: A Tensor Time Series Data Structure for Efficient and Accurate Factor Analysis over Time Ranges
Ruizhong Qiu*, Jun-Gi Jang*, Xiao Lin, Lihui Liu, Hanghang Tong (*equal contribution)
VLDB 2025, London, United Kingdom  ·  [paper]  ·  [code]

Streaming & Adaptive Computation
Fast and Accurate Dual-Way Streaming PARAFAC2 for Irregular Tensors - Algorithm and Application
Jun-Gi Jang, Jeongyoung Lee, Yong-chan Park, and U Kang
KDD 2023, Long Beach, CA, USA  ·  [paper]  ·  [code]

Large-Scale Data Mining
DPar2: Fast and Scalable PARAFAC2 Decomposition for Irregular Dense Tensors
Jun-Gi Jang and U Kang
ICDE 2022, Virtual Event  ·  Best Paper Award Honorable Mention  ·  [paper]  ·  [homepage & code]

Query-Aware Data Systems
Fast and Memory-Efficient Tucker Decomposition for Answering Diverse Time Range Queries
Jun-Gi Jang and U Kang
KDD 2021, Virtual Event  ·  Best Research Paper Award  ·  [paper]  ·  [homepage & code]

Applied AI
Accurate Open-set Recognition for Memory Workload
Jun-Gi Jang, Sooyeon Shim, Vladimir Egay, Jeeyong Lee, Jongmin Park, Suhyun Chae, and U Kang
TKDD 2023  ·  [paper]  ·  [code]

Positions

Sep. 2025 – Present
Assistant Professor, Electrical Engineering and Computer Science, DGIST
Aug. 2023 – Aug. 2025
Postdoctoral Researcher, Siebel School of Computing and Data Science, UIUC
Mar. 2023 – Aug. 2023
Postdoctoral Researcher, Computer Science and Engineering, SNU
Jul. 2020 – Aug. 2020
Research Intern, Hyperconnect

Education

Mar. 2017 – Feb. 2023
Seoul National University
Ph.D. in Computer Science and Engineering
Mar. 2010 – Feb. 2017
Seoul National University
B.S. in Mechanical and Aerospace Engineering
(Double Major: Computer Science and Engineering)

Awards & Fellowships

  • Postdoctoral Fellowship Program, National Research Foundation of Korea. 2023 – 2024
  • Future Gauss Lecture Program, Gauss Labs. Feb. 2022
  • Naver Ph.D. Fellowship, Naver. Dec. 2021
  • Qualcomm Innovation Fellowship, Qualcomm. Nov. 2021
  • Yulchon AI Star Fellowship, Yulchon Foundation, Nongshim Group. Sep. 2021

Professional Service

Area Chair: KDD 2026

Program Committee / Reviewer: KDD, WWW, CIKM, SDM, AAAI, TKDE, KAIS, TPDS, and others

Workshop Organizing Committee: Interplay Between Classical Tensor Methods and Foundation Models, 2026

Full Publication List

DuET: Dual-View Tensor-to-Topology Spectral Adapter for Enhancing Sparse Tensor Factorization
Jun-Gi Jang, Jingrui He, Andrew J Margenot, Hanghang Tong
KDD, 2026, Jeju, South Korea. (To Appear)
ProgNet: Program-Grounded Evidence Composition for Interpretable Graph Classification
Minseok Jeon, Seunghyun Park, Jun-Gi Jang
KDD, 2026, Jeju, South Korea. (To Appear)
Fast and Accurate Domain Adaptation for Irregular and Regular Tensor Decomposition
Junghun Kim, Ka Hyun Park, Jun-Gi Jang, and U Kang
IEEE Transactions on Knowledge and Data Engineering (TKDE), 2026.
Improving Group Fairness in Tensor Completion via Imbalance Mitigating Entity Augmentation
Dawon Ahn*, Jun-Gi Jang*, Evangelos E. Papalexakis (*equal contribution)
PAKDD, 2025, Sydney, Australia.
TUCKET: A Tensor Time Series Data Structure for Efficient and Accurate Factor Analysis over Time Ranges
Ruizhong Qiu*, Jun-Gi Jang*, Xiao Lin, Lihui Liu, Hanghang Tong (*equal contribution)
VLDB, 2025, London, United Kingdom.
Compact Lossy Compression of Tensors via Neural Tensor-Train Decomposition
Taehyung Kwon, Jihoon Ko, Jinhong Jung, Jun-Gi Jang, and Kijung Shin
Knowledge and Information Systems (KAIS), Springer, 2024.
Fast and Accurate PARAFAC2 Decomposition for Time Range Queries on Irregular Tensors
Jun-Gi Jang, Yong-chan Park, and U Kang
CIKM, 2024, Boise, Idaho, USA.
Fast and Accurate Domain Adaptation for Irregular Tensor Decomposition
Junghun Kim, Ka Hyun Park, Jun-Gi Jang, and U Kang
KDD, 2024, Barcelona, Spain.
Compact Decomposition of Irregular Tensors for Data Compression: From Sparse to Dense to High-Order Tensors
Taehyung Kwon, Jihoon Ko, Jinhong Jung, Jun-Gi Jang, and Kijung Shin
KDD, 2024, Barcelona, Spain.
Fast and Accurate Dual-Way Streaming PARAFAC2 for Irregular Tensors - Algorithm and Application
Jun-Gi Jang, Jeongyoung Lee, Yong-chan Park, and U Kang
KDD, 2023, Long Beach, CA, USA.
Accurate Open-set Recognition for Memory Workload
Jun-Gi Jang, Sooyeon Shim, Vladimir Egay, Jeeyong Lee, Jongmin Park, Suhyun Chae, and U Kang
ACM Transactions on Knowledge Discovery from Data (TKDD), 2023.
Fast and accurate interpretation of workload classification model
Sooyeon Shim, Doyeon Kim, Jun-Gi Jang, Suhyun Chae, Jeeyong Lee, and U Kang
PLOS ONE, 2023.
Accurate Bundle Matching and Generation via Multitask Learning with Partially Shared Parameters
Hyunsik Jeon, Jun-Gi Jang, Taehun Kim, and U Kang
PLOS ONE, 2023.
Falcon: lightweight and accurate convolution based on depthwise separable convolution
Jun-Gi Jang*, Chun Quan*, Hyun Dong Lee, and U Kang (*equal contribution)
Knowledge and Information Systems (KAIS), Springer, 2023.
Accurate PARAFAC2 Decomposition for Temporal Irregular Tensors with Missing Values
Jun-Gi Jang, Jeongyoung Lee, Jiwon Park, and U Kang
IEEE BigData, 2022, Osaka, Japan.
DPar2: Fast and Scalable PARAFAC2 Decomposition for Irregular Dense Tensors
Jun-Gi Jang and U Kang
ICDE, 2022, Virtual Event.   Best Paper Award, Honorable Mention
Static and Streaming Tucker Decomposition for Dense Tensors
Jun-Gi Jang and U Kang
ACM Transactions on Knowledge Discovery from Data (TKDD), Oct., 2022. (Extended version of C2)
Large-scale Tucker Tensor Factorization for Sparse and Accurate Decomposition
Jun-Gi Jang*, Moonjeong Park*, Jongwuk Lee, and Lee Sael (*equal contribution)
The Journal of Supercomputing, May, 2022. (Extended version of C3)
Finding Key Structures in MMORPG Graph with Hierarchical Graph Summarization
Jun-Gi Jang, Chaeheum Park, Changwon Jang, Geonsoo Kim, and U Kang
ACM Transactions on Knowledge Discovery from Data (TKDD), Feb., 2022.
Fast and Memory-Efficient Tucker Decomposition for Answering Diverse Time Range Queries
Jun-Gi Jang and U Kang
KDD, 2021, Virtual Event.   Best Paper Award, Best Research Paper
Fast and Accurate Partial Fourier Transform for Time Series Data
Yong-chan Park, Jun-Gi Jang, and U Kang
KDD, 2021, Virtual Event.
VeST: Very Sparse Tucker Factorization of Large-Scale Tensors
Moonjeong Park*, Jun-Gi Jang* and Lee Sael (*equal contribution)
BigComp, 2021, Jeju Island, Korea.   Best Paper Award, 1st Place
Time-aware tensor decomposition for sparse tensors
Dawon Ahn, Jun-Gi Jang, and U Kang
Machine Learning, Sep. 2021.
D-Tucker: Fast and Memory-Efficient Tucker Decomposition for Dense Tensors
Jun-Gi Jang and U Kang
ICDE, 2020, Dallas, Texas, USA.
S3CMTF: Fast, accurate, and scalable method for incomplete coupled matrix-tensor factorization
Dongjin Choi, Jun-Gi Jang, and U Kang
PLOS ONE, 2019.
High-Performance Tucker Factorization on Heterogeneous Platforms
Sejoon Oh, Namyong Park, Jun-Gi Jang, Lee Sael, and U Kang
IEEE Transactions on Parallel and Distributed Systems (TPDS), Apr. 2019.
Zoom-SVD: Fast and Memory Efficient Method for Extracting Key Patterns in an Arbitrary Time Range
Jun-Gi Jang, Dongjin Choi, Jinhong Jung, and U Kang
CIKM, 2018, Turin, Italy.