Genghan Zhang
PhD Student
Department of Computer Science, Stanford University
Office: 464 Gates
zgh23 [at] stanford [dot] edu
Curriculum Vitae
I am a computer science PhD student at Stanford University, advised by Professor Kunle Olukotun. I have also worked with Professor Fredrik Kjolstad on sparse tensor algebra compilation and with Professor Azalia Mirhoseini on efficient sparse large language models.
Research: My research interests lie in better programming models and systems for domain-specific architectures. I am also interested in optimizing GPU kernels for emerging applications, including sparse and recurrent neural networks.
I graduated from Tsinghua University in 2023 with a bachelor's degree in Electronic Engineering. At Tsinghua, I did research at the NICS-EFC Lab on efficient sparse tensor algebra for GPUs and at the IDEAL Lab on kernel architecture search.
Selected Publications
Adaptive Self-improvement LLM Agentic System for ML Library Development
Genghan Zhang, Weixin Liang, Olivia Hsu, and Kunle Olukotun
ICLR 2025 Workshop on Reasoning and Planning for Large Language Models; ICLR 2025 DL4C Workshop, May 2025
ICLR 2025 DL4C Workshop Best Paper Award (2/63)
Compilation of Modular and General Sparse Workspaces
Genghan Zhang, Olivia Hsu, and Fredrik Kjolstad
Conference on Programming Language Design and Implementation (PLDI), June 2024
Sgap: Towards Efficient Sparse Tensor Algebra Compilation for GPU
Genghan Zhang, Yuetong Zhao, Yanting Tao, Zhongming Yu, Guohao Dai, Sitao Huang, Yuan Wen, Pavlos Petoumenos, and Yu Wang
CCF Transactions on High Performance Computing, 2023
CATS: Context-Aware Thresholding for Sparsity in Large Language Models
Donghyun Lee, Jeyong Lee, Genghan Zhang, Mo Tiwari, and Azalia Mirhoseini
First Conference on Language Modeling, October 2024
Blogs
Adaptive Self-improvement LLM Agentic System for ML Library Development
Genghan Zhang, Weixin Liang, Olivia Hsu, and Kunle Olukotun
March 01, 2025