Upcoming Events
CSE Faculty Candidate Seminar - Chenfeng Xu

Name: Chenfeng Xu, Ph.D. candidate at the University of California, Berkeley
Date: Thursday, February 27, 2025 at 11:00 am
Location: Scheller College of Business, Room 202
Link: The recording of this in-person seminar will be uploaded to CSE's MediaSpace
Coffee and snacks provided!
Title: Efficient Machine Learning with Structures: Inference, Training, and Application
Abstract: Generative models are transforming artificial intelligence, yet the computational demands for both inference and training are rapidly outpacing available resources. My research boosts efficiency in inference and training by harnessing structured approaches in computation and learning. Beyond improving efficiency, I translate my work into real-world intelligent systems—from mobile devices to robots—to enable innovative applications such as AI design and cross-embodiment transfer.
In this talk, I will present my research philosophy centered on finding structures, computing with structures, and learning with structures. First, I will explain how I capture and reorganize structural patterns in system implementations, optimizing inference-intensive diffusion models to better leverage high-performance hardware. My streaming systems for diffusion models fully exploit hardware capacity, enabling rapid image and video generation and demonstrating applications in visual design, robotics, and everyday AI utilities.
Next, I will reveal the structural patterns within algorithms and discuss how to reorganize them without altering the original system workload. My approach to restructuring large language model (LLM) weights offers a hardware-flexible alternative to conventional quantization and pruning techniques, providing a new solution for LLM compression. I will also demonstrate how this method enhances inference efficiency in vision-language models (VLMs) and vision-language-action systems for robotics.
Finally, inspired by principles from physics, I will illustrate how I convert unstructured optimization in diffusion models into a structured process—achieving up to 3× faster training. I will show how this technique generalizes across a wide range of applications and models. I will conclude by outlining my vision for discovering structured representations for data efficiency and scaling structured computation to bridge the gap from individual instances to large-scale AI systems.
Bio: Chenfeng Xu is a Ph.D. candidate at UC Berkeley, advised by Kurt Keutzer and Masayoshi Tomizuka. His research focuses on efficient machine learning and intelligent systems, accelerating computational AI applications in generative AI and embodied AI. His work has been recognized with notable-paper and oral-presentation distinctions at conferences such as ICLR, CoRL, and ICRA. Chenfeng was awarded the Qualcomm Innovation Fellowship, and his research has been supported by the Google Generative AI Grant, Meta, NVIDIA, Toyota Research Institute, Qualcomm, and Stellantis. His work has been widely adopted in industry, powering platforms such as Meta's efficient deep learning toolkit for mobile devices, Baidu's PaddlePaddle toolkit, and Livepeer's video infrastructure.
Event Details
Media Contact
Arlene Washington-Capers
arlene.washington@cc.gatech.edu