Vision, Learning, and Acceleration (VLA-Lab) Seminar Series | Finding Good, Cheap and Transferrable Sparsity
Sparsity is commonly produced by model compression (i.e., pruning), which eliminates unnecessary parameters. However, identifying high-quality sparse patterns without sacrificing performance is extremely expensive. Fortunately, to amortize this cost, our recent work demonstrates a kind of sparsity that transfers universally across diverse downstream tasks. It provides an efficient and effective "copy" of its dense counterpart, which is especially valuable for enormous pre-trained models. Beyond this, sparsity also serves as an important tool for modeling the underlying low dimensionality of NNs and for understanding their generalization, implicit regularization, expressivity, and robustness. In this talk, I will first discuss, practically, why sparsity is treasured and how to find it. Then I will present, theoretically, how to understand the universal transferability of our located sparse NNs. Finally, I will describe the prospects of exploiting sparsity in the future.
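For readers unfamiliar with pruning, the sketch below illustrates one common, generic way to obtain a sparsity mask: one-shot magnitude pruning. It is an illustrative assumption for context, not the specific method presented in the talk, and the function `magnitude_prune_mask` and its parameters are hypothetical names introduced here.

```python
# Minimal sketch of one-shot magnitude pruning (illustrative only).
import numpy as np

def magnitude_prune_mask(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Return a binary mask that keeps the largest-magnitude weights.

    `sparsity` is the fraction of parameters to remove (e.g., 0.9 keeps ~10%).
    """
    k = int(np.floor(sparsity * weights.size))
    if k == 0:
        return np.ones(weights.shape, dtype=bool)
    # Threshold at the k-th smallest magnitude; weights at or below it are pruned.
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    return np.abs(weights) > threshold

# Example: prune a random weight matrix to ~90% sparsity.
rng = np.random.default_rng(0)
W = rng.normal(size=(256, 256))
mask = magnitude_prune_mask(W, sparsity=0.9)
W_sparse = W * mask  # in principle, such a mask could be reused on another task's weights
print(f"remaining parameters: {mask.mean():.1%}")
```

The point of the sketch is only that pruning yields a binary mask; the expensive part discussed in the talk is finding masks whose quality holds up and transfers across tasks.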
Tianlong Chen is currently a fourth-year Ph.D. student in Electrical and Computer Engineering at the University of Texas at Austin, advised by Dr. Zhangyang (Atlas) Wang. Before coming to UT Austin, Tianlong received his Bachelor's degree from the University of Science and Technology of China. His research focuses on building efficient, accurate, robust, and automated machine learning systems. Recently, Tianlong has been working on extremely sparse neural networks with undamaged trainability, expressivity, and transferability. Tianlong has published more than 60 papers at top-tier venues (NeurIPS, ICML, ICLR, CVPR, ICCV, ECCV, etc.). He is a recipient of the 2021 IBM PhD Fellowship Award, the 2021 Graduate Dean's Prestigious Fellowship, and the 2022 Adobe PhD Fellowship Award. Tianlong has conducted research internships at Google, IBM Research, Facebook Research, Microsoft Research, and Walmart Technology.
Zhiqiang Shen (zhiqiangshen@ust.hk)