Department of Industrial Engineering & Decision Analytics [Joint IEDA/ISOM] Seminar - A Descent-Oriented Subgradient Method for Nonsmooth Optimization
In nonsmooth optimization, the absence of gradient continuity presents a fundamental challenge to designing convergent descent methods based solely on first-order information. Classical methods, such as bundle techniques and gradient sampling, attempt to overcome this by aggregating subgradients from a local neighborhood; however, these approaches are often computationally burdensome or lack deterministic guarantees. In this talk, I will present a unifying principle behind these methods and introduce a general framework for provably convergent descent algorithms that require only a first-order oracle. Central to this framework is a simple yet powerful technique called subgradient regularization, which generates stable descent directions for a class of marginal functions using information at only a single point, effectively eliminating the need for neighborhood sampling. Notably, when applied to the composition of a convex function with a smooth map, our approach recovers the classical prox-linear method and offers a new dual perspective on it. I will conclude with numerical experiments showcasing the method’s effectiveness on challenging nonsmooth problems, such as the minimization of Nesterov’s Chebyshev-Rosenbrock function.
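For readers who want a concrete sense of why this benchmark is difficult, the sketch below applies a plain subgradient method with diminishing steps (a standard baseline, not the algorithm presented in the talk) to the first nonsmooth variant of Nesterov’s Chebyshev-Rosenbrock function, f(x) = (1/4)(x_1 − 1)^2 + Σ_{i=1}^{n−1} |x_{i+1} − 2x_i^2 + 1|, whose minimizer is the all-ones vector. The dimension, starting point, step-size constant, and iteration budget are arbitrary illustrative choices.

```python
import numpy as np

# First nonsmooth variant of Nesterov's Chebyshev-Rosenbrock function:
#   f(x) = (1/4)(x_1 - 1)^2 + sum_{i=1}^{n-1} |x_{i+1} - 2 x_i^2 + 1|,
# minimized at x = (1, ..., 1) with optimal value 0.
def f(x):
    return 0.25 * (x[0] - 1.0) ** 2 + np.sum(np.abs(x[1:] - 2.0 * x[:-1] ** 2 + 1.0))

def subgrad(x):
    # One valid subgradient: gradient of the smooth quadratic term, plus
    # sign(r_i) times the gradient of each residual r_i inside |.|
    # (taking sign(0) = 0 at the kinks).
    r = x[1:] - 2.0 * x[:-1] ** 2 + 1.0
    s = np.sign(r)
    g = np.zeros_like(x)
    g[0] = 0.5 * (x[0] - 1.0)
    g[:-1] -= 4.0 * x[:-1] * s  # chain rule through the -2 x_i^2 term
    g[1:] += s                  # derivative with respect to x_{i+1}
    return g

x = np.zeros(5)  # arbitrary starting point and dimension
for k in range(1, 20001):
    x = x - (0.01 / np.sqrt(k)) * subgrad(x)  # diminishing step size

print(f"f(x) = {f(x):.4f}  (optimal value: 0.0)")
```

Because the negative subgradient at a single point need not be a descent direction for a nonsmooth function, iterates of this baseline tend to zigzag along the kinks of the objective; producing stable descent directions from single-point information is precisely the gap the subgradient-regularization framework described above is meant to address.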
Hanyang Li is a PhD student in the Department of Industrial Engineering and Operations Research at UC Berkeley. His research focuses on theory and algorithms for nonsmooth optimization. He received his Bachelor’s degree in Mathematics from the University of Science and Technology of China in 2021 and a Master’s degree from the University of Minnesota in 2023. His work has been published in SIAM Journal on Optimization and Mathematics of Operations Research.