Speaker: Yunwen Lei, Assistant Professor, The University of Hong Kong
Time: Tuesday, December 24, 2024, 14:00-15:00
Venue: Tencent Meeting, ID 576 320 840
Abstract: Recent developments in stochastic optimization often suggest biased gradient estimators to improve robustness, communication efficiency, or computational speed. Representative biased stochastic gradient methods (BSGMs) include zeroth-order stochastic gradient descent (SGD), Clipped-SGD, and SGD with delayed gradients. In this talk, we present the first framework for studying the stability and generalization of BSGMs for convex and smooth problems. We apply our general result to develop the first stability bound for zeroth-order SGD with reasonable step-size sequences, and the first stability bound for Clipped-SGD. While our stability analysis is developed for general BSGMs, the resulting stability bounds for both zeroth-order SGD and Clipped-SGD match those of SGD under appropriate smoothing/clipping parameters.
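For readers unfamiliar with the two biased estimators named in the abstract, the following is a minimal NumPy sketch, not taken from the talk: a two-point zeroth-order gradient estimator and norm clipping, applied to a toy least-squares problem. The function names, the step sizes, and the smoothing/clipping parameters mu and tau are illustrative assumptions.

```python
import numpy as np

def zeroth_order_grad(f, w, mu=1e-3, rng=None):
    """Two-point zeroth-order estimator built from function values only.
    It is a biased surrogate for the true gradient (unbiased only for
    the mu-smoothed version of f); mu is an assumed smoothing parameter."""
    rng = rng or np.random.default_rng()
    u = rng.standard_normal(w.shape)           # random search direction
    return (f(w + mu * u) - f(w)) / mu * u

def clipped_grad(g, tau=1.0):
    """Rescale g to norm tau when it exceeds tau. Clipping introduces
    bias but adds robustness; tau is an assumed clipping threshold."""
    norm = np.linalg.norm(g)
    return g if norm <= tau else (tau / norm) * g

# Toy usage: minimize f(w) = 0.5 * ||Aw - b||^2 (convex and smooth).
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
f = lambda w: 0.5 * np.sum((A @ w - b) ** 2)
grad = lambda w: A.T @ (A @ w - b)

w = np.zeros(5)
for t in range(1, 201):
    eta = 0.01 / np.sqrt(t)                    # assumed decaying step size
    g = clipped_grad(grad(w), tau=5.0)         # or zeroth_order_grad(f, w)
    w -= eta * g
print(f"final objective: {f(w):.4f}")
```

In both cases the update direction is a deliberately biased estimate of the gradient, which is exactly the setting whose stability and generalization the talk analyzes.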
Speaker Bio: Yunwen Lei is currently an Assistant Professor in the Department of Mathematics at The University of Hong Kong. His main research interests include machine learning, statistical learning theory, and stochastic optimization. He has published papers in prestigious journals and conference proceedings, including IEEE TIT, JMLR, COLT, ICLR, ICML, and NeurIPS.