Speaker: Jin Zhang (张进), Associate Professor, Department of Mathematics, Southern University of Science and Technology / National Center for Applied Mathematics Shenzhen
Time: 2:00-3:00 p.m., Friday, October 25, 2024
Venue: Room 513, Computer Science Building, Jiulonghu Campus, Southeast University
Abstract: Gradient methods have become mainstream techniques for Bi-Level Optimization (BLO) in learning fields. The validity of existing works relies heavily on a restrictive Lower-Level Strong Convexity (LLSC) condition, on solving a series of approximation subproblems with high accuracy, or both. In this work, by averaging the upper- and lower-level objectives, we propose a single-loop Bi-level Averaged Method of Multipliers (sl-BAMM) for BLO that is simple yet efficient for large-scale problems and dispenses with the restrictive LLSC condition. We further provide a non-asymptotic convergence analysis of sl-BAMM towards KKT stationary points; the comparative advantage of our analysis lies in the absence of the strong gradient boundedness assumption that other analyses require. Our theory therefore safely captures a wider variety of applications in deep learning, especially those where the upper-level objective is quadratic w.r.t. the lower-level variable. Experimental results demonstrate the superiority of our method.
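As background for the abstract, the following is a minimal LaTeX sketch of the BLO problem class and of the "averaging" idea mentioned above. The aggregated objective phi_c and the weight c are illustrative assumptions in the spirit of "averaging the upper and lower level objectives"; they are not presented as the exact sl-BAMM construction.

% A minimal sketch; assumes amsmath. The aggregation phi_c is illustrative.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
The bi-level program: the upper-level objective $F$ is minimized subject to
$y$ solving the lower-level problem parameterized by $x$,
\begin{align*}
  \min_{x \in \mathbb{R}^n} \; & F(x, y) \\
  \text{s.t.} \; & y \in \operatorname*{arg\,min}_{y' \in \mathbb{R}^m} f(x, y').
\end{align*}
% One way to read the "averaging" idea (assumed form, for illustration):
% replace the lower-level objective by a weighted combination of both levels,
\[
  \phi_c(x, y) = c\, F(x, y) + f(x, y), \qquad c > 0,
\]
so that minimizers of $\phi_c(x, \cdot)$ select lower-level solutions that
also favor the upper level. This selection is what removes the need for
lower-level strong convexity (LLSC), which is otherwise invoked to pin down
a unique lower-level solution.
\end{document}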
Speaker Bio: Jin Zhang is an associate professor in the Department of Mathematics at Southern University of Science and Technology and at the National Center for Applied Mathematics Shenzhen. He received his Ph.D. from the University of Victoria, Canada, and works on optimization theory and its applications. His representative results have appeared in influential optimization, computational mathematics, and machine learning venues, including Math. Program., SIAM J. Optim., Math. Oper. Res., SIAM J. Numer. Anal., J. Mach. Learn. Res., and IEEE TPAMI, as well as ICML, NeurIPS, and ICLR. His research has been recognized with the Youth Science and Technology Award of the Operations Research Society of China and the Guangdong Youth Science and Technology Innovation Award. He has served as principal investigator of NSFC projects including the Excellent Young Scientists Fund, a Tianyuan Key Program project, and General Program projects, as well as a Guangdong Natural Science Funds for Distinguished Young Scholars project, a Shenzhen Science and Technology Innovation Talent Program (Outstanding Youth) project, and a subproject of the MOST National Key R&D Program special program on "Mathematics and Applied Mathematics".