An Accelerated Stochastic Mirror Descent Method

Date created: 2024/09/23

Journal: Journal of the Operations Research Society of China, 2024, No. 3, pp. 549–571

Title: An Accelerated Stochastic Mirror Descent Method

Authors: Bo-Ou Jiang1,2 · Ya-Xiang Yuan1

Affiliations: 1 LSEC, ICMSEC, AMSS, Chinese Academy of Sciences, Beijing 100190, China

2 Department of Mathematics, University of Chinese Academy of Sciences, Beijing 100049, China

Abstract: Driven by large-scale optimization problems arising in machine learning, the development of stochastic optimization methods has grown rapidly. Numerous methods have been built on the vanilla stochastic gradient descent method. However, for most algorithms, the convergence rate in the stochastic setting does not simply match that in the deterministic setting. A better understanding of the gap between deterministic and stochastic optimization is the main goal of this paper. Specifically, we are interested in Nesterov acceleration of gradient-based approaches. In our study, we focus on the acceleration of a stochastic mirror descent method with an implicit regularization property. Assuming that the problem objective is smooth and convex or strongly convex, our analysis prescribes the method parameters that ensure fast convergence of the estimation error and satisfactory numerical performance.
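To make the flavor of the method concrete, below is a minimal sketch of stochastic mirror descent with Nesterov-style acceleration, using the negative-entropy mirror map on the probability simplex. It is an illustrative template only: the coupling sequence, the step sizes, and the toy least-squares objective are assumptions made for this sketch, not the parameter choices prescribed in the paper.

```python
# Minimal sketch (not the authors' exact scheme) of accelerated
# stochastic mirror descent with the negative-entropy mirror map.
# All constants below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.normal(size=(n, d))
x_true = rng.dirichlet(np.ones(d))          # ground truth on the simplex
b = A @ x_true + 0.01 * rng.normal(size=n)  # noisy observations

def stoch_grad(x, batch=8):
    """Minibatch gradient of f(x) = (1/2n) * ||Ax - b||^2."""
    idx = rng.integers(0, n, size=batch)
    Ai = A[idx]
    return Ai.T @ (Ai @ x - b[idx]) / batch

def md_step(x, g, eta):
    """Entropic mirror descent step: argmin over the simplex of
    <g, y> + (1/eta) * KL(y || x), i.e. a multiplicative update."""
    y = x * np.exp(-eta * g)
    return y / y.sum()

x = z = np.full(d, 1.0 / d)   # start at the uniform distribution
for k in range(1, 2001):
    tau = 2.0 / (k + 1)               # Nesterov-style coupling weight
    w = tau * z + (1 - tau) * x       # extrapolated query point
    g = stoch_grad(w)
    z = md_step(z, g, eta=0.5 * tau)  # mirror step on the z-sequence
    x = tau * z + (1 - tau) * x       # convex combination stays feasible

print("estimation error:", np.linalg.norm(x - x_true))
```

With the negative-entropy mirror map, the mirror step reduces to a multiplicative-weights update, so the iterates remain on the simplex without an explicit projection.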

Keywords: Large-scale optimization · Variance reduction · Mirror descent · Acceleration · Independent sampling · Importance sampling

Full-text link: https://doi.org/10.1007/s40305-023-00492-2
