Journal: Journal of the Operations Research Society of China, 2024, Issue 3, pp. 549–571
Title: An Accelerated Stochastic Mirror Descent Method
Authors: Bo-Ou Jiang (1,2) · Ya-Xiang Yuan (1)
Affiliations: 1. LSEC, ICMSEC, AMSS, Chinese Academy of Sciences, Beijing 100190, China
2. Department of Mathematics, University of Chinese Academy of Sciences, Beijing 100049, China
Abstract: Driven by large-scale optimization problems arising in machine learning, the development of stochastic optimization methods has grown rapidly. Numerous methods have been built on the vanilla stochastic gradient descent method; however, for most of them, the convergence rate in the stochastic setting does not simply match the rate in the deterministic setting. A better understanding of this gap between deterministic and stochastic optimization is the main goal of this paper. Specifically, we are interested in Nesterov acceleration of gradient-based approaches, and we focus on accelerating a stochastic mirror descent method with an implicit regularization property. Assuming that the objective is smooth and either convex or strongly convex, our analysis prescribes the method parameters that ensure fast convergence of the estimation error and satisfactory numerical performance.
Keywords: Large-scale optimization · Variance reduction · Mirror descent · Acceleration · Independent sampling · Importance sampling
Full-text link: https://doi.org/10.1007/s40305-023-00492-2
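
For readers unfamiliar with the method family named in the abstract, the Python sketch below illustrates a generic linear-coupling-style accelerated stochastic mirror descent with the entropic mirror map on the probability simplex. It is only a minimal sketch: the schedule alpha_k = 2/(k+2), step_k = (k+1)/(2L), the toy least-squares problem, and all function names are illustrative assumptions, not the parameter choices, regularization mechanism, or sampling strategies (independent/importance sampling) analyzed in the paper; consult the full text for those.

import numpy as np

def entropic_mirror_step(z, g, step):
    """Mirror descent step on the probability simplex with the
    negative-entropy mirror map: argmin_x step*<g, x> + KL(x || z),
    whose closed form is x_i proportional to z_i * exp(-step * g_i)."""
    w = z * np.exp(-step * (g - g.min()))  # shift by g.min() for stability
    return w / w.sum()

def accelerated_smd(stoch_grad, x0, n_iters, L):
    """Linear-coupling-style accelerated stochastic mirror descent.
    stoch_grad(y) must return an unbiased gradient estimate at y;
    L is a smoothness constant. The schedule below is the classical
    deterministic one, shown only for illustration."""
    x = z = np.asarray(x0, dtype=float)
    for k in range(n_iters):
        alpha = 2.0 / (k + 2)              # coupling weight
        y = (1 - alpha) * x + alpha * z    # point where the gradient is queried
        g = stoch_grad(y)
        z = entropic_mirror_step(z, g, (k + 1) / (2 * L))
        x = (1 - alpha) * x + alpha * z    # convex combination stays feasible
    return x

# Toy usage: least squares over the simplex, f(x) = mean((A x - b)^2)
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 20))
b = A @ (np.ones(20) / 20) + 0.01 * rng.standard_normal(200)

def stoch_grad(y):
    i = rng.integers(len(b))               # uniform single-sample oracle
    return 2.0 * A[i] * (A[i] @ y - b[i])  # unbiased for the full gradient

L = 2.0 * np.max(np.sum(A * A, axis=1))    # crude smoothness bound
x = accelerated_smd(stoch_grad, np.ones(20) / 20, n_iters=2000, L=L)
print("objective:", np.mean((A @ x - b) ** 2))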