Accelerated Primal-Dual Gradient Descent with Linesearch for Convex, Nonconvex, and Nonsmooth Optimization Problems
- Authors: Guminov S.V. (1), Nesterov Y.E. (2,3), Dvurechensky P.E. (4), Gasnikov A.V. (1,4)
- Affiliations:
- Moscow Institute of Physics and Technology
- Center for Operations Research and Econometrics (CORE), Catholic University of Louvain
- National Research University Higher School of Economics
- Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences
- Issue: Vol. 99, No. 2 (2019)
- Pages: 125-128
- Section: Mathematics
- URL: https://journals.rcsi.science/1064-5624/article/view/225638
- DOI: https://doi.org/10.1134/S1064562419020042
- ID: 225638
Abstract
A new version of accelerated gradient descent is proposed. The method does not require any a priori information about the objective function, uses a linesearch procedure to accelerate convergence in practice, converges at rates matching the well-known lower bounds for both convex and nonconvex objective functions, and has primal-dual properties. A universal version of this method is also described.
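To make the general scheme concrete, below is a minimal Python sketch of an accelerated gradient method of the similar-triangles type with a backtracking linesearch over a local smoothness estimate. It is only an illustration under assumed names (`agd_with_linesearch`, `f`, `grad` are placeholders introduced here), not the exact algorithm or linesearch procedure proposed in the paper.

```python
import numpy as np

def agd_with_linesearch(f, grad, x0, L0=1.0, n_iters=100):
    """Generic accelerated gradient descent with backtracking linesearch
    (illustrative sketch, not the paper's exact method).

    The local Lipschitz estimate L is adapted on the fly, so no a priori
    smoothness constant is required."""
    x = x0.copy()
    u = x0.copy()   # "dual" (momentum) sequence
    A = 0.0         # accumulated weight of the estimating sequence
    L = L0
    for _ in range(n_iters):
        L = max(L / 2.0, 1e-12)  # optimistic decrease before backtracking
        while True:
            # new weight a solves L*a^2 = A + a
            a = (1.0 + np.sqrt(1.0 + 4.0 * L * A)) / (2.0 * L)
            A_new = A + a
            y = (A * x + a * u) / A_new     # extrapolation point
            g = grad(y)
            x_new = y - g / L               # gradient step from y
            # sufficient-decrease test for the current estimate L
            if f(x_new) <= f(y) - np.dot(g, g) / (2.0 * L):
                break
            L *= 2.0
        u = u - a * g                       # momentum sequence update
        x, A = x_new, A_new
    return x

if __name__ == "__main__":
    # simple quadratic test: minimize 0.5 * ||x||^2
    f = lambda x: 0.5 * np.dot(x, x)
    grad = lambda x: x
    print(agd_with_linesearch(f, grad, np.ones(5)))
```

In this sketch the backtracking test only requires function and gradient evaluations, which is what allows the method to run without a known Lipschitz constant; the paper's linesearch and its primal-dual and universal variants differ in the details.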
About the authors
S. Guminov
Moscow Institute of Physics and Technology
Author for correspondence.
Email: sergey.guminov@phystech.edu
Russian Federation, Dolgoprudnyi, Moscow oblast, 141700
Yu. Nesterov
Center for Operations Research and Econometrics (CORE), Catholic University of Louvain; National Research University Higher School of Economics
Belgium, Louvain-la-Neuve; Moscow, 101000
P. Dvurechensky
Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences
Russian Federation, Moscow, 127051
A. Gasnikov
Moscow Institute of Physics and Technology; Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences
Russian Federation, Dolgoprudnyi, Moscow oblast, 141700; Moscow, 127051
