Accelerated Primal-Dual Gradient Descent with Linesearch for Convex, Nonconvex, and Nonsmooth Optimization Problems
- Authors: Guminov S.V. (1), Nesterov Y.E. (2,3), Dvurechensky P.E. (4), Gasnikov A.V. (1,4)
- Affiliations:
  1. Moscow Institute of Physics and Technology
  2. Center for Operations Research and Econometrics (CORE), Catholic University of Louvain
  3. National Research University Higher School of Economics
  4. Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences
- Issue: Volume 99, No. 2 (2019)
- Pages: 125-128
- Section: Mathematics
- URL: https://journals.rcsi.science/1064-5624/article/view/225638
- DOI: https://doi.org/10.1134/S1064562419020042
- ID: 225638
Abstract
A new version of accelerated gradient descent is proposed. The method requires no a priori information on the objective function, uses a linesearch procedure to accelerate convergence in practice, converges in accordance with well-known lower complexity bounds for both convex and nonconvex objective functions, and has primal-dual properties. A universal version of the method is also described.
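The abstract describes an accelerated gradient scheme whose step sizes are chosen by a linesearch rather than from a known Lipschitz constant. As a rough illustration only, and not the authors' exact algorithm from the paper, the sketch below shows a generic Nesterov-type accelerated gradient method in which a backtracking linesearch adapts a local Lipschitz estimate L on the fly; the function accelerated_gd_linesearch and all parameter names are invented for this example.

```python
# Illustrative sketch (NOT the method of the paper): Nesterov-type accelerated
# gradient descent where a backtracking linesearch adapts a local Lipschitz
# estimate L, so no a priori smoothness constant is required from the user.
import numpy as np


def accelerated_gd_linesearch(f, grad, x0, L0=1.0, max_iter=500, tol=1e-8):
    x = np.asarray(x0, dtype=float)
    u = x.copy()          # auxiliary point accumulating weighted gradient steps
    L, A = L0, 0.0        # L: current Lipschitz estimate, A: acceleration weight sum
    for _ in range(max_iter):
        # Backtracking linesearch: increase L until the descent-lemma
        # sufficient-decrease condition holds at the trial point.
        while True:
            a = (1.0 + np.sqrt(1.0 + 4.0 * L * A)) / (2.0 * L)  # solves L*a^2 = A + a
            A_new = A + a
            y = (A * x + a * u) / A_new                         # extrapolated point
            g = grad(y)
            x_new = y - g / L                                   # gradient step from y
            if f(x_new) <= f(y) - np.dot(g, g) / (2.0 * L):
                break
            L *= 2.0
        u = u - a * g                                           # momentum (dual) update
        x, A = x_new, A_new
        L = max(L / 2.0, 1e-12)                                 # let the estimate shrink again
        if np.linalg.norm(grad(x)) <= tol:
            break
    return x


if __name__ == "__main__":
    # Toy demo: minimize the strongly convex quadratic 0.5*||x||^2 + <c, x>,
    # whose minimizer is x = -c.
    c = np.array([1.0, -2.0, 3.0])
    f = lambda x: 0.5 * x @ x + c @ x
    grad = lambda x: x + c
    print(accelerated_gd_linesearch(f, grad, np.zeros(3)))  # approximately -c
```

The point the sketch tries to convey is the "no a priori information" property from the abstract: the user supplies only the objective, its gradient, and a starting point, and the linesearch discovers a workable step size by itself.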
About the authors
S. Guminov
Moscow Institute of Physics and Technology
Corresponding author
Email: sergey.guminov@phystech.edu
Russia, Dolgoprudnyi, Moscow oblast, 141700
Yu. Nesterov
Center for Operations Research and Econometrics (CORE), Catholic University of Louvain; National Research University Higher School of Economics
Belgium, Louvain-la-Neuve; Moscow, 101000
P. Dvurechensky
Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences
Russia, Moscow, 127051
A. Gasnikov
Moscow Institute of Physics and Technology; Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences
Russia, Dolgoprudnyi, Moscow oblast, 141700; Moscow, 127051