Accelerated Primal-Dual Gradient Descent with Linesearch for Convex, Nonconvex, and Nonsmooth Optimization Problems



Abstract

A new version of accelerated gradient descent is proposed. The method requires no a priori information about the objective function, uses a linesearch procedure to accelerate convergence in practice, converges at rates matching well-known lower bounds for both convex and nonconvex objective functions, and has primal-dual properties. A universal version of the method is also described.
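To illustrate the general idea of combining acceleration with a linesearch that removes the need for a priori knowledge of the Lipschitz constant, here is a minimal FISTA-style sketch in Python. This is a generic textbook scheme under assumed settings, not the exact primal-dual algorithm of the paper; the function `agd_linesearch`, the test quadratic, and all parameter choices are illustrative assumptions.

```python
import numpy as np

def agd_linesearch(f, grad, x0, L0=1.0, iters=200):
    """Accelerated gradient descent with backtracking linesearch.

    Illustrative FISTA-style scheme (not the paper's exact method):
    the local smoothness constant L is estimated on the fly by
    backtracking, so no a priori information on f is required.
    """
    x = y = np.asarray(x0, dtype=float)
    L, t = L0, 1.0
    for _ in range(iters):
        g = grad(y)
        # Backtracking: double L until the quadratic upper bound
        # f(x_new) <= f(y) + <g, x_new - y> + (L/2)||x_new - y||^2 holds.
        while True:
            x_new = y - g / L
            d = x_new - y
            if f(x_new) <= f(y) + g @ d + 0.5 * L * np.sum(d ** 2):
                break
            L *= 2.0
        # Standard accelerated momentum update.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
        L /= 2.0  # let the estimate of L decrease again (adaptivity)
    return x

# Example: minimize the smooth convex quadratic f(x) = 0.5*||Ax - b||^2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad = lambda x: A.T @ (A @ x - b)
x_star = agd_linesearch(f, grad, np.zeros(2))
```

Because the backtracking step certifies a local upper model at every iteration, the scheme adapts to the local smoothness of the objective, which is the practical benefit the abstract attributes to the linesearch.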

About the authors

S. Guminov

Moscow Institute of Physics and Technology

Corresponding author
Email: sergey.guminov@phystech.edu
Russia, Dolgoprudnyi, Moscow oblast, 141700

Yu. Nesterov

Center for Operations Research and Econometrics (CORE), Catholic University of Louvain; National Research University Higher School of Economics

Belgium, Louvain-la-Neuve; Moscow, 101000

P. Dvurechensky

Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences

Russia, Moscow, 127051

A. Gasnikov

Moscow Institute of Physics and Technology; Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences

Russia, Dolgoprudnyi, Moscow oblast, 141700; Moscow, 127051

Supplementary files

1. JATS XML

Copyright © Pleiades Publishing, Ltd., 2019