Accelerated Primal-Dual Gradient Descent with Linesearch for Convex, Nonconvex, and Nonsmooth Optimization Problems


Abstract

A new version of accelerated gradient descent is proposed. The method requires no a priori information about the objective function, uses a linesearch procedure to accelerate convergence in practice, converges at rates matching well-known lower bounds for both convex and nonconvex objective functions, and has primal-dual properties. A universal version of this method is also described.
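The abstract describes an accelerated gradient scheme combined with a linesearch that adaptively estimates the step size, so no Lipschitz constant needs to be known in advance. The sketch below illustrates this general idea with a generic Nesterov-style accelerated method using backtracking on a local Lipschitz estimate `L`; it is an assumption-laden illustration, not the paper's exact algorithm.

```python
import numpy as np

def agd_linesearch(f, grad, x0, L0=1.0, iters=100):
    """Accelerated gradient descent with backtracking linesearch on the
    local Lipschitz estimate L. A generic sketch of the technique the
    abstract refers to, not the authors' specific method."""
    x = np.asarray(x0, dtype=float)
    z = x.copy()          # momentum ("dual") sequence
    L, A = float(L0), 0.0  # Lipschitz estimate and accumulated weight
    for _ in range(iters):
        L /= 2.0  # optimistic: first try a larger step than last time
        while True:
            # step weight a solves L*a^2 = A + a
            a = (1.0 + np.sqrt(1.0 + 4.0 * L * A)) / (2.0 * L)
            A_new = A + a
            y = (A * x + a * z) / A_new  # extrapolation point
            g = grad(y)
            x_new = y - g / L            # gradient step from y
            # sufficient-decrease test; double L and retry on failure
            if f(x_new) <= f(y) - np.dot(g, g) / (2.0 * L):
                break
            L *= 2.0
        z = z - a * g  # momentum/dual update
        x, A = x_new, A_new
    return x
```

Because each iteration first halves `L` and the backtracking loop doubles it only as needed, the method tracks the local curvature instead of a global worst-case constant, which is what makes the linesearch useful in practice.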

About the authors

S. V. Guminov

Moscow Institute of Physics and Technology

Author for correspondence.
Email: sergey.guminov@phystech.edu
Russian Federation, Dolgoprudnyi, Moscow oblast, 141700

Yu. E. Nesterov

Center for Operations Research and Econometrics (CORE), Catholic University of Louvain; National Research University Higher School of Economics

Belgium, Louvain-la-Neuve; Moscow, 101000

P. E. Dvurechensky

Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences

Russian Federation, Moscow, 127051

A. V. Gasnikov

Moscow Institute of Physics and Technology; Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences

Russian Federation, Dolgoprudnyi, Moscow oblast, 141700; Moscow, 127051


Copyright (c) 2019 Pleiades Publishing, Ltd.