Accelerated Primal-Dual Gradient Descent with Linesearch for Convex, Nonconvex, and Nonsmooth Optimization Problems



Abstract

A new version of accelerated gradient descent is proposed. The method requires no a priori information on the objective function, uses a linesearch procedure to accelerate convergence in practice, converges according to well-known lower bounds for both convex and nonconvex objective functions, and has primal-dual properties. A universal version of the method is also described.
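The general idea of combining Nesterov-type acceleration with a backtracking linesearch on the smoothness constant can be illustrated by the following sketch. This is a generic adaptive accelerated gradient scheme, not the exact algorithm from the paper; the function names, parameters, and the specific update rule are illustrative assumptions.

```python
import numpy as np

def accel_gd_linesearch(f, grad, x0, L0=1.0, iters=2000):
    """Accelerated gradient descent with a backtracking linesearch on the
    local Lipschitz estimate L (a generic sketch of the technique; not the
    exact method proposed by the authors)."""
    x = np.asarray(x0, dtype=float)
    v = x.copy()          # "dual" sequence accumulating scaled gradients
    A, L = 0.0, L0        # A: accumulated step-size weights; L: curvature estimate
    for _ in range(iters):
        L = max(L / 2.0, 1e-12)   # optimistically shrink L each iteration
        while True:
            # choose a so that L * a**2 = A + a (standard accelerated weighting)
            a = (1.0 + np.sqrt(1.0 + 4.0 * L * A)) / (2.0 * L)
            y = (A * x + a * v) / (A + a)   # extrapolation point
            g = grad(y)
            x_new = y - g / L               # gradient step from y
            # sufficient-decrease test; double L until it is satisfied
            if f(x_new) <= f(y) - (g @ g) / (2.0 * L):
                break
            L *= 2.0
        v = v - a * g
        x = x_new
        A += a
    return x
```

On a simple quadratic such as `f(z) = 0.5 * z @ (Q @ z)` with `Q = np.diag([1.0, 10.0])`, the linesearch automatically adapts `L` to the local curvature without any prior knowledge of the Lipschitz constant, which is the practical advantage the abstract refers to.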

About the authors

S. Guminov

Moscow Institute of Physics and Technology

Corresponding author.
Email: sergey.guminov@phystech.edu
Russia, Dolgoprudnyi, Moscow oblast, 141700

Yu. Nesterov

Center for Operations Research and Econometrics (CORE), Catholic University of Louvain; National Research University Higher School of Economics

Email: sergey.guminov@phystech.edu
Belgium, Louvain-la-Neuve; Moscow, 101000

P. Dvurechensky

Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences

Email: sergey.guminov@phystech.edu
Russia, Moscow, 127051

A. Gasnikov

Moscow Institute of Physics and Technology; Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences

Email: sergey.guminov@phystech.edu
Russia, Dolgoprudnyi, Moscow oblast, 141700; Moscow, 127051

Supplementary files

1. JATS XML

© Pleiades Publishing, Ltd., 2019